From: Kenny Tilton
Subject: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <OYFue.13881$XB2.3578516@twister.nyc.rr.com>
[apologies if this has materialised in similar form or does so soon 
unbeknownst to me, but from where I sit it appears Google ate a similar 
report posted yesterday via google groups.]

Dr. McCarthy joined with Henry Baker, his predecessor at the microphone, 
in bemoaning the standardization of Common Lisp as stultifying if not 
mortifying, in that it ended innovation.

When Rahul defended standardization as allowing his code to run ten 
years from now, McCarthy indicated (paraphrasing) that, by the looks of 
Rahul, it was unlikely he would produce code that anyone would want to 
run ten years from now.*

XML had the honor of having McCarthy stop in the middle of a meandering 
bit of reflection to mention how much he disliked XML.

And when your correspondent asked why he had chosen such a crappy name 
for such a great language, and whether he regretted it -- in what is 
becoming an annual rite of humiliation -- he pretty much ignored my 
question, but did mention that his preference had been FLPL, for 
Fortran List Processing Language, because he liked Fortran.

Intriguingly, there is a Fortran package with that exact name and 
acronym and function, created in 1960 as far as I can make out from some 
light googling.

* McCarthy actually meant that very little code lasts ten years.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page

From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i177sFj9k8gU1@individual.net>
Kenny Tilton wrote:
> [apologies if this has materialised in similar form or does so soon 
> unbeknownst to me, but from where I sit it appears Google ate a similar 
> report posted yesterday via google groups.]
> 
> Dr. McCarthy joined with Henry Baker, his predecessor at the microphone, 
> in bemoaning the standardization of Common Lisp as stultifying if not 
> mortifying, in that it ended innovation.

As much as I like Common Lisp, I think he has a point here.

> When rahul defended standardization as allowing his code to run ten 
> years from now, McCarthy indicated that (paraphrasing) by the looks of 
> Rahul it was unlikely he would produce code that anyone would want to 
> run ten years from now.*

This was one of the most bizarre moments I have ever experienced: 
people trying to convince John McCarthy that standardization is 
actually a good thing. As if he would ever care.

It was clear from his talk that he cares about a long-term vision 
(namely how to achieve human-level artificial intelligence). Language 
standardization is worth zilch in that regard.



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2ll50h1lj.fsf@gigamonkeys.com>
Pascal Costanza <··@p-cos.net> writes:

> Kenny Tilton wrote:
>> [apologies if this has materialised in similar form or does so soon
>> unbeknownst to me, but from where I sit it appears Google ate a
>> similar report posted yesterday via google groups.]
>> Dr. McCarthy joined with Henry Baker, his predecessor at the
>> microphone, in bemoaning the standardization of Common Lisp as
>> stultifying if not mortifying, in that it ended innovation.
>
> As much as I like Common Lisp, I think he has a point here.

As did Baker -- or rather, he made a dozen or so good points. Can't
wait until his full slide deck is available.

>> When rahul defended standardization as allowing his code to run ten
>> years from now, McCarthy indicated that (paraphrasing) by the looks
>> of Rahul it was unlikely he would produce code that anyone would
>> want to run ten years from now.*
>
> This was one of the most bizarre moments I have experienced ever,
> that people tried to convince John McCarthy that standardization is
> actually a good thing. As if he would ever care.

While concurring with Pascal that that was a weird moment (and it
wasn't just Rahul who tried to convince McCarthy that the standard was
a good thing), I'd like to point out that I don't think McCarthy was
insulting Rahul--merely misunderstanding him. From where I was sitting
it sounded like Rahul started his comment by saying something along
the lines of, "I don't care about standardization because it's going
to ensure that code that was written 20 years ago still runs today
...". In that he was riffing off a previous comment from someone
else in the audience. He went on to say that the reason he was glad
there was a standard was that it meant there were multiple
implementations *today* that could all run his code, each with
different strengths and weaknesses. However, McCarthy appeared to have
heard him say that he did care about having code from 20 years ago
that ran today, and said that based on Rahul's appearance, it didn't
seem that he could have any code from 20 years ago that he'd need to
run today, i.e. Rahul is too young. A slight dig, perhaps, but not
actually an insult. I just didn't want folks to think that McCarthy
went out of his way to be rude to folks.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119591166.381878.17180@g49g2000cwa.googlegroups.com>
Peter Seibel wrote:

> However McCarthy appeared to have
> heard him to say that he did care about having code from 20 years ago
> that ran today and said that based on Rahul's appearance, it didn't
> seem that he could have any code from 20 years ago that he'd need to
> run today, i.e. Rahul is too young. A slight dig, perhaps but not
> actually an insult.

This is how I interpreted it as well.

The more general question -- Did standardization produce
stultification? -- is quite provocative though. Really, there are two
questions here:

(1) Has progress in Lisp slowed dramatically since CLtL1? (And this is
really what Baker and McCarthy meant by standardization -- the
ascension of Common Lisp.)

(2) Did CLtL1 *cause* this slowdown?

IMO, the answer to (1) is "yes" and the answer to (2) is "no." The
*real* reason progress slowed -- again, IMO -- was the dramatic drop in
both interest in and funding for Lisp following the AI Winter, which
began around 1984. If this is correct, then standardization was
probably critical in keeping the dwindling community together.

This brings up an interesting question: Is the binding constraint of
the standard, which was critical during the 1980s and 1990s, going to
choke the community now that it is again showing signs of growth?

It's a real question. One possibility is that as the community grows,
so will a parallel movement to open, clean up, modify, and extend the
standard. A harbinger of this is the CLRFI process, which is currently
trying to bootstrap itself. Another possibility is that the community
will split in a healthy way, with business users adhering closely to
the standard in the interests of portability, and with academics again
experimenting with new features and birthing new dialects.

Sashank
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i1ochFj6dfeU1@individual.net>
Sashank Varma wrote:

> The more general question -- Did standardization produce
> stultification? -- is quite provocative though. Really, there are two
> questions here:
> 
> (1) Has progress in Lisp slowed dramatically since CLtL1? (And this is
> really what Baker and McCarthy meant by standardization -- the
> ascension of Common Lisp.)
> 
> (2) Did CLtL1 *cause* this slowdown?
> 
> IMO, the answer to (1) is "yes" and the answer to (2) is "no."

I think this is still oversimplified, but a good step closer to the truth.

> The
> *real* reason progress slowed -- again, IMO -- was the dramatic drop in
> both interest in and funding for Lisp following AI Winter, which began
> around........1984. If this is correct, then standardization was
> probably critical in keeping the dwindling community together.
> 
> This brings up an interesting question: Is the binding constraint of
> the standard, which was critical during the 1980s and 1990s, gonna
> choke the community now that it is again showing signs of growth?

I don't think so. I see a difference between minor improvements to the 
language and fundamental improvements. Minor improvements are things 
like case sensitivity of symbols, better function names, improved 
collection frameworks, GUI APIs, database libraries, iterate, etc. 
They are minor in the sense that the basic concepts are already there, 
they provably work, and no one is hindered from making use of them. It 
may or may not be useful to integrate them into the Common Lisp 
standard, but it doesn't cause serious problems that they are not in the 
standard. Common Lisp is flexible enough to make these things work anyway.

Major improvements would be new programming paradigms. Imagine something 
like OOP or neural networks didn't exist yet. Lisp has repeatedly shown 
that it is flexible enough to provide an excellent framework for 
developing new programming approaches.

I think the negative effect of standards is not a real one - no one 
prevents anyone from starting completely from scratch and building 
completely new languages - but a psychological one. Standards induce the 
belief that we have somehow reached a final stage in computer science, 
and that we only need to fill a few gaps and fix some annoying details, 
and we're done. I think this is far from the truth.

In the industrial arena, companies like Sun and Microsoft and their 
pseudo-standards are much worse at making people believe that we are 
already "there", and in the academic realm, programming language 
theorists are stifling real progress. When real breakthroughs first 
appear, they are invariably useless in practice and unsound in 
theory, and only much later develop into something practical and 
well-understood.

In that regard, standards and standardization hurt.



Pascal


-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tayssir John Gabbour
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119601311.985154.320930@g44g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Major improvements would be new programming paradigms. Imagine something
> like OOP or neural networks didn't exist yet. Lisp has provably shown
> that it is again flexible enough to provide an excellent framework for
> developing new programming approaches.
>
> I think the negative effect of standards is not a real one - noone
> hinders anyone to start even completely from scratch and build
> completely new languages - but a psychological one. Standards induce the
> belief that we have somehow reached a final stage in computer science
> and that we only need to fill a few gaps and fix some annoying details,
> and we're done. I think this is far from the truth.
>
> In the industrial arena, companies like Sun and Microsoft and their
> pseudo-standards are much worse in making people belief that we are
> already "there", and in the acadamic realm, programming language
> theorists are stifling real progress. When real breakthroughs first
> appeared, they have always been useless in practice and unsound in
> theory, and only much later developed into something practical and
> well-understood.
>
> In that regard, standards and standardization hurt.

I'm surprised, because it seems almost obvious that standards are there
to put a halt to certain kinds of innovation. One of the people who
helped with the standardization claimed that CL standardization killed
Interlisp:

"Common Lisp as a dialect arose because institutions with a large
investment in Lisp (primarily ARPA, actually, but also some
commercial entitites) were tired of the semantics of Lisp changing
daily and they wanted to be able to invest in something which was
both powerful and stable.  My understanding is that they had proposed
Interlisp because it appeared to have the largest installed base,
and this caused the myriad Maclisp-variants to declare that they
differed in only-gratuitous ways which if solidified would constitute
a larger installed base than Interlisp.  CL was the result, and
Interlisp was killed."
http://groups-beta.google.com/group/comp.lang.lisp/msg/b525c6d8367c3d4a?hl=en

Further, the innovation motive can clearly conflict with the commercial
motive, since some customers don't want to recompile and retest apps
built on evolving platforms. And you almost get proof of this from
Microsoft, which endlessly repeats "innovation" because that's one of
its weaknesses -- customers apparently want solutions far more than tech.

So based on what I've read in this thread, it seems odd that this would
be controversial when there were old-timers there who could point out
what actually happened. Every time I hear about the CL standardization
effort, it sounds like a pretty violent thing.


Tayssir
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hglp$is9$1@domitilla.aioe.org>
Tayssir John Gabbour wrote:
> Further, innovation motive can clearly conflict with the commercial
> motive, since some customers don't want to recompile and retest apps
> built on evolving platforms. And you almost get proof from Microsoft
> who endlessly repeats "innovation" because that's one of its weaknesses
> -- customers apparently want solutions far more than tech.

I can understand that, and put that way it makes sense. But there are
languages not governed by a standard which, although more primitive than
Lisp (or Common Lisp), are evolving faster and may eventually reach CL's
level (which, of course, CL reached years ago).
My examples are PHP, Python, Perl, and Ruby. They are instead governed by
a single implementation (a dictatorship, if you want), and when other
implementations appear, they just follow the main one.
And if we make the industry comparison, well, any of those languages (Ruby
maybe not) seems to be used more than CL. So if the lack of a standard is
harmful, it should be very harmful for them. And I don't really see
industry clamoring for a Python or PHP or Perl standard.
If we talk about portability, I'm more confident that my Python or PHP code
will run on other platforms (same implementation, though) than that my CL
code will run on another platform (where I may encounter a different
implementation).
Having said that, someone could argue that nobody is stopping anyone from
starting a new Lisp governed by an implementation instead of a standard.
True, but nobody is doing it seriously either (there are only toy
implementations that are not CL, or others like Scheme, elisp, etc.). If
there is one, please tell me; I am interested. The only reason I am not
doing it myself is that I don't know how (something which may change
eventually, and for which I have already started taking notes).
To make this more than just a rant, I invite all of those here who claim
that the standard is harmful to start another implementation not governed
by the standard (yes, I'd help, as much as I can, which might not be much
right now).
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Tayssir John Gabbour
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119644771.981372.201840@g47g2000cwa.googlegroups.com>
Pupeno wrote:
> Tayssir John Gabbour wrote:
> > Further, innovation motive can clearly conflict with the commercial
> > motive, since some customers don't want to recompile and retest apps
> > built on evolving platforms. And you almost get proof from Microsoft
> > who endlessly repeats "innovation" because that's one of its weaknesses
> > -- customers apparently want solutions far more than tech.
>
> Having said that someone could argue that nobody is stopped anyone to start
> a new Lisp governed by implementation instead of standard. True, but nobody
> is doing in seriously either (there are only toy implementations that are
> not CL, or othres like Scheme, elisp, etc). If there is, please tell me, I
> am interested. The only reason I am not doing it myself is because I don't
> know how (something which may change eventually, for which I already
> started talking notes).
> To not make just a rant I invite all of those that are here claiming that
> the standard is harmfull to start another implementation not governed by
> the standard (yes, I'd help, as much as I can, which might not be much
> now).

The assumption though is that people care about innovation. ;) But some
people don't.

It is like the notion of "strength." Regular expressions are not very
strong, but they are often used in preference to stronger languages.
Strength and innovation do not necessarily mean better; sometimes they
mean worse.

Now, looking at Henry Baker's slides, it may appear that the loss of
innovation is bad. But that doesn't mean we have to do anything. It's
just a historical datapoint. If you're in a position to do something
about it, as with say Erlisp and Qi, great.

Many things in the world are suboptimal. I'm grateful that someone like
Baker pointed out the suboptimality of Common Lisp, as it takes unusual
honesty and even courage.


Tayssir
From: Christian Lynbech
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87k6kf5dr0.fsf@chateau.defun.dk>
>>>>> "Tayssir" == Tayssir John Gabbour <···········@yahoo.com> writes:

Tayssir> The assumption though is that people care about innovation. ;) But some
Tayssir> people don't.

I think that everybody would agree that both innovation and backwards
compatibility are good; the difficult discussion is whether one is
better than the other, and here people have wild disagreements.

Unfortunately, in practical life a tradeoff between the two is always
involved. It is not very fruitful to try to discuss the merits of either
in isolation from the other, or any given tradeoff in isolation from the
circumstances in which it was made.

As so many others have pointed out, we really need some examples of
what we are missing (and how CL is to blame for it), at least if
practitioners are to have much respect for the view.

Allowing something to start over from scratch is always an attractive
mental exercise, especially for something as old and successful as
Lisp, but hardly one that is transferable to real life.


------------------------+-----------------------------------------------------
Christian Lynbech       | christian ··@ defun #\. dk
------------------------+-----------------------------------------------------
Hit the philistines three times over the head with the Elisp reference manual.
                                        - ·······@hal.com (Michael A. Petonic)
From: Tayssir John Gabbour
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119934794.319274.143750@z14g2000cwz.googlegroups.com>
Christian Lynbech wrote:
> >>>>> "Tayssir" == Tayssir John Gabbour <···········@yahoo.com> writes:
> Tayssir> The assumption though is that people care about innovation. ;) But some
> Tayssir> people don't.
>
> I think that everybody would agree that both innovation and backwards
> compability are good, the difficult discussion is whether one is
> better than the other, and here people have wild disagreements.

I respectfully disagree. I believe innovation is valueless, and many
classes of its uses are in fact "bad."

There are some quick examples we can examine.

* Java is probably an example of anti-programmer innovation, if you
look at its original whitepaper
http://java.sun.com/docs/white/langenv/Intro.doc2.html#349

* Take howstuffworks.com's creator Marshall Brain:
"In the past, increases in productivity have meant higher wages and
reduced hours for workers. Today, worker wages are stagnant. Most of
the money from productivity improvements flows to the wealthy, creating
a gigantic Concentration of wealth."
http://marshallbrain.com/robotic-faq.htm

* We can go on with Martin Luther King Jr.'s "moloch of automation,"
David Noble's careful studies
http://www.nooranch.com/synaesmedia/wiki/wiki.cgi?DavidNoble/ForcesOfProduction
and so forth.


If we are serious about innovation, we'd remark that there are probably
millions of Einsteins alive who didn't have decent educations and
whatnot. Common Lisp's standardization hasn't yet merited a footnote in
comparison. Military grants have allowed Lisp to flourish without
regard to profit margins for a while, which is a huge gift from the US
citizenry. If we want more innovation, we should go to the Federal
Reserve and tell them to create more money.

Even mighty Microsoft's research group has pressures to produce
medium-term results.


Tayssir
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1320373.1ZfNOzzfW3@yahoo.com>
What's with all of the commies in this forum? Where's McCarthy (either one)
when you need him?
From: Robert Uhl
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3ekama3ll.fsf@4dv.net>
"Tayssir John Gabbour" <···········@yahoo.com> writes:
>
> * We can go on with Martin Luther King Jr.'s "moloch of automation,"

The man's correct judgement on certain moral issues says nothing at all
about his knowledge of economics or industry...

> If we are serious about innovation, we'd remark that there are
> probably millions of Einsteins alive who didn't have decent educations
> and whatnot.

Highly unlikely, since there are only 6 billion people in the world.
For there to be millions (i.e. at least 2 million) of possible
Einstein-level intellects, an Einstein would have to be about 1 in
3,000.  The consensus appears to be that his IQ was 160-200; IQs of 160
or higher occur in about 1 in 11,000 people; thus one can expect that
there are approximately half a million Einstein-level intellects in the
world.
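As a back-of-the-envelope check on that last figure (taking the
1-in-11,000 rate and the 6 billion population at face value):

```lisp
;; 6 billion people, of whom roughly 1 in 11,000 has an IQ of 160+:
(floor 6000000000 11000)
;; => 545454 (plus a remainder), i.e. roughly half a million people.
```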

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
Skill without imagination is craftsmanship and gives us many useful
objects such as wickerwork picnic baskets.  Imagination without skill
gives us modern art.                                   --Tom Stoppard
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42C322C1.3033B5BE@freenet.de>
Robert Uhl wrote:

> Highly unlikely, since there are only 6 billion people in the world,
> For there to be millions (i.e. at least 2 million) possible
> Einstein-level intellects than an Einstein would be 1 in 3,000.  The
> consensus appears to be that his IQ was 160-200; IQs of 160 or higher
> are 1 in 11,000; thus one can expect that in the world there are
> approximately half a million Einstein-level intellects.

... but keep in mind that the IQ-200 person cannot open a can without
losing a finger.

stefan
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42C159ED.A1D753B6@freenet.de>
Christian Lynbech wrote:

> As so many others have pointed out, we really need some examples of
> what we are missing (and how CL is to blame for it), at least if
> practioners are to have much respect for the view.

Hello Christian Lynbech,

Maybe this is not exactly the point, but what I miss is something like
what Ketman is for ASM. See for more:

http://www.btinternet.com/~btketman

It works like an interpreter, and for me it is very useful for testing
whether some code works the way I imagine or not. It has some
abstractions to keep it simple, but it is easy to use and clearly shows
what's going on. To do this with Lisp I often use the newLisp
interpreter v.6.3, which is also small and good for testing some
thoughts. Maybe there are other Lisp tools like that which I have not
found until today (note, I do not like Emacs on Windows).

stefan
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7iirzyrp43.fsf@lanthane.pps.jussieu.fr>
Christian Lynbech <·········@defun.dk>:

> As so many others have pointed out, we really need some examples of
> what we are missing (and how CL is to blame for it), at least if
> practioners are to have much respect for the view [that innovation
> is stagnant in the Lisp community].

I think that something slightly worrying has happened to part of the
Lisp community: the feeling that Common Lisp is the best that can be
done within the Lisp space of programming languages (the infamous
``local optimum'' phrase), and that further thought about what Lisp
should be is pointless.

Now don't get me wrong.  Common Lisp is my favourite programming
language.  It's a horrible programming language, but of the languages
that I've tried, it's the best.

There are a lot of directions in which I'd like to see Common Lisp
change.  A few of them are minor tweaks, such as fixing the array
library to be properly layered, providing user-customisable equality
predicates[1], having a (LIST x) type specifier, or fixing the
computation of the class precedence list.  Some of them require
new ideas altogether.

A good example of the latter is how compile-time programming is done.
I think nobody is going to deny that CL has the most powerful macro
system in existence; paradoxically, the fact that we've got the full
power of the language at compile time means that we haven't been
investigating more subtle, declarative ways of communicating with the
compiler.

                                        Juliusz

[1] Before Kent takes up his favourite subject: ANSI CL does specify a
user-extensible equality predicate in 3.2.4.2.2.
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87wtod32v8.fsf@plato.moon.paoloamoroso.it>
Juliusz Chroboczek <···@pps.jussieu.fr> writes:

> I think that something slightly worrying has happened to part of the
> Lisp community: the feeling that Common Lisp is the best that can be
> done within the Lisp space of programming languages (the infamous
> ``local optimum'' phrase), and that further thought about what Lisp
> should be is pointless.

Besides thoughts, it may be useful to have actual new experimental
Lisp systems or extensions to play with.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8450E.E45A%joswig@lisp.de>
On 29.06.2005 at 2:19, "Juliusz Chroboczek"
<···@pps.jussieu.fr> wrote in ··············@lanthane.pps.jussieu.fr:

> I think that something slightly worrying has happened to part of the
> Lisp community: the feeling that Common Lisp is the best that can be
> done within the Lisp space of programming languages (the infamous
> ``local optimum'' phrase), and that further thought about what Lisp
> should be is pointless.

Personally I think that Common Lisp is only a local optimum. But it
is good enough for lots of things. What use is more innovation when
lots of things in Common Lisp are still not usable by the mainstream?
Why innovate more? I would rather make Common Lisp more usable. For me
a programming language is a language for writing programs; the
programming language has little value on its own. The applications are
much more important: they drive the need, they drive the innovation,
they drive the investment, etc.

So leave the discussion about Common Lisp to the academics, the pinheads,
the old-timers...

I'm much more interested in seeing applications driving the innovation.

A very good example is McCLIM. CLIM is cool, but complicated. McCLIM
itself, as a framework implementation, is immature and does a lot
of things wrong. It might even be the wrong idea to invest in
CLIM at all (which I don't believe -- I think lots of things are
fixable). BUT, and here is the BIG BUT, a few people are trying
to develop applications with it. One example is Climacs, an
editor written in Common Lisp and McCLIM. I'm pretty sure
the enthusiasm of those people will drive things forward, and they
will make sure that McCLIM (and the Common Lisp stuff below it) is
usable for their purpose.

So, if somebody has something to offer, he/she should join those guys
writing Climacs -- there is plenty to learn, and it will drive things
forward. That one can write editors in Lisp, and even Common Lisp, is
already known (and demonstrated), and there is no need to discuss
Common Lisp in that context. So one can concentrate on the more
practical things and on the software engineering issues one faces as a
developer: how to write a maintainable, well-structured,
well-documented, extensible application in Common Lisp.

> Now don't take me wrong.  Common Lisp is my favourite programming
> language.  It's a horrible programming language, but of the languages
> that I've tried, it's the best.

So, let's start from that. It is sufficient -- so what can we do now?
What are best practices? What are paths to get things done? Where can
the benefits of Lisp be used successfully in applications, etc.?

[...]
 
> A good example of the latter is how compile-time programming is done.
> I think nobody is going to deny that CL has the most powerful macro
> system in existence; paradoxically, the fact that we've got the full
> power of the language at compile time means that we haven't been
> investigating more subtle, declarative ways of communicating with the
> compiler.

Macros are overrated anyway. Start from a software engineering view.
What do you want to achieve, and how? Then macros might or might not
be the answer. Think, for each problem, about whether a macro-based
solution will increase readability and maintainability, will make
debugging easier, how it fits into the development process, and so on.
Macros in themselves don't have value. Don't introduce macros into a
project because they are cool, because you happen to understand the
macros you are writing right now (at least for a few minutes), because
macros make your job safer, or for similar reasons. As a project
manager, I would make sure that such a person is not allowed to write
unmaintainable code in my project. And lots of code I see floating
around is just that: unmaintainable. Unmotivated use of macros is high
on my list of issues.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uy88tfjv8.fsf@nhplace.com>
Rainer Joswig <······@lisp.de> writes:

> Macros are overrated anyway.

Can you, uh, expand on this?

The reason I contest this is that there is some evidence that macros
are what enabled the sudden explosive growth in the Lisp community back
in the days of the Lisp Machine.  (You might think that just the
availability of large address spaces did this, but most people didn't
have access to the LispM; it was a scarce resource.)  And PDP10 Maclisp
had had both macros and autoloading for a long time, but then suddenly,
when we decided to make a few macros (like LET) a standard part of
the language, there arose overnight a huge interest in creating a
kind of trade in macro packages, with everyone becoming his/her own
language designer, including the evolution of looping mechanisms,
custom object systems, etc.

I can't think of any other specific change that occurred suddenly
other than a kind of showcasing of this one language feature and
saying "hey, guys, this is powerful--look what you can do", and the
creation of the autoloading macro (which was a procedural
workaround/kludge for small address spaces that allowed having
macros reliably pre-defined without taking up lots of space) that
happened at that time and that would account for all the sudden
excited growth in that community.

You could blame it on autoloading per se (not related to autoloading
macros), except that that feature had existed for a long time.  What
was new at the time was autoloading macros, and to me that seemed 
an important trigger.

I remember discussing this phenomenon at the time and deciding to
attribute it to macros, and I continue to stand by that today.  Your
mileage may, of course, vary.  This is a subjective judgment.

But I'm still curious to hear both why you think macros are overrated
and also to what you would attribute the long-time non-presence of
user-defined libraries suddenly turning into a large number
of user-defined libraries.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8689C.E489%joswig@lisp.de>
Am 29.06.2005 14:05 Uhr schrieb "Kent M Pitman" unter <······@nhplace.com>
in ·············@nhplace.com:

Hi Kent,

> Rainer Joswig <······@lisp.de> writes:
> 
>> Macros are overrated anyway.
> 
> Can you, uh, expand on this?
> 
> The reason I contest this is that there is some evidence that macros
> are what enabled sudden explosive growth in the Lisp community back in
> the days of the Lisp Machine.  (You might think that just the
> availability of a large address space did this, but most people didn't
> have access to the LispM; it was a scarce resource.  And PDP10 Maclisp
> had had both macros and autoloading for a long time.)  But then suddenly,
> when we decided to make a few macros (like LET) a standard part of
> the language, there was overnight a huge interest in creating a
> kind of trade in macro packages, with everyone becoming his/her own
> language designer, including the evolution of looping mechanisms,
> custom object systems, etc.
> 
> I can't think of any other specific change that occurred suddenly
> other than a kind of showcasing of this one language feature and
> saying "hey, guys, this is powerful--look what you can do", and the
> creation of the autoloading macro (which was a procedural
> workaround/kludge for small address spaces that allowed having
> macros reliably pre-defined without taking up lots of space) that
> happened at that time and that would account for all the sudden
> excited growth in that community.
> 
> You could blame it on autoloading per se (not related to autoloading
> macros), except that that feature had existed for a long time.  What
> was new at the time was autoloading macros, and to me that seemed
> an important trigger.
> 
> I remember discussing this phenomenon at the time and deciding to
> attribute it to macros, and I continue to stand by that today.  Your
> mileage may, of course, vary.  This is a subjective judgment.
> 
> But I'm still curious to hear both why you think macros are overrated
> and also to what you would attribute the long-time non-presence of
> user-defined libraries suddenly turning into a large number
> of user-defined libraries.

I see/saw that growth, too. And I see/saw it imploding as well.
At some point the code did not evolve anymore. It could be that the
business imploded, the communities imploded, the code was a dead end anyway,
the code was unmaintainable or unevolvable (or both), the
code might have been maintainable only by wizards (and those disappeared),
it might have been the legal situation, or something else...

If you look at the source code in the Lisp Machine, it is written
in several dialects and also often with heavy macrology.

Why wasn't it possible to migrate older code to the new dialects?
If that happened it always looked like a major effort or even
a reimplementation. Were macros part of the problem?

If I look at how the software looks now
(not from within the surroundings of a live software project
developing this stuff), I have the feeling that there was too
much hackery at work and that the code was not written enough
to be ported, to be understandable (by non-original authors),
to be debuggable, and so on.

For me, macros violate principles like late binding. They
create static structures that are no longer evolvable. They
make too much happen behind the scenes in ways that are non-obvious.

Take for example an interface builder type of application. If you
represent the interface objects as first class objects you can
modify the interface with an interface builder, since you have
an explicit representation of it and you can work on top of that.

Now take DW or CLIM. One of the main drivers for the interface is the
frame definition macro. It has several drawbacks:

- it has extremely poor checking of its input and does not provide
  you with useful hints about why a certain description produces results
  that are not desirable (layout problems, description grammar violations,
  ...).

- it is non-reversible. You cannot go from an interface back to an editable
  description (-> a macro form that can be changed and evaluated again).

- it introduces all kinds of side effects that are non-reversible or where
  there is no dependency tracking. Example: you define a frame and several
  methods get generated. Later you change that frame definition,
  and all kinds of no-longer-needed methods will NOT be removed.

- the compile-time debugging of such a frame definition is a black art.

- There is no specification of what the frame definition macro does, so it
  is hard for other people to replicate it in their implementations.
  The various CLIM implementations had several incompatible versions
  of the frame definition macro, where you can get to a working
  usage of it only by trial and error.

- it wires dependencies into the resulting code. You can redefine a
  function in a patch and the effect will be visible. You can redefine
  a macro in a patch and the change will NOT be visible in
  user-level code, since the user then needs to recompile his code,
  and it is non-trivial to track dependencies and undo/redo
  side effects.

- the macro definition is not tool friendly. Frame-Up for example
  could easily generate such a frame description, but you couldn't
  take one and edit it in Frame-Up.
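The patch-visibility asymmetry from the redefinition point above can be
seen in a minimal sketch (all names here are hypothetical):

```lisp
;; Sketch of the patch-visibility asymmetry (hypothetical names).
(defmacro frame-title-m () "v1")   ; macro version
(defun frame-title-f () "v1")      ; function version

(defun use-macro () (frame-title-m))  ; expansion "v1" is baked in when compiled
(defun use-fn () (frame-title-f))     ; call is resolved at run time

;; A later "patch" redefines both:
(defmacro frame-title-m () "v2")
(defun frame-title-f () "v2")

;; (use-fn) now returns "v2".  In compiled code, (use-macro) keeps
;; returning "v1" until USE-MACRO itself is recompiled.
```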

So from a Lisp hacking perspective the frame definition macro
is very cool, but from a software engineering perspective (taking
into account maintainability) it is very questionable, since it
only barely does the job and is not developed for robustness and other
demands. Unfortunately there is no established best practice for how
to write maintainable macros at that scope and level. I'm not
talking about the kind of macros that transform a LAMBDA into a LET
expression.
I'm talking about the stuff that has been used in the Lisp Machine software,
like DEFWHOPPERs/WRAPPERs, or the complex macros to define
application frames (or whatever it was called) and similar. The stuff that
has been written with heavy macrology has not been able to evolve
(that's what I see). Why was that? Are there other reasons, or is
there some misuse of macros, some anti-patterns implemented?
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uekalgmj0.fsf@nhplace.com>
Rainer Joswig <······@lisp.de> writes:

> Why wasn't it possible to migrate older code to the new dialects?

What makes you think it wasn't? ;)

I think in a lot of cases, it was copyright that kept it from happening.

MACSYMA was, for example, quickly upgraded for porting.  It was written
in Maclisp/Franz/Zetalisp and I ported the bulk of it (100,000 lines of
code) to Common Lisp in about 3 months.  So there's one data point about
code durability.

But in other cases, there simply wasn't a will for it to happen.  The
issue was not technical.  David Moon told me at one point that Zmacs
couldn't run in Common Lisp because of problems with array-leaders and
special instance variables.  In my "free time" (not a sponsored
project, just something I squeezed in between my other
responsibilities, so it couldn't have been a huge effort) in my last
few months at symbolics (1992), I managed to use macrology to hide
these issues and to port Zmacs successfully to Common Lisp (in that
case, to Symbolics Common Lisp, not raw Common Lisp, but that
difference is minor compared to the gap between Zetalisp+Flavors and
Common Lisp with CLOS, I think) and had it running internally at
Symbolics. [My project, incidentally, was called TRES, for TRES
Replaces Eine's Successor... you always have to have a good name, and
I needed a proper title for the series that began with Eine (Eine Is
Not Emacs) and Zwei (Zwei Was Eine Initially).]  It still used TV
windows, and needed to use CLIM instead, but I considered that a
separate task.  I'm confident the task could have been completed.  Why
wasn't it?  Symbolics was then focused on the VLM (Virtual Lisp
Machine, the emulator product rolled out later on the DEC Alpha) and
didn't care any more about other things.  I considered my project a
"backup" in case the VLM didn't work and they needed to toss the
hardware, but of course I didn't realize I'd be laid off and that my
project would get lost in the dust.  

Nevertheless, these two projects led me to believe that the failure to
port large amounts of the Lisp Machine software was largely an issue of
bad business planning by Symbolics more than a technical artifact.  Experience with
the non-acceptance of CLOE at Symbolics (a project to take a native Lisp,
originally Robertson Common Lisp, RCL, developed by Paul Robertson and
to morph it into a native Lisp on the 386 running Symbolics-like 
functionality) showed up the problem.  The project had pretty good success
given the resources thrown at it (minimal), but was not pushed very hard
by Symbolics because at the time it was thought by the controlling forces
there to be The Wrong Thing and mostly heretical.  (Incidentally, CLOE was
another brainchild of Howard Cannon, who once again showed that he had
a very good theory of what was needed in Lisp, this time not just 
architecturally but business-wise.)  CLOE was picked up by Macsyma Inc.,
which bought MACSYMA as well, and rolled it out on native PC to
escape the demise of Symbolics and to continue on with technology.

At this level, we're talking pretty coarse data.  A few anecdotes, not
a widespread phenomenon, so my observations are perhaps less
systematic.  But here my impression is that what was happening
systematically was a lack of will by Symbolics to invest in trying these
things.  There was a certain hubris that made people believe that only the
hardware was enabling all this magic, and there were some important assists
from the hardware, but ultimately the code was not dependent on the hardware.
And at every step where it was tried, macrology hid the porting issues
and enabled platform-hopping in a natural way.  

So, respectfully, I dispute your apparent assumption that it was not
possible, and I suggest instead that what you infer as lack of possibility
might be equally well explained by business complications.

> If that happened it always looked like a major effort or even
> a reimplementation. Were macros part of the problem?

I think they'd have been part of the solution if it had been tried.

The parts that are often hardest, if you examine the code, are the
function call parts, not the macros.  Rewriting SEND is way harder
than rewriting DEFFLAVOR.  DEFFLAVOR can be rewritten once as a macro
to expand differently and it fixes all the uses of DEFFLAVOR.  Just to
take an example: SEND, by contrast, computes its args oddly and
arguably inefficiently; if you rewrite just the function, you still
have the problem that at the source end, you're computing the second
arg (at runtime) as a keyword, and that (again at runtime) you must
re-convert it to a generic function name.  That's more painful, all in
all, than dealing with the macrology because by their nature, macros are
resolvable at compile time (if our friends in the "runtime macros" thread
going on now at c.l.l. don't accidentally get their way - heh) and
functions are not.

> If I look at how the software looks now
> (not from within the surroundings of a live software project
> developing this stuff), I have the feeling that there was too
> much hackery at work and that the code was not written enough
> to be ported, to be understandable (by non-original authors),
> to be debuggable, and so on.

Orthogonally to the issue of macrology per se, there was also a belief
by some people (for some reason I attribute it, correctly or not, to
Mike McMahon, but I think it was more widespread) that if you followed
all the abstractions, you'd lose efficiency.  I personally felt that 
efficiency lost by abstractions should be gained back in other ways and
that code should be kept clean for longevity's sake.  But there are 
certainly abstraction violations in the name of "speed bums" all over
the place, and those are inhibitors to easy transformation in some cases.
All systems probably suffer from this to some degree, but all the more so
if it's part of the design strategy.  You may be somewhat up against
that as you examine things.  I think you're right that it wasn't written
to be ported, but I don't know why it couldn't have been.  I don't think
it was essential.  I came across some of this in the Zmacs->Tres port, 
and just added abstractions where I saw fit.

I once had a discussion with Dave Moon where he alleged to me that
complex systems were inherently messy and couldn't be coded cleanly.
I alleged otherwise.  The problem here is that complex systems are
hard to write and who has the ability to go test this claim?  Dave had
written many of them, and they were often messy internally here and there.
But he is a prolific creator of robust large systems.  I haven't created
the systems he has, and so I can't say he's wrong.  But I simply believe
he is.  New Flavors had all sorts of messiness in it, and I had occasion
to re-implement it for CLOE and it didn't come out nearly as messy. That's
just one datapoint, and my implementation was not as comprehensive as his,
nor did it have all the error-checking bells and whistles since we were
using it to port already-debugged code, but it supported correct code
efficiently and with clean abstractions.  Does that disprove Dave's claim?
No.  But it gives at least a little credence to the idea that I might
not just be some annoyed loser who's jealous of Moon's ability to create
cool systems and looking for ways to nitpick him.  (I also don't know
if Moon still holds to these beliefs either--it's been quite a while since
we talked.)

> For me, macros violate principles like late binding. They
> create static structures that are no longer evolvable. They
> make too much happen behind the scenes in ways that are non-obvious.

I understand what you're saying here, but consider that the static
structure they create is late-bound.  That is, the bulk of macrology
is often in the use... most macros are not like LOOP or DEFCLASS, which
have a whole huge library supporting them.  And even for those that are,
a lot of the work they're doing is lexical analysis (portable stuff)
not implementation, so even that is probably more robust than you're
giving it credit for.

What I allege most macros do is solve what I'll call the "CORBA problem".
They shield you from idiotic little idioms where every program that wants
to make an object first makes an object factory, then asks it to crank out
a singleton object of a generic type, then tests that it succeeds, then
finally casts it to something that the program can use.  That kind of 
idiomatic stuff, in line in regular code, is a disaster for programming
intelligibility except by idiom recognition.  It's like in old Maclisp 
having to know that (APPEND X '()) meant what we now call (COPY-LIST x),
only worse because it's strung over separate "statements" and hence it's
something a naive programmer, not realizing that three statements
are working in lockstep, might interpose another statement between, making
the thing harder to read.  At least there's less likelihood that someone 
will turn (APPEND X '()) into (APPEND X (PROGN (PRINT Y) '())) when they
can instead write (APPEND X '()) (PRINT Y) so mostly that kind of idiom
doesn't get wrecked.  But macros serve the accidental purpose of making
abstract operations atomic, and easier to manage.

And in the typical case, I think the bulk of macrology is in the use, not
the definition, so macros also tighten code and move a complex set of
operations to a central spot, collapsing lots of 
 (LET ((X NIL))
   (UNWIND-PROTECT (PROGN (SETQ X (OPEN ...))  ... (CLOSE X) (SETQ X NIL))
      (IF X (CLOSE X))))
kinds of things to a single instance of that in one place that is easily
updated when a port occurs, rather than leaving large numbers of them to
update.  So I suggest the opposite of your claim is true.  "Early binding"
would be coding this stuff inline.  Sure, you can turn this into
 (CALL-WITH-OPEN-FILE #'(LAMBDA ...) file)
and then you've hidden the portability, but you've still left cumbersome
syntax that I can't for the life of me find aesthetic. If there is one 
thing I don't understand in the Scheme community, it's the passion for
tolerating this junk.  We used to write tons of it in the late 1970's before
there were macros, but instantly outgrew it when we came up with the idea
of adding:
 (DEFMACRO WITH-OPEN-FILE ((name file) &body forms)
   `(CALL-WITH-OPEN-FILE #'(lambda (,name) ,@forms) ,file))
... and then you're back to macrology, this time not hiding implementation
but hiding syntax tedium.  And what difference does it make at that point
whether it's early or late binding?  One doesn't worry when discussing 
early/late binding in other languages like C or C++ or Pascal that the 
syntax of assignment might change from '=' to ':=' or vice versa.
Why should I worry about the syntax of a macro that uses only
well-understood tools?
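The two styles above can be put side by side in a small self-contained
sketch (the MY- prefixed names are hypothetical, chosen to avoid
clashing with the standard WITH-OPEN-FILE):

```lisp
;; Functional style: one place to update when a port occurs.
(defun call-with-my-open-file (fn file)
  (let ((stream nil))
    (unwind-protect
        (progn (setq stream (open file))
               (funcall fn stream))
      ;; Close the stream even on a non-local exit.
      (when stream (close stream)))))

;; Macro style: hides the syntax tedium, expands into the function.
(defmacro with-my-open-file ((name file) &body forms)
  `(call-with-my-open-file #'(lambda (,name) ,@forms) ,file))
```

Since the macro merely expands into the function call, all the
platform-dependent machinery stays inside CALL-WITH-MY-OPEN-FILE, while
users get the terse WITH- syntax.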

So maybe lumping all macrology into one discussion is bad because there
are several distinct activities (and associated risks) going on here.
Good style in macrology reduces the risk of "change" to as small a place
as possible.  But always the risk is there, macro or function.  And
surely it's better to have syntax abstraction than not?

> Take for example an interface builder type of application. If you
> represent the interface objects as first class objects you can
> modify the interface with an interface builder, since you have
> an explicit representation of it and you can work on top of that.

The interface representations that Symbolics Frame-Up used were much easier
to manipulate than the interface representations that I see Visual
Cafe or Qt or other such tools using these days.  How could
it possibly be easier for a programming system to interpret imperative
commands than declarative commands, when the former has an associated
halting problem in the worst case and the latter can be restricted to
a subset that is reliably syntactically analyzable?

> Now take DW or CLIM. One of the main drivers for the interface is the
> frame definition macro.

Ah, now we get to the meat of the matter.

I'm going to guess ahead of time, before reading further, that the answer  
is going to be "it is possible to do bad design in any system, whether
macro based or not".   Where by "good" and "bad" I mean here "suitable to
the needs of your project", not "God will take you into heaven for doing
good or strike you down for doing bad, independent of context".

> It has several drawbacks:
> 
> - it has extremely poor checking of its input and does not provide
>   you with useful hints about why a certain description produces results
>   that are not desirable (layout problems, description grammar violations,
>   ...).

And a functional interface to the same kind of magic that CLIM tries to do
would fix that?  I don't think so.  It would just make it harder to analyze
the connectivity.

But what is getting you into trouble is not how you assemble the
connectivity; it is the fact that CLIM's job is to take input in one
shape and to give output in another shape, as if by magic, since part
of its design (its "charm" if you will) is that it is allowed a free
hand in how to do this in many cases.

But surely the problem is in the lack of clear mapping between input and
output, and surely this is an intentional part of the design, not an 
accidental artifact of the use of macros to implement it.

> - it is non-reversible. You cannot go from an interface back to an editable
>   description (-> a macro form that can be changed and evaluated again).

That's because you're asking it to undo a full set of  design choices.
CLIM's charter is not to manage a well-known design, it is to create 
a design and then manage it.  Of course, in general, that process is
not undoable. If this is your desire, you don't want CLIM.  And it doesn't
matter if CLIM is expressed in macros or not.

> - it introduces all kinds of side effects that are non-reversible or where
>   there is no dependency tracking. Example: you define a frame and several
>   methods get generated. Later you change that frame definition,
>   and all kinds of no-longer-needed methods will NOT be removed.

These are criticisms of the design of CLIM and/or statements that you are
not CLIM's proposed client.

The same criticisms would be true of a consulting organization that you hire
to maintain your web pages and that uses JavaScript as a solution when you
only know PHP, or that goes server-side when you might want client-side.
When you don't specify how the organization does something, it uses its 
choice, not yours.  If you have a design constraint you want to introduce,
you can indeed do so, but CLIM is a consulting organization that doesn't
offer you that choice.  Different CLIMs may solve the problem differently,
but the whole idea is to make it easy for people who don't want to do certain
kinds of things, not for people who do.  I know this because I've done some
web hosting, and what I offer to people who don't understand HTML is VERY
different than I would offer to people who do... in fact, I just tell people
who do want their own maintenance that I won't support them because I'm not
set up to do that kind of hands-on stuff efficiently--it means someone else
could be introducing bugs I'm not aware of and that I'm not billing for, and
I just don't want the headache.  CLIM has this identical problem.  It gets
power out of not cooperating with you, and offers you something in return.
If you know how to offer that power without that cost, make a new window
system that does both. I think that's great.  But I don't think the use or
non-use of macros will make the difference in your ability to succeed.
What will matter is your coming up with an intelligible collaboration/design
model.

> - the compile-time debugging of such a frame definition is a black art.

When you don't expose how something works, indeed debugging it at the
frame level is hard.  

I'm not 100% sure this is made better by not using macros since all that
does is expose implementation.  If you don't expose all the functions you
use, then you still have unexplained stack frames.  If you do expose them,
then the value you have offered the user is completely documenting your
implementation, not your use of macros.

> - There is no specification of what the frame definition macro does, so it
>   is hard for other people to replicate it in their implementations.
>   The various CLIM implementations had several incompatible versions
>   of the frame definition macro, where you can get to a working
>   usage of it only by trial and error.

Already answered.
 
> - it wires dependencies into the resulting code. You can redefine a
>   function in a patch and the effect will be visible. You can redefine
>   a macro in a patch and the change will NOT be visible in
>   user-level code, since the user then needs to recompile his code,
>   and it is non-trivial to track dependencies and undo/redo
>   side effects.

It depends on how you design the system. This is not intrinsic to macros.
That is, you can write macros that don't have this problem (e.g., if
all the macros can be expressed as functions, you can use the CALL-WITH-
technology I mentioned above).

If you can't convert solely to that, and if
macros are doing something truly syntactic, I don't see how open-coding that
without macros is going to make your code cleaner.  If everything a macro
collapses into one form has to be done as three consecutive magic words,
I don't see how inlining those three magic words makes things better. You
still have to patch all the uses whether you write
 (FOO) (BAR) (BAZ)
or 
 (MACROLOGICAL-FOO-BAR-BAZ)
If you have made your code depend on foo, bar, and baz happening in order
and you change it to (FOO) (BAZ) (BAR) or (FOO) (QUUX) (BAZ), either way you
still have to recompile the same number of forms.  The mention of macros
seems a red herring.
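The escape hatch mentioned above -- a macro that expands into a single
function call -- can be sketched as follows (hypothetical names, after
the FOO/BAR/BAZ example): patching the function reaches even
already-compiled callers, because only the expansion, not the behavior,
is frozen into them.

```lisp
;; Sketch: keep the patchable part in a function behind the macro.
(defun %foo-bar-baz () (list :foo :bar :baz))

(defmacro macrological-foo-bar-baz () `(%foo-bar-baz))

(defun user-code () (macrological-foo-bar-baz))

;; A later patch reorders the steps; USER-CODE sees the change with no
;; recompilation, because only the function was redefined.
(defun %foo-bar-baz () (list :foo :baz :bar))
```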

> - the macro definition is not tool friendly. Frame-Up for example
>   could easily generate such a frame description, but you couldn't
>   take one and edit it in Frame-Up.

That's just a weakness in Frame-Up, but if you believe that you CAN edit 
a procedural description then
 (a) you're not realizing how much users can muck that up 
or
 (b) you plan to not 'understand' the user's description but just execute
     it blindly.
The latter strategy has the minor problem that if there are side-effects
(file deletions, and worse) in the setup code, you can't detect them, and
you might really injure the user by executing them in your editor.

I just don't see that you're offering anything better, nor do I think 
criticizing an implementation as if it were the representative of all
implementations is well-founded.

> So from a Lisp hacking perspective the frame definition macro
> is very cool, but from a software engineering perspective (taking
> into account maintainability) it is very questionable, since it
> only barely does the job and is not developed for robustness and other
> demands. Unfortunately there is no established

documented

> best practice for how
> to write maintainable macros at that scope and level. I'm not
> talking about the kind of macros that transform a LAMBDA into a LET
> expression.
> I'm talking about the stuff that has been used in the Lisp Machine software,
> like DEFWHOPPERs/WRAPPERs, or the complex macros to define
> application frames (or whatever it was called) and similar.

I stand by my earlier guess that some design is good and bad, independent
of the use of macros.

DEFWHOPPER and DEFWRAPPER have pretty simple definitions that are probably
in many cases as conceptually simple as LET when it comes to the issue of
syntax.  Most of what they do is functional.  If one wrote
 (ADD-WHOPPER #'(LAMBDA (X) ...) 'TV:FOO-FLAVOR)
would that suddenly make the issue go away?  I think not.

Hence I am inclined to exonerate macros here.

> The stuff that
> has been written with heavy macrology has not been able to evolve
> (that's what I see). Why was that?

Business priorities.  And in that I include both:

 * lack of will to do what is possible (business planner is confused)

 * lack of desire to do what is possible (business planner has a 
     legitimately different goal and so doesn't want this as starting point)

> Are there other reasons, or is
> there some misuse of macros, some anti-patterns implemented?

A reasonable summary question for you to end on, but I'm not going to
summarize here. Hopefully the above counts as my answer to this question.

I appreciate your taking the time to detail an answer.  Even if we
disagree, I hope you've found my response to have enough substance
that it was worthwhile for you to do all this detail.  It's certainly
a matter about which reasonable people could disagree, but I'll be
interested to see if you find any of my arguments above to be
persuasive.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <irzxf2ib.fsf@ccs.neu.edu>
Kent M Pitman <······@nhplace.com> writes:

> What I allege most macros do is solve what I'll call the "CORBA problem".
> They shield you from idiotic little idioms where every program that wants
> to make an object first makes an object factory, then asks it to crank out
> a singleton object of a generic type, then tests that it succeeds, then
> finally casts it to something that the program can use.  That kind of 
> idiomatic stuff, in line in regular code, is a disaster for programming
> intelligibility except by idiom recognition.  

Agreed.  `Idioms' mean that there is some functionality that the
language is incapable of expressing directly and concisely.

> Sure, you can turn this into
>  (CALL-WITH-OPEN-FILE #'(LAMBDA ...) file)
> and then you've hidden the portability, but you've still left cumbersome
> syntax that I can't for the life of me find aesthetic. If there is one 
> thing I don't understand in the Scheme community, it's the passion for
> tolerating this junk.  We used to write tons of it in the late 1970's before
> there were macros, but instantly outgrew it when we came up with the idea
> of adding:
>  (DEFMACRO WITH-OPEN-FILE ((name file) &body forms)
>    `(CALL-WITH-OPEN-FILE #'(lambda (,name) ,@forms) ,file))
> ... and then you're back to macrology, this time not hiding implementation
> but hiding syntax tedium.  

The advantage (such as it is) of the CALL-WITH-MUMBLE... version is
that you can see directly what is evaluated and what is unevaluated
and what identifiers are being bound.  In the WITH-OPEN-FILE macro,
you have to know that the first thing after the parenthesis is a name
to be bound, but the second thing is evaluated to produce a value to
bind it to (this is the usual order, but sometimes there are two
bindings or sometimes someone is perverse).  In the
CALL-WITH-OPEN-FILE version, you can read the bound variable names
right from the lambda expression.

Personally, I don't find the "call-with-mumble (lambda (" syntax to be
particularly nasty and I like knowing what's going on, but clearly you
prefer the briefer form of the WITH-MUMBLE macro.  

Having the macro *and* the function (where the macro just expands
into the function) seems a more than reasonable compromise.

> So maybe lumping all macrology into one discussion is bad because there
> are several distinct activities (and associated risks) going on here.

I think there are three patterns of usage of macros:

  1.  Ones that are so obviously macros that no one would
      be confused.  The DEFINE-ELEMENT macro from the earlier post
      falls into this category.  Language extensions are in this
      class, too.

  2.  Ones that are so obvious to use that no one has a second thought
      about whether they are macros or not.  CHECK-TYPE is a good
      example of this.  Sure you can't MAPCAR a CHECK-TYPE or pass it
      as an argument, but that would be bizarre anyway.
      WITH-MUMBLE- macros are this type, too.

  3.  Ones that are hints to the compiler for how to compile
      something.  These are conceptually functions that for some
      reason or other the compiler can't handle efficiently.
      Compiler-macros are the right thing here, but they didn't always
      exist.
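Pattern 3 can be sketched with the standard DEFINE-COMPILER-MACRO
(MY-POWER is a hypothetical example function, not from the post):

```lisp
;; MY-POWER stays an ordinary function, so it can still be passed to
;; MAPCAR; the compiler macro merely hints how to compile one common case.
(defun my-power (base exponent)
  (expt base exponent))

(define-compiler-macro my-power (&whole form base exponent)
  (if (eql exponent 2)
      (let ((b (gensym "BASE")))
        `(let ((,b ,base)) (* ,b ,b)))   ; inline multiply for a literal square
      form))                             ; otherwise decline: leave the call alone
```

Because a compiler macro may always decline by returning the original
form, the function remains the source of truth for the semantics.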

~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9pyn8r.fsf@nhplace.com>
Joe Marshall <···@ccs.neu.edu> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > What I allege most macros do is solve what I'll call the "CORBA problem".
> > They shield you from idiotic little idioms where every program that wants
> > to make an object first makes an object factory, then asks it to crank out
> > a singleton object of a generic type, then tests that it succeeds, then
> > finally casts it to something that the program can use.  That kind of 
> > idiomatic stuff, in line in regular code, is a disaster for programming
> > intelligibility except by idiom recognition.  
> 
> Agreed.  `Idioms' mean that there is some functionality that the
> language is incapable of expressing directly and concisely.

They also define the line between what I call "expressive languages"
and "implementation languages", which I might loosely call,
respectively, "high-level languages" and "low-level languages",
although others have different definitions for the latter pair of
terms.  The expressive/high-level languages are about saying
something, and are places where how you write something matters
aesthetically, presentationally, maintenance-wise, etc. while the
implementation/low-level languages tend to be more write-only and/or
places where the presentation per se is secondary to getting a correct
answer.

> Personally, I don't find the "call-with-mumble (lambda (" syntax to be
> particularly nasty and I like knowing what's going on, but clearly you
> prefer the briefer form of the WITH-MUMBLE macro.  

Among other things, it indents better.  That matters to me hugely.
And it's less full of words that serve no purpose.  That is, the word
LAMBDA is useful in a non-idiomatic place because it identifies an
anonymous function.  But in an idiom, where I already know what's going
on, it's just clutter.  In a sense, I'd like to write
  (mapcar ((x) ..) ...)
because I already know that LAMBDA is so common there that I could leave
it out and a human would know what I meant (if he didn't come from Scheme
and think that (x) could mean a function call to x--I'm being somewhat
abstract here, not literal about what you can really leave out according
to real syntax rules, in other words).  

My point is that I use a certain rule of presentational aesthetic that says
that a program is expressed approximately correctly if it uses no more words
than you'd need in English.  So you write
  MAP CAR ACROSS A LIST.
and you see
 (mapcar #'car list)
and you think that's about right.  I'm not talking about being literal in
the same number of words, but same order of magnitude, and no words that are
just gratuitous.  And when you say open a file "foo"
and read its first line, then
 (with-open-file (f "foo") (read-line f))
seems about right, but
 (call-with-open-file (lambda (f) (read-line f)) "foo")
has an extra word, lambda, that isn't needed in the with-open-file
scenario. That extra word offers flexibility, but in the common case it
just isn't needed, so I consider the with-open-file solution closer to
optimal.

There exists no formal formulation of expressiveness distinct from mere
Turing computability, but I think if there were, it would have to take into
account, as at least one of probably several useful metrics, this notion 
of comparing numbers of tokens.  Neither shortness nor fewer words is an
absolute measure of good, but each is a potentially legitimate measure that
bears study.

> Having the macro *and* the function (where the macro just expands
> into the function) seems a more than reasonable compromise.
> 
> > So maybe lumping all macrology into one discussion is bad because there
> > are several distinct activities (and associated risks) going on here.
> 
> I think there are three patterns of usage of macros:
> 
>   1.  Ones that are so obviously macros that no one would
>       be confused.  The  DEFINE-ELEMENT  macro from the earlier post
>       falls into this category.  Language extensions are in this
>       class, too.
> 
>   2.  Ones that are so obvious to use that no one has a second thought
>       about whether they are macros or not.  CHECK-TYPE is a good
>       example of this.  Sure you can't MAPCAR a CHECK-TYPE or pass it
>       as an argument, but that would be bizarre anyway.
>       WITH-MUMBLE- macros are this type, too.
> 
>   3.  Ones that are hints to the compiler for how to compile
>       something.  These are conceptually functions that for some
>       reason or other the compiler can't handle efficiently.
>       Compiler-macros are the right thing here, but they didn't always
>       exist.

Joe probably knows this, but other readers might want to review my
special forms paper for a study of what macros are used for (since in
that paper, macros were defined as a subset of special forms, even
though we have since shifted the terminology to make macros and
special forms mean different things... sometimes disjoint, except...)
http://www.nhplace.com/kent/Papers/Special-Forms.html

- - - - 
but on the issue of subtlety of seemingly harmless syntax variations ...

I recall a specific example from the design of Dylan, where they killed
Dylan with syntax. I don't even remember the exact symbol syntax they
picked, but it was like #"foo" or something.  The problem is that a list
looked like {#"how", #"are", #"you"}. You can say this is a minor deal,
but compare it to Lisp's (how are you) and ask yourself: would Weizenbaum's
ELIZA have been as compelling if it had caught people up in the issues of
Lisp syntax?  The whole thing that caught my eye when I first learned Lisp
was that it was like he had programs with English in them.
  (cond ((equal sentence '(how are you?)) ...))
just LOOKS LIKE code with English embedded.
  if [sentence={#"how", #"are", #"you?"}] 
or whatever does NOT look like code with English embedded. It looks like 
code that is manipulating tokens based on English words, not English
sentences wholesale. I think such distinctions matter in any decent
theory of aesthetics.  Those #'s and whatnot scream out to the user
"there is syntax here. go away. you don't know what you're doing,"
while the Lisp notation screamed out "this is syntax free. get involved."
That once mattered greatly to me as I got involved in Lisp... I think it
still matters.

Anyone who says otherwise certainly cannot explain the rules of good 
typography, all of which hinge on much tinier aesthetic shifts than any
I've mentioned here.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1x6kgbyq.fsf@ccs.neu.edu>
> Joe Marshall <···@ccs.neu.edu> writes:
>
>> Personally, I don't find the "call-with-mumble (lambda (" syntax to be
>> particularly nasty and I like knowing what's going on, but clearly you
>> prefer the briefer form of the WITH-MUMBLE macro.  

Kent M Pitman <······@nhplace.com> writes:
> Among other things, it indents better.  That matters to me hugely.

That's easy to fix in emacs.

> And it's less full of words that serve no purpose.  That is, the word
> LAMBDA is useful in a non-idiomatic place because it identifies an
> anonymous function.  But in an idiom, where I already know what's going
> on, it's just clutter.  In a sense, I'd like to write
>   (mapcar ((x) ..) ...)
> because I already know that LAMBDA is so common there that I could leave
> it out and a human would know what I meant (if he didn't come from Scheme
> and think that (x) could mean a function call to x--I'm being somewhat
> abstract here, not literal about what you can really leave out according
> to real syntax rules, in other words).  

I agree.  I wish there were some sort of `thunk' syntax that was more
concise than "(lambda ", but I haven't seen any good options.  (It
shouldn't be a reader macro or require weird characters, and it had
better be shorter than 4 chars or so.)
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7imzp8gb4r.fsf@lanthane.pps.jussieu.fr>
> I agree.  I wish there were some sort of `thunk' syntax that was more
> concise than "(lambda ", but I haven't seen any good options.  (It
> shouldn't be a reader macro or require weird characters, and it had
> better be shorter than 4 chars or so.)

  (defmacro \\ (args &body body)
    (let ((args (if (listp args) args (list args))))
      `#'(lambda ,args ,@body)))

                                        Juliusz
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <Ao2dnSrQPYTKcVjfRVn-qg@speakeasy.net>
Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
+---------------
|   (defmacro \\ (args &body body)
|     (let ((args (if (listp args) args (list args))))
|       `#'(lambda ,args ,@body)))
+---------------

I tend to prefer the Scheme-style default for the non-list arg case:

    > (defmacro \\ (args &body body)
        `(lambda ,(if (listp args) args (list '&rest args)) ,@body))

    |\\|
    > (mapcar (\\ x (+ (third x) (* 2 (second x))))
	      '(1 2 3) '(4 5 6) '(7 8 9))

    (15 18 21)
    > 

But that's just me...  ;-}


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <f86dnXf-bNzgbFjfRVn-ig@speakeasy.net>
p.s. I just wrote:
+---------------
| I tend to prefer the Scheme-style default for the non-list arg case:
| (defmacro \\ (args &body body)
|   `(lambda ,(if (listp args) args (list '&rest args)) ,@body))
+---------------

I forgot to mention that I actually prefer ML's name "FN" for a
LAMBDA abbreviation, since it's the same number of keystrokes as "\\"
but doesn't print weirdly:
  
    > (defmacro fn (args &body body)
        `(lambda ,(if (listp args) args (list '&rest args)) ,@body))

    FN
    > (mapcar (fn (x) (+ 3 x)) '(3 4 5))

    (6 7 8)
    > 


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Marcus Breiing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ee0fuamq0x1wj@breiing.com>
* Rob Warnock

> I forgot to mention that I actually prefer ML's name "FN" for a
> LAMBDA abbreviation, since it's the same number of keystrokes as
> "\\" but doesn't print weirdly
  
I use the FN name, but not the Scheme lambda lists.  It seems that
macros really *do* fragment the language:-)

For another dimension of potentially divergent design, my version adds
an IGNORABLE declaration for arguments beginning with "_":

(defmacro fn (args &body body)
  (let* ((args (if (listp args) args (list args)))
         (ign (remove-if-not (lambda (v)
                               (and (not (equal (symbol-name v) ""))
                                    (eql (elt (symbol-name v) 0) #\_)))
                             args)))
    `(lambda
      ,args
      ,@(and ign `((declare (ignorable ,@ign))))
      ,@body)))
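
For illustration, a hypothetical call to the FN above; the leading
underscore marks _y as ignorable, so binding it without using it draws
no warning:

```lisp
;; _y is bound but never used; FN's IGNORABLE declaration
;; suppresses the unused-variable warning.
(mapcar (fn (x _y) (* x x)) '(1 2 3) '(10 20 30))
;; => (1 4 9)
```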

Marcus
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1vadnbQcGICg11vfRVn-pw@speakeasy.net>
Marcus Breiing  <······@2005w20.mail.breiing.com> wrote:
+---------------
| * Rob Warnock
| > I forgot to mention that I actually prefer ML's name "FN" for a
| > LAMBDA abbreviation, since it's the same number of keystrokes as
| > "\\" but doesn't print weirdly
|   
| I use the FN name, but not the Scheme lambda lists.  It seems that
| macros really *do* fragment the language:-)
+---------------

They do, which is why I noted that I only use these abbrevs myself
directly in the REPL, and not in permanent code.

+---------------
| For another dimension of potentially divergent design, my version adds
| an IGNORABLE declaration for arguments beginning with "_":
+---------------

O.k., then, even though the OP said "no readmacros", I guess now
I have to show you my ugly #$ readmacro in all of its grossness:  ;-}

;;; SET-SHARP-DOLLAR-READER -- Experimental LAMBDA abbreviation (#1 of 2).
;;; SYNTAX: #$FORM
;;; An abbreviation of: (lambda (&optional $1 $2 $3 $4 $5 $6 $7 $8 $9 &rest $*)
;;;                       FORM)
;;; Within the FORM, args $1 ... $9 and $* are lambda-bound as positional
;;; and &REST parameters, respectively. Usually, but not always, FORM will be
;;; an S-expr, e.g. #$(car $3), but this is legal: #$FOO ==> (lambda () FOO),
;;; that is, (CONSTANTLY FOO). Likewise, #$$3 ==> #'THIRD.
;;;
;;; As a convenience for interactive use, in the special case that FORM is a
;;; list and (car FORM) is also a list, then an implicit PROGN is provided,
;;; e.g., #$((foo) (bar)) ==> (lambda (args...) (foo) (bar)).
;;;
(defun set-sharp-dollar-reader ()
  (flet ((sharp-dollar-reader (s c p)
           (declare (ignore c p))
           (let* ((form (read s t nil t)))
             `(lambda (&optional $1 $2 $3 $4 $5 $6 $7 $8 $9 &rest $*)
                (declare (ignorable $1 $2 $3 $4 $5 $6 $7 $8 $9 $*))
                ,@(if (and (consp form) (consp (car form)))
                    form
                    (list form))))))
    (set-dispatch-macro-character #\# #\$ #'sharp-dollar-reader)))


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <wto8n8h1.fsf@comcast.net>
Marcus Breiing <······@2005w20.mail.breiing.com> writes:

> I use the FN name, but not the Scheme lambda lists.  It seems that
> macros really *do* fragment the language:-)

That might be why I haven't seen anything better than the plain old
boring "(lambda (x y ....) ...)".  Everyone seems to have something
perhaps slightly better, but they are all different.  The long lambda
form may be somewhat clumsy, but it works right out of the box
everywhere.

-- 
~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vuarhm.fsf@nhplace.com>
····@rpw3.org (Rob Warnock) writes:

> Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
> +---------------
> |   (defmacro \\ (args &body body)
> |     (let ((args (if (listp args) args (list args))))
> |       `#'(lambda ,args ,@body)))
> +---------------
> 
> I tend to prefer the Scheme-style default for the non-list arg case:
> 
>     > (defmacro \\ (args &body body)
>         `(lambda ,(if (listp args) args (list '&rest args)) ,@body))

Maybe you'd like

(defmacro \\ (args &body body)
  `(lambda ,(do ((args args (cdr args))
                 (temp '() (cons (car args) temp)))
                ((atom args)
             (if (null args)      ; was it a proper list?
                 (nreverse temp)  ; if so, just reverse saved args
                 (nreconc temp    ; otherwise, reverse most and add tail:
                          (list '&rest args)))))
     ,@body))

even better.
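
For what the dotted-tail handling buys, a sketch of the expansions (my
examples, not Kent's):

```lisp
;; Proper list: passed through unchanged.
;;   (\\ (a b) (+ a b))            => (lambda (a b) (+ a b))
;; Dotted tail: the tail becomes an &REST parameter.
;;   (\\ (a . more) (cons a more)) => (lambda (a &rest more) (cons a more))
;; Bare symbol (an atom): the whole lambda list becomes &REST.
;;   (\\ args args)                => (lambda (&rest args) args)
```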
 
>     |\\|
>     > (mapcar (\\ x (+ (third x) (* 2 (second x))))
> 	      '(1 2 3) '(4 5 6) '(7 8 9))
> 
>     (15 18 21)
>     > 
> 
> But that's just me...  ;-}

My problem is that in Maclisp, \\ meant fixnum-only GCD.
But I'm probably one of the few that is bothered by that. ;)
Not to be confused with \, which was fixnum-only remainder.
(These were a lot of "fun" to convert by query-replace 
in 100,000 lines of MACSYMA code...)
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <64vundua.fsf@comcast.net>
Kent M Pitman <······@nhplace.com> writes:

> My problem is that in Maclisp, \\ meant fixnum-only GCD.
> But I'm probably one of the few that is bothered by that. ;)
> Not to be confused with \, which was fixnum-only remainder.

I think it was still in the LispM, so I wasn't too keen on \\ being a
substitute for lambda.

-- 
~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ubr5mvsr2.fsf@nhplace.com>
Joe Marshall <·············@comcast.net> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > My problem is that in Maclisp, \\ meant fixnum-only GCD.
> > But I'm probably one of the few that is bothered by that. ;)
> > Not to be confused with \, which was fixnum-only remainder.
> 
> I think it was still in the LispM, so I wasn't too keen on \\ being a
> substitute for lambda.

Nah--there you're supposed to use backspace.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9lgkhf.fsf@nhplace.com>
Kent M Pitman <······@nhplace.com> writes:

 > Joe Marshall <·············@comcast.net> writes:
> 
> > Kent M Pitman <······@nhplace.com> writes:
> > 
> > > My problem is that in Maclisp, \\ meant fixnum-only GCD.
> > > But I'm probably one of the few that is bothered by that. ;)
> > > Not to be confused with \, which was fixnum-only remainder.
> > 
> > I think it was still in the LispM, so I wasn't too keen on \\ being a
> > substitute for lambda.
> 
> Nah--there you're supposed to use backspace.

Geez, is no one gonna post a "huh?" here.

Ok, ok, so no one's reading my posts. :(   Well, for the sake of the
occasional web archeologist who runs across my posts later and wishes
he/she knew what I was babbling about:

ASCII Backspace has the same character code as Lambda in the old SAIL
(Stanford AI Lab) character set, which was the basis of the Lisp Machine's
character set.  So when you used the LispM's lambda character (which was
the lowercase Greek letter) and then you edited your file on a conventional
processor using Emacs, you generally saw (^H ...).
From: Thomas F. Burdick
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <xcvll4pqajx.fsf@conquest.OCF.Berkeley.EDU>
Kent M Pitman <······@nhplace.com> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
>  > Joe Marshall <·············@comcast.net> writes:
> > 
> > > Kent M Pitman <······@nhplace.com> writes:
> > > 
> > > > My problem is that in Maclisp, \\ meant fixnum-only GCD.
> > > > But I'm probably one of the few that is bothered by that. ;)
> > > > Not to be confused with \, which was fixnum-only remainder.
> > > 
> > > I think it was still in the LispM, so I wasn't too keen on \\ being a
> > > substitute for lambda.
> > 
> > Nah--there you're supposed to use backspace.
> 
> Geez, is no one gonna post a "huh?" here.
> 
> Ok, ok, so no one's reading my posts. :(

Nah, you've just been partially outdated by Google :-)

I read that, thought, "Huh, how could backspace be lambda?"
The summary of the first google hit for lisp lambda backspace reads:

  In ASCII files, lambda became the same character as backspace (10 octal), ...
  The Lisp Machine's 8-bit character set included a separate backspace character ...

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | Free Mumia Abu-Jamal! |
     ,--'    _,'   | Abolish the racist    |
    /       /      | death penalty!        |
   (   -.  |       `-----------------------'
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1x6gon9d.fsf@comcast.net>
Kent M Pitman <······@nhplace.com> writes:

> So when you used the LispM's lambda character (which was
> the lowercase Greek letter) and then you edited your file on a conventional
> processor using Emacs, you generally saw (^H ...).

Who would use a conventional processor if there were a LispM around?



-- 
~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uslywreur.fsf@nhplace.com>
Joe Marshall <·············@comcast.net> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > So when you used the LispM's lambda character (which was
> > the lowercase Greek letter) and then you edited your file on a conventional
> > processor using Emacs, you generally saw (^H ...).
> 
> Who would use a conventional processor if there were a LispM around?

When I worked at the Lab for Computer Science, my main work was done on 
ITS and I only guested on Lisp Machines when others weren't using them.

But even so, later on when I had a dedicated LispM at work, I'd often
net in from home.  The Lisp Machine allowed telnet and supdup connections, 
but gave you a Lisp toplevel when you arrived.  It didn't have a good
file editor that worked over a remote connection--it was just never a 
priority to create.  So mostly one would just copy the file to a TOPS-20
or Unix system (which was usually what you dialed up to hop to the LispM
anyway; most people didn't call their LispMs directly), and then you'd use Emacs.
From: Christophe Rhodes
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <sqr7egx2kr.fsf@cam.ac.uk>
Kent M Pitman <······@nhplace.com> writes:

> Kent M Pitman <······@nhplace.com> writes:
>> Nah--there you're supposed to use backspace.
>
> Geez, is no one gonna post a "huh?" here.
>
> Ok, ok, so no one's reading my posts. :( 

I'm reading, but to be honest, for me the minutiae of 20-year-old lisp
environments aren't so riveting as to warrant a request to explain
what is obviously an in-joke.  Feel free to amuse yourself, but don't
expect too much of a reaction from people who weren't there at the
time.

Christophe
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9krdxr.fsf@nhplace.com>
Christophe Rhodes <·····@cam.ac.uk> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > Kent M Pitman <······@nhplace.com> writes:
> >> Nah--there you're supposed to use backspace.
> >
> > Geez, is no one gonna post a "huh?" here.
> >
> > Ok, ok, so no one's reading my posts. :( 
> 
> I'm reading, but to be honest, for me the minutiae of 20-year-old lisp
> environments aren't so riveting as to warrant a request to explain
> what is obviously an in-joke.  Feel free to amuse yourself, but don't
> expect too much of a reaction from people who weren't there at the
> time.

Oh, it's ok.  I was just kidding.  And this is a very reasonable attitude.

Sometimes I make these jokes less as jokes and more as bookmarks to remind
me to expound more later.  They're a kind of "continuation marker".  So,
continuing...

On a more serious note, the reason I raise this is that this
incredibly bad compatibility mapping is what I think ultimately kept
users from moving forward.  A lot of people felt Lambda (the
character, not the word) was the right name for this, and there it was
a nice short character, but the compatibility aspects were terrible.
So, as nearly as I can tell, the idea died there.

Almost every effort on the part of the LispM to do cool things with
extended characters died an early death not because the ideas weren't
cool but because the attention to file compatibility was not made, and
ultimately there was no way to even publish the ideas to other people
in a readable form.

The LispM had cool functionality on Sharpsign Altmode (which printed
as a kind of diamond-shaped character, like something you'd find in
APL) which was a simple infix parser.  It had some very nice
indentation control things for FORMAT on Tilde right-pointing-arrow
(ASCII control-y, if I remember right) and Tilde left-pointing-arrow
(ASCII control-x, if I remember right, though I might have them
flipped), but again those characters showed up poorly.  As I recall,
right and left horseshoe characters were used as braces in some
contexts (perhaps in version-controlled pathnames?).

There was a big fight in pathnames about whether to use the 'first colon'
convention as a separator between "host name" and "host-specific data"
in pathnames.  Ultimately, first colon won out, and that's used in some
other platforms now too, I think, though it is not part of the CL standard (pity).
But for a while, there was a push to instead use a character that didn't
occur in other file systems, so it would be easier to tell.  
Bernie Greenberg (architect of one of the LispM file systems) whimsically
suggested the use of a new character he imagined would look like a chalice
(mnemonic for being the "host" character -- religious humor).  Probably just
as well that wasn't done...  Lispm pathnames might not have been as popular 
if they had been mired in non-portable syntax, and might have had less
influence on CL.  (Some might consider that a virtue, but I think it would
have been a net loss.)

(Early TECO suffers similarly, because it used funny characters that 
print lousily on printers since that time, so even though files exist 
full of TECO code, it's hard for anyone any more to visualize what these
files looked like in those days.)

So yes, these are all trivia about old systems.  But they explain a
lot about today and how it came to be, and why it's not different than
some might imagine it would have been.
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87mzp4y9n7.fsf@thalassa.informatimago.com>
Kent M Pitman <······@nhplace.com> writes:
> On a more serious note, the reason I raise this is that this
> incredibly bad compatibility mapping is what I think ultimately kept
> users from moving forward.  A lot of people felt Lambda (the
> character, not the word) was the right name for this, and there it was
> a nice short character, but the compatibility aspects were terrible.
> So, as nearly as I can tell, the idea died there.

It is the right name.  And happily, Emacs is easily configurable
to display a λ for every occurrence of lambda.


> Almost every effort on the part of the LispM to do cool things with
> extended characters died an early death not because the ideas weren't
> cool but because the attention to file compatibility was not made, and
> ultimately there was no way to even publish the ideas to other people
> in a readable form.

Nowadays we could just use Unicode.


> [...]
> So yes, these are all trivia about old systems.  But they explain a
> lot about today and how it came to be, and why it's not different than
> some might imagine it would have been.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Small brave carnivores
Kill pine cones and mosquitoes
Fear vacuum cleaner
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86fyuwhj09.fsf@drjekyll.mkbuelow.net>
Christophe Rhodes <·····@cam.ac.uk> writes:

>>> Nah--there you're supposed to use backspace.
>> Geez, is no one gonna post a "huh?" here.
>> Ok, ok, so no one's reading my posts. :( 
>
>I'm reading, but to be honest, for me the minutiae of 20-year-old lisp
>environments aren't so riveting as to warrant a request to explain
>what is obviously an in-joke.  Feel free to amuse yourself, but don't
>expect too much of a reaction from people who weren't there at the
>time.

Plus, considering that Emacs uses backspace for "help", there's little
that still surprises people in that direction...

mkb.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uwto8u1hk.fsf@nhplace.com>
Matthias Buelow <···@incubus.de> writes:

> Christophe Rhodes <·····@cam.ac.uk> writes:
> 
> >>> Nah--there you're supposed to use backspace.
> >> Geez, is no one gonna post a "huh?" here.
> >> Ok, ok, so no one's reading my posts. :( 
> >
> >I'm reading, but to be honest, for me the minutiae of 20-year-old lisp
> >environments aren't so riveting as to warrant a request to explain
> >what is obviously an in-joke.  Feel free to amuse yourself, but don't
> >expect too much of a reaction from people who weren't there at the
> >time.
> 
> Plus, considering that Emacs uses backspace for "help", there's little
> that still surprises people in that direction...

Well, at least that makes some sense to those who know Backspace = Control-H.
(Maybe people thought I was telling Joe he should "get help".)
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <irzsn02e.fsf@comcast.net>
Kent M Pitman <······@nhplace.com> writes:

> Well, at least that makes some sense to those who know Backspace = Control-H.
> (Maybe people thought I was telling Joe he should "get help".)

As if I hadn't heard *that* before.

-- 
~jrm
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86u0jcpt0u.fsf@drjekyll.mkbuelow.net>
Kent M Pitman <······@nhplace.com> writes:

>> Plus, considering that Emacs uses backspace for "help", there's little
>> that still surprises people in that direction...
>
>Well, at least that makes some sense to those who know Backspace = Control-H.
>(Maybe people thought I was telling Joe he should "get help".)

Yes.. of course. And thankfully, it can be changed in seconds in
emacs. I was just hinting at the otherworldly decision of making C-h
the default help key even though it collides with backspace.

mkb.

[I have help on C-x?, and f1 and explicitly bound C-h to
backward-delete-char.]
From: ········@yahoo.com
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1120409203.491118.234340@z14g2000cwz.googlegroups.com>
Matthias Buelow wrote:
> Christophe Rhodes <·····@cam.ac.uk> writes:
>
> >>> Nah--there you're supposed to use backspace.
> >> Geez, is no one gonna post a "huh?" here.
> >> Ok, ok, so no one's reading my posts. :(
> >
> >I'm reading, but to be honest, for me the minutiae of 20-year-old lisp
> >environments aren't so riveting as to warrant a request to explain
> >what is obviously an in-joke.  Feel free to amuse yourself, but don't
> >expect too much of a reaction from people who weren't there at the
> >time.
>
> Plus, considering that Emacs uses backspace for "help", there's little
> that still surprises people in that direction...
>
> mkb.

Ah, that OLD stuff again, eh?

Or is what we have today but a stale imitation, a pretension if you
will, toward what came before?

In order to know the answer to that rather important question, one
must learn something of what came before - or else not know, and be
doomed to accept the environments of the present as the unquestioned
status quo.

Jim
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <868y0nfy9g.fsf@drjekyll.mkbuelow.net>
········@yahoo.com writes:

>Or is what we have today but a stale imitation, a pretension if you
>will,
>towards what came before.

Yes, if you like to live backwards in time.

mkb.
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42C8661F.7BB80E2F@freenet.de>
Matthias Buelow schrieb:
> ········@yahoo.com writes:

> >Or is what we have today but a stale imitation, a pretension if you
> >will,
> >towards what came before.

> Yes, if you like to live backwards in time.

It is more an attitude of keeping the lispy way alive. Looking at
where Lisp came from may keep it from going where everything else
goes. That is a value some may laugh at, but there are good reasons
for having something different. The status of Lisp is unlike that of
any other programming language, and I really don't want that point to
get lost in the coming days, months, and years. (But yes, sometimes it
is boring reading all these old adventures when, three clicks away,
something fantastic is happening, hm?)

stefan

train your tolerance
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86pstze7mr.fsf@drjekyll.mkbuelow.net>
lin8080 <·······@freenet.de> writes:

>It is more from that kind: keep the lispy way alive. So looking from
>where Lisp come from may prevent to go where the rest goes. This is a
>value one may laugh about but there are some good reasons, having
>something else. The status of lisp is different from any other kind of
>programming language and really, I don't want that point gets lost in
>coming days, months, years. (but yes, sometimes it is boring reading all
>these old adventures, when 3clicks away something fantastic happens,

Well. One should also be able to let go of unimportant details which
no longer make sense when the environment changes. Otherwise you'll
carry an unproductive, or sometimes (in the new environment)
counterproductive, burden of anachronistic ballast. That said, I was
talking specifically about Emacs' C-h for "help", which might have
made sense in Emacs' original habitat but is certainly
counterproductive on Unix, or any other system that uses the ASCII
character set (or a superset thereof) for terminal input.

mkb.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9jtmdz.fsf@nhplace.com>
Matthias Buelow <···@incubus.de> writes:

> Well. One should also be able to let go of unimportant details which
> no longer make sense when the environment changes. Otherwise you'll
> carry on an unproductive, or sometimes -in the new environment-
> counterproductive burden of anachronistic ballast. That said, I was
> talking specifically about Emacs' C-h for "help", which might have
> made sense in Emacs' original habitat but certainly is
> counterproductive on Unix, or any other system that uses the ASCII
> character set (or superset thereof) for terminal input.

Especially curious because this is an archeologically recent change.

On ITS (PDP-10), where Emacs was developed, we had a Help key (on
ASCII keyboards, a two-character sequence, ^_ H, was translated at the
operating system level to a character not in ASCII).  On the Knight TV
systems used by the AI Lab, Help was its own key.

On DEC Tops-20, later, I think just ^_ itself was used.

It wasn't until the Linux implementation that control-h was chosen.
It never was that in its "original habitat", for exactly the reasons 
you cite--ascii confusion.

I don't know the rationale.  It never made sense to me at any time
for the reasons you cite.  I think the march of history is not the 
problem here.  It was just a bad choice.
From: Christopher Browne
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3vf3r5hn9.fsf@mobile.int.cbbrowne.com>
Centuries ago, Nostradamus foresaw when Kent M Pitman <······@nhplace.com> would write:
> It wasn't until the Linux implementation that control-h was chosen.
> It never was that in its "original habitat", for exactly the reasons 
> you cite--ascii confusion.

I have to disagree.

I saw it on Ultrix in the late 1980s, before Linux emerged.
-- 
(format nil ···@~S" "cbbrowne" "gmail.com")
http://linuxdatabases.info/info/slony.html
Pound for pound, the amoeba is the most vicious animal on earth.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u1x6e8m5d.fsf@news.dtpq.com>
Christopher Browne <········@acm.org> writes:

> Centuries ago, Nostradamus foresaw when Kent M Pitman <······@nhplace.com> would write:
> > It wasn't until the Linux implementation that control-h was chosen.
> > It never was that in its "original habitat", for exactly the reasons 
> > you cite--ascii confusion.
> 
> I have to disagree.
> 
> I saw it on Ultrix in the late 1980s, before Linux emerged.

GNU Emacs was based on the source code of an Emacs
written by James Gosling around 1981; the first
release of GNU Emacs was around 1985.  These ran
on operating systems that were still called Unix,
and long predated the advent of Linux.
DEC had a version of Unix called Ultrix.
I think Kent was just being sloppy when referring
to "Linux", and he just meant "Unix".

I thought Kent's point was that the "original habitat" 
of Emacs was not any Unix-like operating system,
but rather was on (in order) ITS, Multics, and Lisp Machines.  
In particular, I believe he was referring to ITS, 
which ran the original EMACS -- RMS is the author of 
both that EMACS and of GNU Emacs.   As Kent explained,
on ITS Emacs the Help key was not control-H.
On the special keyboards at the lab, it was the key
labeled HELP, and on the ASCII terminals of the day
Help was an awkward two-character control sequence
beginning with control-underscore.
Whereas on GNU Emacs, Help was always control-H.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uk6k77zrb.fsf@news.dtpq.com>
Kent M Pitman <······@nhplace.com> writes:

> Matthias Buelow <···@incubus.de> writes:
> 
> > I was talking specifically about Emacs' C-h for "help", which
> > might have made sense in Emacs' original habitat but certainly is
> 
> It wasn't until the Linux implementation that control-h was chosen.
> It never was that in its "original habitat", for exactly the reasons 
> you cite--ascii confusion.
> 
> I don't know the rationale.  It never made sense to me at any time
> for the reasons you cite.  I think the march of history is not the 
> problem here.  It was just a bad choice.

For a user in the Emacs mindset, it is an excellent choice,
as there could be nothing more mnemonic than "H" for "Help".
And, in fact, BACKSPACE is delete-backward-char, not "Help".
(This even works in terminal windows, even over ssh.)
The confusion was only on ASCII terminal keyboards,
not on PCs or workstations.

However, at the time when the decision was made, most of the 
keyboards in use were ASCII terminals, and this was an issue.
This was before X Windows was widely available, and I am pretty 
sure that RMS was himself just using an ASCII terminal (as I recall).
I don't know whether he was thinking about the coming prevalence of
modern keyboards that distinguished BACKSPACE from ^H or whether he
just didn't care.  He had just come from an environment where many people,
including himself, had been using the more sophisticated kinds of keyboards.

Emacs is my favorite editor, but I was going to write something about
the subtle arrogance of the Emacs mindset.  But then I realized that
the ^H problem was not real: users today don't know the ASCII control
characters, and the key labeled BACKSPACE doesn't send ^H (since it
doesn't "send" anything at all.)
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86wto6vfa6.fsf@drjekyll.mkbuelow.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

>the subtle arrogance of the Emacs mindset.  But then I realized that
>the ^H problem was not real: users today don't know the ASCII control
>characters, and they key labeled BACKSPACE doesn't send ^H (since it
>doesn't "send" anything at all.)

This is not true: even today, you have about a 50% chance that xterm
on any given system will send either ^H or ^?. On some systems, ^? is
the default, on others it's ^H (and on text consoles and real
terminals, it's a similar issue). Using ^H for anything except
backspace in a given program that accepts terminal input is really a
bad choice. On X11, this doesn't manifest itself as much because the
program gets X events, of course. But even there I occasionally type
^H instead of backspace, and expect them to be the same (simply
because of habit; in the same way that I sometimes use ^M instead of
<Return> and also expect it to work.) Thankfully one can rebind keys
in (X)Emacs so the "problem" is a non-issue after a quick change but
still the default is annoying.

mkb.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vq8mz1.fsf@news.dtpq.com>
Matthias Buelow <···@incubus.de> writes:

> ······@news.dtpq.com (Christopher C. Stacy) writes:
> 
> >the subtle arrogance of the Emacs mindset.  But then I realized that
> >the ^H problem was not real: users today don't know the ASCII control
> >characters, and they key labeled BACKSPACE doesn't send ^H (since it
> >doesn't "send" anything at all.)
> 
> This is not true.. even today, you have about a 50% chance that xterm
> on any given system will send either ^H or ^?. On some systems, ^? is
> the default, on others it's ^H (and on text consoles and real
> terminals, it's a similar issue).

Perhaps I was not clear.  

You mention "real terminals" -- what's that, a VT100?  
My whole point was that it was a problem on VT100s, 
but we're not talking about museum pieces, 
we're talking about what most people run Emacs on today.

On PCs (Windows, MacOS) and (Unix-based) workstations today,
the user interface is not a separate ASCII keyboard terminal;
rather, the keyboard is a separate device attached by a kind
of serial cable, and each key press and release is a distinct
coded event that is mapped to arbitrary functions by the OS and
the applications.  The keyboard does not send ASCII; it sends
what are called "scan codes" which are arbitrarily mapped by
the operating system.

The BACKSPACE key does not "send" ^H, but rather a scan code
which identifies which physical key (number) was pressed.

On these systems, Emacs does not map the BACKSPACE key to ^H,
but rather to the function  delete-backward-char.

I believe the above information is indisputable fact.

You can also run Emacs in a terminal window (which is trying 
to simulate the old-fashioned way of doing things), and in
addition you can run Emacs over an ssh terminal connection.
My second claim is that even in those circumstances, this
BACKSPACE / ^H distinction is usually preserved.
My test cases for this latter claim included:

 1. puTTY (ssh) on Windows
 2. command-line terminal ("DOS box") on Windows
 3. Debian Linux, KDE, Konsole (local and ssh).
 4. Debian Linux, Gnome Terminal (local and ssh).
 5. Debian Linux, bare xterm (local and ssh).

All the remote connections were to a FreeBSD host.
The default X resources were used in each case.

That's obviously not all programs in the world, but I think
it's a representative sample.  So (obviously not counting
programs which you yourself have misconfigured) can you
please provide the details for the ones you tested that led
to your 50% survey number?

Finally, I claim that most computer users today do not know
that in ASCII, ^H was BS.  Indeed, I claim that most do not
even know what ASCII is at all.   (Most also don't even know
that ^M is CR, nor have they ever seen a "Return" key.)

Your thesis seems to be that people are getting confused
as to why character deletion doesn't happen when they
type ^H in Emacs.  I think it is you, not the keyboard
or any imaginary users, that is trying to "BS" here.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86r7ee74qo.fsf@drjekyll.mkbuelow.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

>On PCs (Windows, MacOS) and (Unix-based) workstations today, 
>the user interface is not a seperate ASCII keyboard terminal; 

xterm and putty are ASCII (actually vt102-ish) terminals.

Of course you haven't quoted the part of my posting where I spoke
about X events and noted that they are different from an ASCII
terminal, and you more or less tried to berate me by telling me that
very fact.

>My test cases for this latter claim included:
>
> 1. puTTY (ssh) on Windows
> 2. command-line terminal ("DOS box") on Windows
> 3. Debian Linux, KDE, Konsole (local and ssh).
> 4. Debian Linux, Gnome Terminal (local and ssh).
> 5. Debian Linux, bare xterm (local and ssh).

The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
hardly something obscure that can be observed only in special
situations.

>Finally, I claim that most computer users today do not know
>that in ASCII, ^H was BS.  Indeed, I claim that most do not
>even know what ASCII is at all.   (Most also don't know even 
>know that ^M is CR, not have they ever seen a "Return" key.)

Most computer users don't know what emacs is either.

mkb.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9i73yx.fsf@news.dtpq.com>
Matthias Buelow <···@incubus.de> writes:

> ······@news.dtpq.com (Christopher C. Stacy) writes:
> 
> >On PCs (Windows, MacOS) and (Unix-based) workstations today, 
> >the user interface is not a seperate ASCII keyboard terminal; 
> 
> xterm and putty are ASCII (actually vt102-ish) terminals.
> 
> Of course you haven't quoted the part of my posting where I spoke
> about X events and that they are different from an ASCII terminal, and
> you more or less tried to berate me by telling me that very fact.
> 
> >My test cases for this latter claim included:
> >
> > 1. puTTY (ssh) on Windows
> > 2. command-line terminal ("DOS box") on Windows
> > 3. Debian Linux, KDE, Konsole (local and ssh).
> > 4. Debian Linux, Gnome Terminal (local and ssh).
> > 5. Debian Linux, bare xterm (local and ssh).
> 
> The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
> hardly something obscure that can be observed only in special
> situations.

We're not talking about ^H/^?  - we're talking about what
Emacs does if you press BACKSPACE versus ^H.   I wrote
some things about this, which you quoted and upon which
you made the following direct, initial comment: "Not true...".

(If you feel that I have misrepresented your position,
take solace in the fact that the conversation is archived
in Google.  I always encourage people who want the context
and what was actually said to check there, since Google
preserves an archive of the conversation without any
potential editorial manipulation.)
From: Robert Uhl
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3u0jaw5vv.fsf@4dv.net>
······@news.dtpq.com (Christopher C. Stacy) writes:
>
> > The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
> > hardly something obscure that can be observed only in special
> > situations.
>
> We're not talking about ^H/^?  - we're talking about what Emacs does
> if you press BACKSPACE versus ^H.

Trust me, with the wrong terminal setting emacs pops up help every time
I hit backspace.  It's happened _many_ times to me, often when SSHing
with cygwin to Linux.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
I sat around during the design phase going `this is going to suck so
badly that we're going to have to hold onto desks to stop us from being
drawn into the vortex.'                              --Chris Saunderson
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87slyuw0ka.fsf@thalassa.informatimago.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:
> ······@news.dtpq.com (Christopher C. Stacy) writes:
>>
>> > The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
>> > hardly something obscure that can be observed only in special
>> > situations.
>>
>> We're not talking about ^H/^?  - we're talking about what Emacs does
>> if you press BACKSPACE versus ^H.
>
> Trust me, with the wrong terminal setting emacs pops up help every time
> I hit backspace.  It's happened _many_ times to me, often when SSHing
> with cygwin to Linux.

M-x normal-erase-is-backspace-mode RET





[ Now, if anybody could explain to me why it doesn't work automatically
  when I put this in my ~/.emacs:

    (case window-system
      ((nil)
       (message "houra")
       (normal-erase-is-backspace-mode)) ; never works :-(
      ((x)
       (define-key global-map [(delete)]    "\C-d")
       (make-face-bold 'bold-italic))
      ((mac)
       (setq mac-command-key-is-meta nil
             mac-reverse-ctrl-meta   nil)
       (set-keyboard-coding-system 'mac-roman)))

  :-( ]


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86d5pyhy41.fsf@drjekyll.mkbuelow.net>
Pascal Bourguignon <···@informatimago.com> writes:

>M-x normal-erase-is-backspace-mode RET

Hehe... that is _so_ typically Emacs[1]: a convoluted non-solution to
a trivial problem that needn't exist in the first place. I had to read
the description of the function three times and still don't quite know
what it does, and am not sure that it works as described. In fact, when
I enabled it, backspace stopped working.

Why not just:

(global-set-key "\C-h" 'delete-backward-char)  ; ^H deletes backwards
(global-set-key "\C-?" 'delete-backward-char)  ; and so does DEL

in .emacs (or .xemacs/init.el) and forget about it once and forever?

mkb.

[1] FSF.. doesn't seem to exist in Xemacs.
From: Robert Uhl
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3hdf9nwjc.fsf@4dv.net>
Pascal Bourguignon <···@informatimago.com> writes:
>
> > Trust me, with the wrong terminal setting emacs pops up help every
> > time I hit backspace.  It's happened _many_ times to me, often when
> > SSHing with cygwin to Linux.
>
> M-x normal-erase-is-backspace-mode RET

Yup--just goes to show that the OP was incorrect in his assertion that
these issues don't exist.

> [ Now, if anybody could explain me why it doesn't work automatically
>   when I put this in my ~/.emacs:

Check default.el.  Quite a piece of brain-dead engineering: it runs
_after_ .emacs and one's own configuration, and allows the sysadmin to
arbitrarily undo anything he wishes.  RedHat's notorious for putting
miscellaneous breakage in there.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
The average woman would rather have beauty than brains because the
average man can see better than he can think.
From: Sam Steingold
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ufyute1pl.fsf@gnu.org>
> * Robert Uhl <·········@ABFCNZtznvy.pbz> [2005-07-05 08:43:19 -0600]:
>
>
> Check default.el.  Quite a piece of bran-dead engineering: it runs
> _after_ .emacs and one's own configuration, and allows the sysadmin to
> arbitrarily undo anything he wishes.  RedHat's notorious for putting
> miscellaneous breakage in there.

inhibit-default-init's value is t

*Non-nil inhibits loading the `default' library.

You can customize this variable.

Defined in `startup'.


site-run-file's value is "site-start"

File containing site-wide run-time initializations.
This file is loaded at run-time before `~/.emacs'.  It contains inits
that need to be in place for the entire site, but which, due to their
higher incidence of change, don't make sense to load into Emacs's
dumped image.  Thus, the run-time load order is: 1. file described in
this variable, if non-nil; 2. `~/.emacs'; 3. `default.el'.

Don't use the `site-start.el' file for things some users may not like.
Put them in `default.el' instead, so that users can more easily
override them.  Users can prevent loading `default.el' with the `-q'
option or by setting `inhibit-default-init' in their own init files,
but inhibiting `site-start.el' requires `--no-site-file', which
is less convenient.

This variable is defined for customization so as to make
it visible in the relevant context.  However, actually customizing it
is not allowed, since it would not work anyway.  The only way to set
this variable usefully is to set it while building and dumping Emacs.

You can customize this variable.

Defined in `startup'.
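Reading the load order described above, the practical upshot for the
default.el problem is that a user can set the inhibiting variable in
~/.emacs itself.  A minimal sketch (the comments are mine, not part of
the quoted documentation):

```elisp
;; In ~/.emacs: prevent the site's default.el from loading after
;; this file, so it cannot undo any of the settings made here.
;; (site-start.el still loads earlier, before ~/.emacs.)
(setq inhibit-default-init t)
```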



-- 
Sam Steingold (http://www.podval.org/~sds) running w2k
<http://www.jihadwatch.org/> <http://www.memri.org/>
<http://www.iris.org.il> <http://pmw.org.il/> <http://www.palestinefacts.org/>
main(a){a="main(a){a=%c%s%c;printf(a,34,a,34);}";printf(a,34,a,34);}
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u4qb95ezk.fsf@news.dtpq.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:
> Trust me, with the wrong terminal setting emacs pops
> up help every time I hit backspace.  It's happened
> _many_ times to me, often when SSHing with cygwin to Linux.

As I said, it is certainly possible to misconfigure your
terminal program (or to use certain programs) in a way that
causes this problem.  However, as I illustrated by my test
results, to my surprise at the time, that doesn't happen by
default under normal configurations with the programs that
most people are using.

More importantly, it certainly doesn't happen when Emacs 
is running under the window system, which is what we were
really talking about.
From: Ivan Boldyrev
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <100sp2-f0c.ln1@ibhome.cgitftp.uiggm.nsc.ru>

On 9161 day of my life Pascal Bourguignon wrote:
> [ Now, if anybody could explain me why it doesn't work automatically
>   when I put this in my ~/.emacs:
>
>     (case window-system
>       ((nil)
>        (message "houra")
>        (normal-erase-is-backspace-mode)) ; never works :-(

Does at least (message "houra") work?
Try also to use (normal-erase-is-backspace-mode 1)

-- 
Ivan Boldyrev

                                      Life!  Don't talk to me about life.

From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87ekac1gxk.fsf@thalassa.informatimago.com>
Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:

> On 9161 day of my life Pascal Bourguignon wrote:
>> [ Now, if anybody could explain me why it doesn't work automatically
>>   when I put this in my ~/.emacs:
>>
>>     (case window-system
>>       ((nil)
>>        (message "houra")
>>        (normal-erase-is-backspace-mode)) ; never works :-(
>
> Does at least (message "houra") work?
> Try also to use (normal-erase-is-backspace-mode 1)

Yes, houra is logged.  And with:
  (setq inhibit-default-init t) ; and
  (normal-erase-is-backspace-mode 1) ; it still leaves
C-h bound to delete-backward-char, and
DEL translated to C-d bound to delete-char

It's been that way forever (20.7, 21.[123], 22.0.50.*).


(After I run manually M-x normal-erase-is-backspace-mode RET
C-h is prefix for help, and
DEL is bound to delete-backward-char, as expected).


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Litter box not here.
You must have moved it again.
I'll poop in the sink. 
From: Ivan Boldyrev
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <j09vp2-sph.ln1@ibhome.cgitftp.uiggm.nsc.ru>
On 9162 day of my life Pascal Bourguignon wrote:
>>> [ Now, if anybody could explain me why it doesn't work automatically
>>>   when I put this in my ~/.emacs:
>>>
>>>     (case window-system
>>>       ((nil)
>>>        (message "houra")
>>>        (normal-erase-is-backspace-mode)) ; never works :-(
>>
>> Does at least (message "houra") work?
>> Try also to use (normal-erase-is-backspace-mode 1)
>
> Yes, houra is logged.  And with:
>   (setq inhibit-default-init t) ; and
>   (normal-erase-is-backspace-mode 1) ; it still leaves
> C-h bound to delete-backward-char, and
> DEL translated to C-d bound to delete-char

What about

(add-hook 'emacs-startup-hook
  (lambda ()
     (case window-system
       ...)))

-- 
Ivan Boldyrev

                  Sorry my terrible English, my native language is Lisp!
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87d5puyc1s.fsf@thalassa.informatimago.com>
Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:

> On 9162 day of my life Pascal Bourguignon wrote:
>>>> [ Now, if anybody could explain me why it doesn't work automatically
>>>>   when I put this in my ~/.emacs:
>>>>
>>>>     (case window-system
>>>>       ((nil)
>>>>        (message "houra")
>>>>        (normal-erase-is-backspace-mode)) ; never works :-(
>>>
>>> Does at least (message "houra") work?
>>> Try also to use (normal-erase-is-backspace-mode 1)
>>
>> Yes, houra is logged.  And with:
>>   (setq inhibit-default-init t) ; and
>>   (normal-erase-is-backspace-mode 1) ; it still leaves
>> C-h bound to delete-backward-char, and
>> DEL translated to C-d bound to delete-char
>
> What about
>
> (add-hook 'emacs-startup-hook
>   (lambda ()
>      (case window-system
>        ...)))

(add-hook 'emacs-startup-hook
          (lambda ()
            (message "setting up the keyboard")
            (case window-system
              ((nil)
               (message "houra")
               (normal-erase-is-backspace-mode 1))
              ((x)
               (define-key global-map [(delete)]    "\C-d")
               (make-face-bold 'bold-italic))
              ((mac)
               (setq mac-command-key-is-meta nil
                     mac-reverse-ctrl-meta   nil)
               (set-keyboard-coding-system 'mac-roman)))))

I get:
    setting up the keyboard
    houra
in the *Messages* buffer, but C-h is still delete-backward-char and DEL delete-char.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

This is a signature virus.  Add me to your signature and help me to live
From: Ivan Boldyrev
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1f12q2-i2q.ln1@ibhome.cgitftp.uiggm.nsc.ru>
On 9163 day of my life Pascal Bourguignon wrote:
> (add-hook 'emacs-startup-hook
>           (lambda ()
>             (message "setting up the keyboard")
>             (case window-system
>               ((nil)
>                (message "houra")
>                (normal-erase-is-backspace-mode 1))
>
> I get:
>     setting up the keyboard
>     houra
> in the *Message*, but C-h is still delete-backward-char and DEL delete-char.

Then use after-init-hook or clean up your default.el :)

-- 
Ivan Boldyrev

                  Sorry my terrible English, my native language is Lisp!
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <871x69cwl1.fsf@thalassa.informatimago.com>
Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:

> On 9163 day of my life Pascal Bourguignon wrote:
>> (add-hook 'emacs-startup-hook
>>           (lambda ()
>>             (message "setting up the keyboard")
>>             (case window-system
>>               ((nil)
>>                (message "houra")
>>                (normal-erase-is-backspace-mode 1))
>>
>> I get:
>>     setting up the keyboard
>>     houra
>> in the *Message*, but C-h is still delete-backward-char and DEL delete-char.
>
> Then use after-init-hook or clean up you default.el :)


The only thing that my default.el files contain is this:

;;;BEGIN gcl addition
(autoload 'dbl "dbl" "Make a debugger to run lisp, maxima and or gdb in" t)
;;;END gcl addition

And I don't even have a default.el for 22.0.50...

Now, with:

(add-hook 'after-init-hook ; emacs-startup-hook
          (lambda ()
            (message "setting up the keyboard")
            (case window-system
              ((nil)
               (message "houra")
               (normal-erase-is-backspace-mode 1)
               (message "C-h = %S" (key-binding "\C-h"))
               (message "DEL = %S" (key-binding "\C-?")))
              ((x)
               (define-key global-map [(delete)]    "\C-d")
               (make-face-bold 'bold-italic))
              ((mac)
               (setq mac-command-key-is-meta nil
                     mac-reverse-ctrl-meta   nil)
               (set-keyboard-coding-system 'mac-roman)))))

the *Messages* buffer ends with:


Loading erc-ring (source)...done
Loading erc-nickserv (source)...done
Loading erc-track (source)...done
C-h = help-command         <-------
DEL = vm-scroll-backward   <-------
.EMACS DONE
setting up the keyboard
houra
C-h = help-command         <-------
DEL = delete-backward-char <-------
Loading jit-lock...done
For information about the GNU Project and its goals, type C-h C-p.


and with: M-x (progn (message "C-h = %S" (key-binding "\C-h")) (message "DEL = %S" (key-binding "\C-?"))) RET

It adds:

C-h = help-command
DEL = delete-backward-char


BUT C-h still deletes the backward char !!!

M-x describe-key RET C-h RET gives:

   DEL runs the command delete-backward-char
   ...

whatever the combination of ticks on the xterm options:
  Backarrow key (BS/DEL)
  Delete is DEL

C-q C-h inserts ^?


Note that as soon as I quit emacs, C-v C-h enters ^H as expected: the
terminal works correctly... 


Or as soon as I M-x normal-erase-is-backspace-mode RET manually, it
behaves as expected:

C-q C-h inserts ^H
and C-h gives the help prefix.


Perhaps the modes are reset even after after-init-hook?

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

This is a signature virus.  Add me to your signature and help me to live
From: Ivan Boldyrev
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <6t45q2-0pl.ln1@ibhome.cgitftp.uiggm.nsc.ru>
On 9164 day of my life Pascal Bourguignon wrote:
> Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:
> Now, with:
>
> (add-hook 'after-init-hook ; emacs-startup-hook

Hm...  It seems that normal-erase-is-backspace-mode doesn't just
rebind keys; it also tweaks terminal settings.  Then use this hook:

,----[ C-h v term-setup-hook RET ]
| term-setup-hook's value is nil
| 
| Documentation:
| Normal hook run after loading terminal-specific Lisp code.
| It also follows `emacs-startup-hook'.  This hook exists for users to set,
| so as to override the definitions made by the terminal-specific file.
| Emacs never sets this variable itself.
| 
| Defined in `startup'.
`----
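For example, a sketch of how that hook might be used for the problem
in this thread (untested on my part; the hook name and behavior are
taken from the documentation quoted above):

```elisp
;; In ~/.emacs: term-setup-hook runs after the terminal-specific
;; Lisp code (and after emacs-startup-hook), so the terminal setup
;; should no longer be able to override what is done here.
(add-hook 'term-setup-hook
          (lambda ()
            (unless window-system              ; only on a tty
              (normal-erase-is-backspace-mode 1))))
```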

-- 
Ivan Boldyrev

                                        | recursion, n:
                                        |       See recursion
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87slynly92.fsf@thalassa.informatimago.com>
Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:

> On 9164 day of my life Pascal Bourguignon wrote:
>> Ivan Boldyrev <···············@cgitftp.uiggm.nsc.ru> writes:
>> Now, with:
>>
>> (add-hook 'after-init-hook ; emacs-startup-hook
>
> Hm...  It seems that normal-erase-is-backspace-mode dosn't just
> rebind keys, it also tweak terminal settings.  Then use this hook:

Thank you all.  After all, it seems the best path is to avoid
normal-erase-is-backspace-mode altogether.  Without it, it works as I
want in the terminal.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Our enemies are innovative and resourceful, and so are we. They never
stop thinking about new ways to harm our country and our people, and
neither do we. -- Georges W. Bush
From: Klaus Harbo
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87br5hukyo.fsf@freebie.harbo.net>
Robert Uhl <·········@NOSPAMgmail.com> writes:

> ······@news.dtpq.com (Christopher C. Stacy) writes:
>>
>> > The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
>> > hardly something obscure that can be observed only in special
>> > situations.
>>
>> We're not talking about ^H/^?  - we're talking about what Emacs does
>> if you press BACKSPACE versus ^H.
>
> Trust me, with the wrong terminal setting emacs pops up help every time
> I hit backspace.  It's happened _many_ times to me, often when SSHing
> with cygwin to Linux.

Happens to me too, occasionally -- I normally do 'stty erase ^?' in
the terminal before starting emacs, sometimes in .bashrc or
equivalent.
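
A sketch of that arrangement as a ~/.bashrc fragment (the tty guard
is my addition, so that non-interactive shells are unaffected):

```shell
# Make the terminal's erase character DEL (^?) so that Backspace
# sends ^? and C-h stays free for Emacs help:
if [ -t 0 ]; then
    stty erase '^?'
fi
```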

> -- 
> Robert Uhl <http://public.xdi.org/=ruhl>
> I sat around during the design phase going `this is going to suck so
> badly that we're going to have to hold onto desks to stop us from being
> drawn into the vortex.'                              --Chris Saunderson

-Klaus.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uirzp11i2.fsf@news.dtpq.com>
Klaus Harbo <·····@harbo.net> writes:

> Robert Uhl <·········@NOSPAMgmail.com> writes:
> 
> > ······@news.dtpq.com (Christopher C. Stacy) writes:
> >>
> >> > The ^H/^? schism is perhaps the #1 FAQ for Unix usage ever. It's
> >> > hardly something obscure that can be observed only in special
> >> > situations.
> >>
> >> We're not talking about ^H/^?  - we're talking about what Emacs does
> >> if you press BACKSPACE versus ^H.
> >
> > Trust me, with the wrong terminal setting emacs pops up help every time
> > I hit backspace.  It's happened _many_ times to me, often when SSHing
> > with cygwin to Linux.
> 
> Happens to me too, occasionally -- I normally do 'stty erase ^?' 
> in the terminal before starting emacs, sometimes in .bashrc or
> equivalent.

Just curious, why aren't you using a window system?
From: Edi Weitz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ufyutvxb2.fsf@agharta.de>
On Tue, 05 Jul 2005 19:44:37 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:

> Klaus Harbo <·····@harbo.net> writes:
>
>> Happens to me too, occasionally -- I normally do 'stty erase ^?'
>> in the terminal before starting emacs, sometimes in .bashrc or
>> equivalent.
>
> Just curious, why aren't you using a window system?

I can't speak for Klaus but I can say that it's no fun to use X
over a slow connection.  Sometimes, when I'm not at home, I use ssh to
log into my server at home and read my email with 'emacs -nw' - I
wouldn't want to do that via X.

And, to provide another data point: Using the widely used commercial
SSH client "SecureCRT" with the default settings I'm thrown into
Emacs' help system if I press the Backspace key.  SecureCRT has an
option "Backspace sends Delete" as a workaround but you have to check
it manually, it's not the default.

Cheers,
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vp0zh3.fsf@news.dtpq.com>
Edi Weitz <········@agharta.de> writes:

> On Tue, 05 Jul 2005 19:44:37 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:
> 
> > Klaus Harbo <·····@harbo.net> writes:
> >
> >> Happens to me too, occasionally -- I normally do 'stty erase ^?'
> >> in the terminal before starting emacs, sometimes in .bashrc or
> >> equivalent.
> >
> > Just curious, why aren't you using a window system?
> 
> I can't speak for Klaus but I can say that it's not funny to use X
> over a slow connection.  Sometimes, when I'm not at home, I use ssh to
> log into my server at home and read my email with 'emacs -nw' - I
> wouldn't want to do that via X.

> And, to provide another data point: Using the widely used commercial
> SSH client "SecureCRT" with the default settings I'm thrown into
> Emacs' help system if I press the Backspace key.  SecureCRT has an
> option "Backspace sends Delete" as a workaround but you have to check
> it manually, it's not the default.

SecureCRT actually has a checkbox feature called "Emacs Mode",
which is supposed to make the ALT key behave as META.
(I'm not sure how well that works.)  But, if you check off
"Emacs Mode" in SecureCRT, does it also fix the BACKSPACE key?

I used to use SecureCRT years ago, but switched to PuTTY,
so I've forgotten all the details.
From: Edi Weitz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ud5pxuh0i.fsf@agharta.de>
On Tue, 05 Jul 2005 20:28:25 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:

> SecureCRT actually has a checkbox feature called "Emacs Mode", which
> is supposed to make the ALT key behave as META.  (I'm not sure how
> well that works.)  But, if you check off "Emacs Mode" in SecureCRT,
> does it also fix the BACKSPACE key?

No, these options seem to be independent of each other.  (At least in
my version of SecureCRT which is 4.1.8 and might not be current.)

Cheers,
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ur7edyngk.fsf@news.dtpq.com>
Edi Weitz <········@agharta.de> writes:

> On Tue, 05 Jul 2005 20:28:25 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:
> 
> > SecureCRT actually has a checkbox feature called "Emacs Mode", which
> > is supposed to make the ALT key behave as META.  (I'm not sure how
> > well that works.)  But, if you check off "Emacs Mode" in SecureCRT,
> > does it also fix the BACKSPACE key?
> 
> No, these options seem to be independent of each other.  (At least in
> my version of SecureCRT which is 4.1.8 and might not be current.)

Doesn't that seem like a bug?  Shouldn't something called "Emacs Mode"
configure the terminal program to behave properly with Emacs?
I'd send a bug report to Van Dyke.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86acl1rlx9.fsf@drjekyll.mkbuelow.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

>> No, these options seem to be independent of each other.  (At least in
>> my version of SecureCRT which is 4.1.8 and might not be current.)
>
>Doesn't that seem like a bug?  Shouldn't something called "Emacs Mode"
>configure the terminal program to behave properly with Emacs?
>I'd send a bug report to Van Dyke.

I would think the actual behaviour depends on the terminal emulation
setting. I don't have securecrt around but from its webpage it
supports a few: VT100, VT102, VT220, ANSI, SCO ANSI, Xterm, Wyse
50/60, and Linux console emulation with ANSI color. While on the DEC
terminals (vtxxx) backspace usually sends ^? (ascii delete), on ANSI
and IBM-style ones it's ^H (ascii backspace). So I guess you can
change that when switching the emulation mode (which hopefully makes
securecrt send the proper terminal type to the remote host, i.e. set
TERM.) Just unconditionally changing a setting will likely produce a
mismatch between what codes securecrt sends and what the remote host
has in its termcap.

mkb.
From: Edi Weitz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7jg5kksk.fsf@agharta.de>
On Tue, 05 Jul 2005 21:03:55 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:

> Doesn't that seem like a bug?  Shouldn't something called "Emacs
> Mode" configure the terminal program to behave properly with Emacs?
> I'd send a bug report to Van Dyke.

Hmm, they don't call it "Emacs Mode" but "Emacs Compatibility" and
from the description it is fairly clear that the /only/ thing the
switch does is to change the behaviour of ALT.  One could argue
whether they should perhaps provide more "Emacs Compatibility"
features, but the software works as described and doesn't warrant a
bug report IMHO.

Cheers,
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uvf3o539u.fsf@news.dtpq.com>
Edi Weitz <········@agharta.de> writes:

> On Tue, 05 Jul 2005 21:03:55 GMT, ······@news.dtpq.com (Christopher C. Stacy) wrote:
> 
> > Doesn't that seem like a bug?  Shouldn't something called "Emacs
> > Mode" configure the terminal program to behave properly with Emacs?
> > I'd send a bug report to Van Dyke.
> 
> Hmm, they don't call it "Emacs Mode" but "Emacs Compatibility" and
> from the description it is fairly clear that the /only/ thing the
> switch does is to change the behaviour of ALT.  One could argue if
> they should perhaps provide more "Emacs Compatibility" features but
> the software works as described and doesn't warrant a bug report IMHO.

Sure - "bug report", "feature request", however you want to say it.
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87hdf83dfn.fsf@thalassa.informatimago.com>
······@news.dtpq.com (Christopher C. Stacy) writes:
>> Happens to me too, occasionally -- I normally do 'stty erase ^?' 
>> in the terminal before starting emacs, sometimes in .bashrc or
>> equivalent.
>
> Just curious, why aren't you using a window system?

Don't be fooled by these Mb/s: they are ASYMMETRIC DSL, the upload
is still in Kb/s, and X on Kb/s is painful.


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

This is a signature virus.  Add me to your signature and help me to live
From: Klaus Harbo
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <877jg3vftg.fsf@freebie.harbo.net>
  ······@news.dtpq.com (Christopher C. Stacy) writes:

  > Klaus Harbo <·····@harbo.net> writes:
  >
  >> Robert Uhl <·········@NOSPAMgmail.com> writes:
  >> 
  ...
  >> Happens to me too, occasionally -- I normally do 'stty erase ^?' 
  >> in the terminal before starting emacs, sometimes in .bashrc or
  >> equivalent.
  >
  > Just curious, why aren't you using a window system?

I do, most of the time.  But sometimes it is not possible, or it is
faster or just more convenient to skip all the windowing stuff, in
which case I just invoke emacs inside the terminal.  My point was
merely that there are several ways to deal with the symptom of the
Backspace key invoking the help system in emacs.

-Klaus.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86vf3quy5b.fsf@drjekyll.mkbuelow.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

>We're not talking about ^H/^?  - we're talking about what
>Emacs does if you press BACKSPACE versus ^H.   I wrote
>some things about,  which you quoted and upon which
>you made the following direct, initial comment: "Not true...".

The "not true" was in response to your "... I realized that the ^H
problem was not real". Of course, you're right. I cannot say that it
is "not true" that you realized something, because that's something
only you can say. But from my experience, the implied statement that,
for all people the "^H problem is not real" is not true.

Apart from that, it isn't a problem for me. It's been quite some time
since I last had ^H bound to help on (x)emacs.

BTW, both emacs as well as xemacs seem by default to follow the CUA
"standard" of F1 for help. That's the closest thing to a "help" key
that one can get on PCs (where the keyboard doesn't have a help key),
and it has been recommended for ca. 20 years by IBM's CUA guidelines,
which are the basis of keybindings from old DOS pulldown-menu programs
to Microsoft Windows and OSF Motif. Now I wouldn't say that the F-keys
are generally unproblematic, but they are certainly less problematic
than C-h.

mkb.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ufyuu6x1w.fsf@news.dtpq.com>
Matthias Buelow <···@incubus.de> writes:

> BTW, both emacs as well as xemacs seem by default to follow the CUA
> "standard" of F1 for help. That's the closest thing to a "help" key
> that one can get on PCs (where the keyboard doesn't have a help key),
> and has been recommended for ca. 20 years by IBM's CUA guidelines,
> which are the basis of keybindings from old DOS pulldown-menu programs
> to Microsoft Windows and OSF Motif. Now I wouldn't say that the F-keys
> are generally unproblematic but certainly less problematic than C-h.

I don't have a problem with people preferring to type F1
rather than C-h (or any other key they like) for Help,
and it's nice that Emacs bound F1 for user compatibility.
But I still see no evidence of any "C-h problem" in Emacs.
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7ioe9j20ht.fsf@lanthane.pps.jussieu.fr>
Kent M Pitman <······@nhplace.com>:

>> Nah--there you're supposed to use backspace.

> Geez, is no one gonna post a "huh?" here.

No.  We knew you couldn't keep from explaining on your own ;-)

(And just in case you've got any doubts -- some of us youngsters *do*
find the details of obsolete environments ``riveting''.  Never mind
the surly comments of people who can't use a killfile.)

                                        Juliusz
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <w82dnZSDBb4A1VvfRVn-vQ@speakeasy.net>
Kent M Pitman  <······@nhplace.com> wrote:
+---------------
| ····@rpw3.org (Rob Warnock) writes:
| > I tend to prefer the Scheme-style default for the non-list arg case:
| >  (defmacro \\ (args &body body)
| >    `(lambda ,(if (listp args) args (list '&rest args)) ,@body))
| 
| Maybe you'd like
|   (defmacro \\ (args &body body)
|     `(lambda ,(do ((args args (cdr args))
|                    (temp '() (cons (car args) temp)))
|                   ((atom args)
|                (if (null args)      ; was it a proper list?
|                    (nreverse temp)  ; if so, just reverse saved args
|                    (nreconc temp    ; otherwise, reverse most and add tail:
|                             (list '&rest args)))))
|        ,@body))
| even better.
+---------------

Well, yes, if I were a Scheme purist, very probably.  ;-}

But once the arglist gets as big as (a b . rest) it's
not *that* much more trouble to go ahead and write out
(a b &rest rest). It's really "x" versus "(&rest x)"
that's the big convenience. IMHO. YMMV. Etc., &c.
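
For concreteness, here is what the two definitions give on the cases
under discussion (a sketch; the expansions are written out by hand
from the macros above):

```lisp
;; A bare symbol instead of a lambda-list: both versions
;; turn it into a &rest parameter.
(\\ args (length args))
;; expands to (lambda (&rest args) (length args))

;; A dotted lambda-list: only Kent's version handles this case,
;; turning the dotted tail into a &rest parameter.
(\\ (a b . rest) (list* a b rest))
;; expands to (lambda (a b &rest rest) (list* a b rest))
```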


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8C75C.E579%joswig@lisp.de>
Am 29.06.2005 21:29 Uhr schrieb "Kent M Pitman" unter <······@nhplace.com>
in ·············@nhplace.com:

> just gratuitous.  And when you say open a file "foo"
> and read its first line, then
>  (with-open-file (f "foo") (read-line f))
> seems about right. but
>  (call-with-open-file (lambda (f) (read-line f)) "foo")
> has an extra word, lambda, that isn't needed in the with-open-file scenario,
> and that extra word offers flexibility but in the common case just isn't
> needed. so i consider the with-open-file solution more optimal.

In some other functional languages this would look differently, more
aesthetically pleasing.

(call-with-open-file 'read-line "foo") would be fine for me.

The case where you need some anonymous function often makes me wish
Common Lisp's LAMBDA had a shorter footprint. I also see (not often)
people using the Greek lambda symbol in source code, or something
that looks like Smalltalk's block syntax.

> Joe probably knows this, but other readers might want to review my
> special forms paper for a study of what macros are used for (since in
> that paper, macros were defined as a subset of special forms, even
> though we have since shifted the terminology to make macros and
> special forms mean different things... sometimes disjoint, except...)
> http://www.nhplace.com/kent/Papers/Special-Forms.html
> 
> - - - - 
> but on the issue of subtlety of seemingly harmless syntax variations ...
> 
> I recall a specific example from the design of dylan, where they killed
> dylan with syntax. I don't even remember the exact symbol syntax they
> picked, but it was like #"foo" or something.  The problem is that a list
> looked like {#"how", #"are", #"you"}. You can say this is a minor deal,
> but compare it to lisp (how are you) and ask yourself: would Weizenbaum's
> eliza have been as compelling if it had caught people up in the issues of
> lisp syntax.  the whole thing that caught my eye when i first learned it
> was that it was like he had programs with english in them.
>   (cond ((equal sentence '(how are you?)) ...))
> just LOOKS LIKE code with english embedded.
>   if [sentence={#"how", #"are", #"you?"}]
> or whatever does NOT look like code with English embedded. it looks like
> code that is manipulating tokens that are based on english words, not
> english sentences wholesale. I think such distinctions matter in any
> decent theory of aesthetic.  Those #'s and whatnot scream out to the user
> "there is syntax here. go away. you don't know what you're doing"
> while the lisp notation screamed out "this is syntax free. get involved."
> that once mattered greatly to me as i got involved in lisp... i think it
> still matters.

Yes, that's something I think, too. It is often nice to read some
old-fashioned symbol-based code; it looks clear and concise.
MANY years ago I used a small Lisp implementation on the Apple II
that came with several examples like Eliza, all implemented with
the limited functionality it offered (no object system, no
structures, ...) and a tiny memory footprint (16 kbytes). It is the
same appeal you can get from some Forth systems, where the source
code is also very clear and pleasing.
 
> Anyone who says otherwise certainly cannot explain the rules of good
> typography, all of which hinge on much tinier aesthetic shifts than any
> I've mentioned here.
From: Patrick O'Donnell
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <rtoe9nacn3.fsf@ascent.com>
Kent M Pitman <······@nhplace.com> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
> > ...
>
> Among other things, it indents better.  That matters to me hugely.
> And it's less full of words that serve no purpose. 
> ...
> And when you say open a file "foo" > and read its first line, then
>  (with-open-file (f "foo") (read-line f))
> seems about right. but
>  (call-with-open-file (lambda (f) (read-line f)) "foo")

I'd like to interject that, although the appearance of "lambda"
bothers me just a little less than it seems to bother Kent, what's
huge for me is the locality of the "file" stuff and the "procedure"
stuff.  That is, using with-open-file, the binding variable and the
filename and options are all together, then there's the code to use
the open stream.  Using call-with-open-file, the actual filename and
options -- usually unremarkable -- are tucked away at the end of the
call and not in visual proximity to the portions of the form to which
they are proximal semantically.  Or, to put it another way, the visual
proximity is lacking between the portions of the form where the actual
filename and options are given, and they're usually unremarkable in
themselves, where they are at the end of the form, and the variable
name that it will be bound to, being separated by potentially large
amounts of code, when using call-with-open-file.

		- Pat
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEEA3651.E7A3%joswig@lisp.de>
Am 30.06.2005 21:05 Uhr schrieb "Patrick O'Donnell" unter <···@ascent.com>
in ··············@ascent.com:

> Kent M Pitman <······@nhplace.com> writes:
> 
>> Joe Marshall <···@ccs.neu.edu> writes:
>>> ...
>> 
>> Among other things, it indents better.  That matters to me hugely.
>> And it's less full of words that serve no purpose.
>> ...
>> And when you say open a file "foo" > and read its first line, then
>>  (with-open-file (f "foo") (read-line f))
>> seems about right. but
>>  (call-with-open-file (lambda (f) (read-line f)) "foo")
> 
> I'd like to interject that, although the appearance of "lambda"
> bothers me just a little less than it seems to bother Kent, what's
> huge for me is the locality of the "file" stuff and the "procedure"
> stuff.  That is, using with-open-file, the binding variable and the
> filename and options are all together, then there's the code to use
> the open stream.  Using call-with-open-file, the actual filename and
> options -- usually unremarkable -- are tucked away at the end of the
> call and not in visual proximity to the portions of the form to which
> they are proximal semantically.

In Scheme it is

(call-with-input-file "foo" read-line)

Looks fine to me...

> Or, to put it another way: with call-with-open-file, the actual
> filename and options -- usually unremarkable in themselves -- sit at
> the end of the form, separated by potentially large amounts of code
> from the variable name to which the stream will be bound, so the
> visual proximity between these semantically related parts is lost.
> 
> - Pat
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u4qbfbi83.fsf@nhplace.com>
Rainer Joswig <······@lisp.de> writes:

> Am 30.06.2005 21:05 Uhr schrieb "Patrick O'Donnell" unter <···@ascent.com>
> in ··············@ascent.com:
> 
> > Kent M Pitman <······@nhplace.com> writes:
> > 
> >> Joe Marshall <···@ccs.neu.edu> writes:
> >>> ...
> >> 
> >> Among other things, it indents better.  That matters to me hugely.
> >> And it's less full of words that serve no purpose.
> >> ...
> >> And when you say open a file "foo" > and read its first line, then
> >>  (with-open-file (f "foo") (read-line f))
> >> seems about right. but
> >>  (call-with-open-file (lambda (f) (read-line f)) "foo")
> > 
> > I'd like to interject that, although the appearance of "lambda"
> > bothers me just a little less than it seems to bother Kent, what's
> > huge for me is the locality of the "file" stuff and the "procedure"
> > stuff.  That is, using with-open-file, the binding variable and the
> > filename and options are all together, then there's the code to use
> > the open stream.  Using call-with-open-file, the actual filename and
> > options -- usually unremarkable -- are tucked away at the end of the
> > call and not in visual proximity to the portions of the form to which
> > they are proximal semantically.
> 
> In Scheme it is
> 
> (call-with-input-file "foo" read-line)
> 
> Looks fine to me...

However, PAO is right that you can't do the same with file options.
That is, your only options for a calling sequence are

 (call-with-input-file "foo" read-line
   'direction 'input 
   'element-type 'character
   'if-does-not-exist 'supersede)

assuming they were to do it keyword-style, which of course they wouldn't.
You can only put a 'rest list' or a 'keyword list' at the end.  If it was
a more complex function, it would be

 (call-with-open-file "foo"
    (lambda (stream)
      blah
      blah
      blah)
   'direction 'input
   'element-type 'character
   'if-does-not-exist 'supersede)

if you wanted to avoid consing.  Of course, you could do:

 (call-with-open-file "foo" (list 'direction 'input
                                  'element-type 'character
                                  'if-does-not-exist 'supersede)
    (lambda (stream) ...))


but all in all, I think the macrology solution of allowing

 (with-open-file (stream "foo" direction 'input
                               element-type 'character
                               if-does-not-exist 'supersede)
    ...)

shines here.  

[Moreover, this raises the issue that you might think keywords 
should evaluate.   In English they don't, and I've never heard anyone
complain.  (keywords = prepositions -- there is no such thing as a variable
preposition)]
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <64vvv3a0.fsf@comcast.net>
Kent M Pitman <······@nhplace.com> writes:

> However, PAO is right that you can't do the same with file options.
> That is, your only options for a calling sequence are
>
>  (call-with-input-file "foo" read-line
>    'direction 'input 
>    'element-type 'character
>    'if-does-not-exist 'supersede)
>
> assuming they were to do it keyword-style, which of course they wouldn't.
> You can only put a 'rest list' or a 'keyword list' at the end.  If it was
> a more complex function, it would be

When things get this complex, it's just as easy to make the receiving
procedure also be passed by keyword:

  (call-with-input-file "foo"
    :direction :input
    :element-type :character
    :if-does-not-exist :supersede
    :receiver (lambda (stream)
                  blah 
                  blah))

> [Moreover, this raises the issue that you might think keywords 
> should evaluate.   In English they don't, and I've never heard anyone
> complain.  (keywords = prepositions -- there is no such thing as a variable
> preposition)]

But in Lisp, they do.  It's just that they evaluate to themselves.  It
is rare that anyone takes advantage of this, though.
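
A small REPL sketch of that point:

```lisp
;; A keyword is a self-evaluating constant...
:direction                       ; => :DIRECTION
;; ...so it can be bound and passed around like any other value,
;; which is what makes APPLY-style argument construction work:
(let ((key :element-type))
  (list* 'open "foo" key '(character)))
;; => (OPEN "foo" :ELEMENT-TYPE CHARACTER)
```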



-- 
~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uzmt72uly.fsf@nhplace.com>
Joe Marshall <·············@comcast.net> writes:

> Kent M Pitman <······@nhplace.com> writes:
> [...]
> > [Moreover, this raises the issue that you might think keywords 
> > should evaluate.   In English they don't, and I've never heard anyone
> > complain.  (keywords = prepositions -- there is no such thing as a variable
> > preposition)]
> 
> But in Lisp, they do.  It's just that they evaluate to themselves.  It
> is rare that anyone takes advantage of this, though.

Well, that was my point.  It's useful sometimes when constructing an arglist
from multiple sources of info, as in:

 (apply #'open file :element-type my-type other-options)

but it's less useful in the almost nonsensical:

 (open file some-keyword-i-will-offer-later :input)

or even

 (open file some-keyword-i-will-offer-later some-value-i-will-add-later-too)
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEEA46F6.E7AE%joswig@lisp.de>
Am 01.07.2005 0:19 Uhr schrieb "Kent M Pitman" unter <······@nhplace.com> in
·············@nhplace.com:

> Rainer Joswig <······@lisp.de> writes:
> 
>> Am 30.06.2005 21:05 Uhr schrieb "Patrick O'Donnell" unter <···@ascent.com>
>> in ··············@ascent.com:
>> 
>>> Kent M Pitman <······@nhplace.com> writes:
>>> 
>>>> Joe Marshall <···@ccs.neu.edu> writes:
>>>>> ...
>>>> 
>>>> Among other things, it indents better.  That matters to me hugely.
>>>> And it's less full of words that serve no purpose.
>>>> ...
>>>> And when you say open a file "foo" > and read its first line, then
>>>>  (with-open-file (f "foo") (read-line f))
>>>> seems about right. but
>>>>  (call-with-open-file (lambda (f) (read-line f)) "foo")
>>> 
>>> I'd like to interject that, although the appearance of "lambda"
>>> bothers me just a little less than it seems to bother Kent, what's
>>> huge for me is the locality of the "file" stuff and the "procedure"
>>> stuff.  That is, using with-open-file, the binding variable and the
>>> filename and options are all together, then there's the code to use
>>> the open stream.  Using call-with-open-file, the actual filename and
>>> options -- usually unremarkable -- are tucked away at the end of the
>>> call and not in visual proximity to the portions of the form to which
>>> they are proximal semantically.
>> 
>> In Scheme it is
>> 
>> (call-with-input-file "foo" read-line)
>> 
>> Looks fine to me...
> 
> However, PAO is right that you can't do the same with file options.
> That is, your only options for a calling sequence are
> 
>  (call-with-input-file "foo" read-line
>    'direction 'input
>    'element-type 'character
>    'if-does-not-exist 'supersede)
> 
> assuming they were to do it keyword-style, which of course they wouldn't.
> You can only put a 'rest list' or a 'keyword list' at the end.  If it was
> a more complex function, it would be
> 
>  (call-with-open-file "foo"
>     (lambda (stream)
>       blah
>       blah
>       blah)
>    'direction 'input
>    'element-type 'character
>    'if-does-not-exist 'supersede)
> 
> if you wanted to avoid consing.  Of course, you could do:
> 
>  (call-with-open-file "foo" (list 'direction 'input
>                                   'element-type 'character
>                                   'if-does-not-exist 'supersede)
>     (lambda (stream) ...))

Sometimes I think an object should capture these options:

(call-with-open-file "foo" (make-instance 'open-file-options
                                          :direction :input
                                          :element-type :character
                                          :if-does-not-exist :supersede)
                           (lambda (stream) ...))

Or even:

(call-with-open-file "foo"
                     (make-instance 'open-file-options
                                    :direction :input
                                    :element-type :character
                                    :if-does-not-exist :supersede
                                    :reader (lambda (stream) ...)))


> 
> 
> but all in all, I think the macrology solution of allowing
> 
>  (with-open-file (stream "foo" direction 'input
>                                element-type 'character
>                                if-does-not-exist 'supersede)
>     ...)
> 
> shines here.

Which brings me to another favorite topic of mine: options.

If anybody writes a book about Lisp Style & Design, I would like to see a
chapter about options and how to use them in Common Lisp.

- I would like to see explicit arglists for checking, so no &rest stuff
- how does one maintain options that are used in several functions, where
  some functions add options and some filter options?
- how to create objects from options and back without writing too much,
  but getting good error checking
- how to sync the options of a macro, its corresponding function and the
  corresponding slots?
- when is it advisable to propagate options via special variables?
- when should you create objects for options?
- how to reduce option (arglists, slot lists, ...) duplication in source?

Just look at all the stuff CLIM provides/uses:

- you can draw with functions and they take a lot of options
- you can use something like (with-drawing-options (...) ...) around
  your drawing functions
- you can create graphic contexts which capture some options
- you can set the slots of the underlying streams
- you can bind some special variables
- and more...

How do you set up such an environment in your source, keep it consistent,
and evolve it without too much duplication?
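
One small pattern in that direction (a hypothetical sketch of my own,
not anything CLIM actually provides): keep options in a plist and let
each layer merge its defaults underneath the caller's choices:

```lisp
;; Hypothetical helper: caller-supplied options win over defaults.
;; (Note: GETF as a presence test conflates "absent" with "given as
;; NIL"; a real version would use GET-PROPERTIES or an indicator.)
(defun merge-options (options defaults)
  (let ((result (copy-list options)))
    (loop for (key value) on defaults by #'cddr
          unless (getf result key)
            do (setf (getf result key) value))
    result))

(merge-options '(:direction :input)
               '(:direction :output :element-type character))
;; => a plist with :DIRECTION :INPUT and :ELEMENT-TYPE CHARACTER
```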

> [Moreover, this raises the issue that you might think keywords
> should evaluate.   In English they don't, and I've never heard anyone
> complain.  (keywords = prepositions -- there is no such thing as a variable
> preposition)]
> 
From: Joerg Hoehle
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and  Rahul
Date: 
Message-ID: <uhdf9te0z.fsf@users.sourceforge.net>
Rainer Joswig <······@lisp.de> writes:
> Which brings me to another favorite topic of mine: options.
Excellent topic

> If anybody writes a book about Lisp Style & Design, I would like to see a
> chapter about options and how use them in Common Lisp.
[many subtopics elided]

- how to avoid options cascading?  I.e. a low-level function provides
  hundreds of them, so any higher-level function feels obliged to
  provide these as well and pass them along.
Or is this what you meant with
> - how to reduce option (arglists, slot lists, ...) duplication in source?

I've seen such, and it's always been a waste of resources (human and
cycles).  OTOH, I always felt annoyed when some low-level layer had
some option that you wished you could use but that was not accessible
from the higher-level API.


I basically agree with your bad feeling (and probably experience)
about options. I always wanted to run some tests by setting all the
*print-* and *read-* variables to untypical values and then watch
lots of packages fail miserably.

> - you can create graphic contexts which capture some options

I think this is the correct approach. Some FORMATTER-STREAM object
closes over a set of options and a stream and uses these for
printing. This is much better than those variables which affect all
streams within their lifetime... Argh!

It also shows a good decomposition of the problem space. Argument lists
get shorter as a result, and e.g. the open options are grouped together
and separated from the transformation options, instead of having
everything in one flat list of unordered keywords.
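A sketch of that idea in plain CL, assuming nothing beyond the standard:
the "formatter" here is just a closure (not an actual FORMATTER-STREAM
class) that captures print options at creation time and binds the
*print-* specials only around its own output:

```lisp
;; Sketch: capture print options in a closure so that no other stream
;; and no other code sees them -- unlike globally setting *PRINT-BASE*.
(defun make-formatter (stream &key (base 10) (radix nil))
  "Return a printer that writes to STREAM with the captured options."
  (lambda (object)
    (let ((*print-base* base)      ; bound only for the dynamic extent
          (*print-radix* radix))   ; of this one printing call
      (prin1 object stream))))
```

For example, (funcall (make-formatter s :base 16) 255) writes FF to s,
while printing on every other stream stays in base 10.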

Of course, people would call for slight modifications of the
formatter-stream object, which is so conveniently represented by just
touching one or two *print-* variables.  Well, that problem would not
exist with me, since I'm fond of single assignment (similar to linear)
style of programming. Set once only, best at definition time. But I
haven't managed to rule the world yet :-;


The separate problem of "I want to change that one option deep inside
there" is what AOP people try to address. But wisdom has not yet been
reached.

I regularly find myself fighting lack of lexicality. My latest example
is in CLISP, not closing over the *foreign-encoding* at foreign
function definition time for many cases where it would make sense.
Rule: special variables in libraries are an indication of bad design
(with abundant exceptions to the rule, of course). They will always
backfire at some point.

Regards,
	Jorg Hohle
Telekom/T-Systems Technology Center
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <OLydnc2h2I-ldVjfRVn-tA@speakeasy.net>
Kent M Pitman  <······@nhplace.com> wrote:
+---------------
| And it's less full of words that serve no purpose.  That is, the word
| LAMBDA is useful in a non-idiomatic place because it identifies an
| anonymous function.  But in an idiom, where I already know what's going
| on, it's just clutter.  In a sense, I'd like to write
|   (mapcar ((x) ..) ...)
| because I already know that LAMBDA is so common there that I could leave
| it out and a human would know what I meant ...
+---------------

Heh! Which is why my personal init file defines a #$ readmacro, just
so I don't have to type so much to the REPL when I'm trying out stuff:

    > (mapcar #$(cons $1 (+ 32 (* 9/5 $1))) (iota 11 0 10))

    ((0 . 32) (10 . 50) (20 . 68) (30 . 86) (40 . 104) (50 . 122)
     (60 . 140) (70 . 158) (80 . 176) (90 . 194) (100 . 212))
    > 

But don't worry, I never use #$ in permanent code...  ;-}
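A guess at what such a #$ read macro might look like (Rob's actual
definition isn't shown; the $1, $2, ... convention is inferred from the
example above):

```lisp
;; #$FORM reads as (lambda ($1 ... $n) FORM), where n is the largest
;; $n appearing anywhere in FORM.  Note that nested #$ forms don't
;; compose: the inner lambda's $1 shadows the outer one's.
(defun dollar-arg (x)
  "Return N if X is a symbol of the form $N, else NIL."
  (and (symbolp x)
       (let ((name (symbol-name x)))
         (and (> (length name) 1)
              (char= (char name 0) #\$)
              (every #'digit-char-p (subseq name 1))
              (parse-integer name :start 1)))))

(defun max-dollar-arg (form)
  "Largest N such that $N occurs anywhere in FORM, or 0."
  (cond ((dollar-arg form))
        ((consp form) (max (max-dollar-arg (car form))
                           (max-dollar-arg (cdr form))))
        (t 0)))

(set-dispatch-macro-character
 #\# #\$
 (lambda (stream subchar arg)
   (declare (ignore subchar arg))
   (let* ((body (read stream t nil t))
          (params (loop for i from 1 to (max-dollar-arg body)
                        collect (intern (format nil "$~D" i)))))
     `(lambda ,params
        (declare (ignorable ,@params))
        ,body))))
```

With that loaded, #$(cons $1 (+ 32 (* 9/5 $1))) reads as a one-argument
lambda, and #$(list $2 $1) as a two-argument one.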


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Antonio Menezes Leitao
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <878y0pxxbr.fsf@gia.ist.utl.pt>
····@rpw3.org (Rob Warnock) writes:

> Kent M Pitman  <······@nhplace.com> wrote:
> +---------------
> | And it's less full of words that serve no purpose.  That is, the word
> | LAMBDA is useful in a non-idiomatic place because it identifies an
> | anonymous function.  But in an idiom, where I already know what's going
> | on, it's just clutter.  In a sense, I'd like to write
> |   (mapcar ((x) ..) ...)
> | because I already know that LAMBDA is so common there that I could leave
> | it out and a human would know what I meant ...
> +---------------
>
> Heh! Which is why my personal init file defines a #$ readmacro, just
> so I don't have to type so much to the REPL when I'm trying out stuff:
>
>     > (mapcar #$(cons $1 (+ 32 (* 9/5 $1))) (iota 11 0 10))
>
>     ((0 . 32) (10 . 50) (20 . 68) (30 . 86) (40 . 104) (50 . 122)
>      (60 . 140) (70 . 158) (80 . 176) (90 . 194) (100 . 212))
>     > 
>
> But don't worry, I never use #$ in permanent code...  ;-}

But I'm starting to use something very similar in permanent code...

For simple one-arg lambdas, I use this:

(defmacro << (&body body)
  `#'(lambda (<<)
       (declare (ignorable <<))
       ,@body))

Your example becomes:

(mapcar (<< (cons << (+ 32 (* 9/5 <<)))) (iota 11 0 10))

The idea was that << means the flow of values.  Its main use is to
compose (and curry) functions when we don't need to attach a meaning
to the anonymous function parameter, as in

(mapcar (<< (f (g (h <<)))) ...)

The previous code, IMHO, looks better than

(mapcar (lambda (x) (f (g (h x)))) ...)

and better than

(mapcar (compose #'f #'g #'h) ...)

and even better when you want to include other arguments, as in:

(mapcar (<< (f a
               (g (h <<)
                  b)))
         ...)

In this last case, the compose equivalent looks uglier:

(mapcar (compose #'(lambda (x) (f a x)) 
                 #'(lambda (x) (g x b)) 
                 #'h) 
        ...)

Currying could be another option but it also doesn't look so good:

(mapcar (compose (curry #'f a) (rcurry #'g b) #'h) ...)

and it doesn't generalize to the case where we need to access several
other arguments (unless we use something similar to <<).
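COMPOSE, CURRY, and RCURRY used above are not standard CL; one common
definition looks roughly like this (a sketch -- details vary between
utility libraries):

```lisp
(defun compose (&rest fns)
  "Compose FNS right to left: ((compose f g) x) is (f (g x))."
  (if (null fns)
      #'identity
      (let ((rest (apply #'compose (cdr fns))))
        (lambda (&rest args)
          (funcall (first fns) (apply rest args))))))

(defun curry (fn &rest args)
  "Fix the leftmost arguments of FN."
  (lambda (&rest more) (apply fn (append args more))))

(defun rcurry (fn &rest args)
  "Fix the rightmost arguments of FN."
  (lambda (&rest more) (apply fn (append more args))))
```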

But I'm still fond of the good old lambdas, and I use them whenever I
need anonymous functions with zero parameters or more than one.

António Leitão.
From: Joerg Hoehle
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ud5pxtdah.fsf@users.sourceforge.net>
····@rpw3.org (Rob Warnock) writes:
> Heh! Which is why my personal init file defines a #$ readmacro, just
> so I don't have to type so much to the REPL when I'm trying out stuff:
>     > (mapcar #$(cons $1 (+ 32 (* 9/5 $1))) (iota 11 0 10))
> But don't worry, I never use #$ in permanent code...  ;-}

The Iterate package has exactly this readmacro, #L...
(mapcar #L(cons !1 (+ 32 (* 9/5 !1))) (iota 11 0 10))

Antonio Menezes Leitao wrote:
>(mapcar (<< (f (g (h <<)))) ...)

Sorry, I do not (yet) find this more readable than
(mapcar #L(f (g (h !1))) ...)

Incidentally, these remind me of Scheme's SRFI cut operator, which
annoyed me everytime I thought I would use it because that's exactly
the thing it cannot do:
(mapcar (cut f (g (h <>))) #);wrong, no nesting
http://srfi.schemers.org/srfi-26/


Note how I fear a problem with all these hacks (sorry): they do not
scale, because they don't compose.  What do you expect out of
#$(foo $1 #$(bar $1 $2))
#L(foo !1 #L(bar !1 !2))
(<< foo << (<< bar << <<))

Such observations have led me to withdraw from using such seemingly
cool (at first sight) "extensions".
Growing a language is what I'm looking for, and that's a hard job.

Regards,
	Joerg Hoehle
From: Antonio Menezes Leitao
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <877jg584uh.fsf@gia.ist.utl.pt>
Joerg Hoehle <······@users.sourceforge.net> writes:

> ····@rpw3.org (Rob Warnock) writes:
>> Heh! Which is why my personal init file defines a #$ readmacro, just
>> so I don't have to type so much to the REPL when I'm trying out stuff:
>>     > (mapcar #$(cons $1 (+ 32 (* 9/5 $1))) (iota 11 0 10))
>> But don't worry, I never use #$ in permanent code...  ;-}
>
> The Iterate package has exactly this readmacro, #L...
> (mapcar #L(cons !1 (+ 32 (* 9/5 !1))) (iota 11 0 10))
>
> Antonio Menezes Leitao wrote:
>>(mapcar (<< (f (g (h <<)))) ...)
>
> Sorry, I do not (yet) find this more readable than
> (mapcar #L(f (g (h !1))) ...)
>
> Incidentally, these remind me of Scheme's SRFI cut operator, which
> annoyed me everytime I thought I would use it because that's exactly
> the thing it cannot do:
> (mapcar (cut f (g (h <>))) #);wrong, no nesting
> http://srfi.schemers.org/srfi-26/
>
> Note how I fear a problem with all these hacks (sorry): they do not
> scale, because they don't compose.  What do you expect out of
> #$(foo $1 #$(bar $1 $2))
> #L(foo !1 #L(bar !1 !2))
> (<< foo << (<< bar << <<))

In the case of the << macro, I (obviously) expect a

(lambda (x) foo x (lambda (x) bar x x))

which is somewhat meaningless.  Maybe you want

(<< (foo << (<< (bar << <<))))

in which case it means an anonymous function that applies foo to its
argument and to another anonymous function that applies bar to its
(replicated) argument.  But it doesn't look pretty.  In this
particular case, I would prefer something with clearer semantics using
explicit lambdas.

> Such observations have led me to withdraw from using such seemingly
> cool (at first sight) "extensions".

As I mentioned in my previous post, I use the << macro just for
_simple_ one-arg lambdas.  I still use the good old lambdas whenever I
need anonymous functions with zero parameters or more than one, or
when I want to be clearer about the meaning of the anonymous function
parameter.

> Growing a language is what I'm looking for, and that's a hard job.

Indeed.

António Leitão.
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <UN6dnT82UpOJ9VbfRVn-qA@speakeasy.net>
Joerg Hoehle  <······@users.sourceforge.net> wrote:
+---------------
| ····@rpw3.org (Rob Warnock) writes:
| >   (mapcar #$(cons $1 (+ 32 (* 9/5 $1))) (iota 11 0 10))
| > But don't worry, I never use #$ in permanent code...  ;-}
| 
| The Iterate package has exactly this readmacro, #L...
| (mapcar #L(cons !1 (+ 32 (* 9/5 !1))) (iota 11 0 10))
+---------------

Neat!

+---------------
| Note how I fear a problem with all these hacks (sorry): they do not
| scale, because they don't compose.  What do you expect out of [example]...
| Such observations have led me to withdraw from using such seemingly
| cool (at first sight) "extensions".
+---------------

Which is exactly why I said "I never use #$ in permanent code".
My various "REPL conveniences" are just that, personal shortcuts,
*not* language proposals.

+---------------
| Growing a language is what I'm looking for, and that's a hard job.
+---------------

Indeed, as I discovered circa 1996 when I tried to write an infix dialect
of Scheme to avoid massive parenthephobia at a PPOE. Oh, sure, a simple
hybrid of Tcl & ML syntax styles [don't shudder, it wasn't *that* bad!],
really just a thin veneer of syntactic sugar over the Scheme REPL, worked
well enough to satisfy the users of the particular application in question
[a user-mode hardware debugger], and P'Lite [Parentheses-Lite Scheme,
pronounced "polite"] had several dozen happy users for a few years...

But there were always a few nasty corner cases that didn't seem "right"
to me, which made me feel uneasy about promoting the "language" to a
wider audience [which I had permission to do, by the way, so that wasn't
the reason].

Case in point: As in Tcl, a "word" by itself was a function call, e.g.:

    plite> display "a string" ; newline ; display "another string" ; newline
    a string
    another string
    plite> 

But like Scheme, Lisp, ML, and BLISS, it was also an expression language,
so the value of the last expression in a BEGIN block (PROGN) should be
returned as the value of the block:

    plite> let x = 3 in { set x = x + 1 ; x }
    4
    plite> 

But this leads to a problem [and Lisp-1 vs. Lisp-2 doesn't really help],
namely, what should one return when the last expression in a block is a
single symbol *and* the value of the symbol is a function [in a Lisp-1
such as MzScheme, or has both a function- and symbol-binding in a Lisp-2]?
E.g.:

    plite> let x = 3 in { display x ; newline }
    3
    ???
    plite> 

What should get printed where the question marks are?  #<void>, which
"newline" returns as a value?  Or #<primitive:newline>, the value of
(eval newline)?  [Note: Please ignore for a moment the fact that the
MzScheme REPL suppresses printing #<void> as a value -- that's an
irrelevant side-issue.]

Consistency with the Tcl-like imperative command style says that one
should call the function and return whatever the function returns.
Consistency with expression-language (Scheme/Lisp/ML/BLISS) style
says that one should return the symbol's variable binding as the
value. What to do? What to do?

For expedience and "least astonishment" with the particular set of
users at the time, I chose to implement the Tcl-like imperative command
style *iff* the symbol had a variable binding whose value was some sort
of function, e.g., "procedure?" (Scheme) or "fboundp" (CL). If script-
writing users wanted the other result, I told them to use "values {symbol}"
instead of just "{symbol}", which was an unambiguous way to request the
symbol's variable binding as the value.
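In CL terms, that dispatch rule might be sketched like this (P'Lite
itself is not available; EVAL-TRAILING-SYMBOL is an illustrative name,
and the sketch tests the value with FUNCTIONP rather than FBOUNDP):

```lisp
;; Sketch of the rule: when the last expression of a block is a bare
;; symbol, call its value iff that value is a function; otherwise
;; return the value itself.
(defun eval-trailing-symbol (sym)
  (let ((val (symbol-value sym)))
    (if (functionp val)
        (funcall val)   ; Tcl-like imperative command style
        val)))          ; expression-language style
```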

But almost a decade later, I'm still not sure that was necessarily
the "right" answer for an infix dialect of Scheme or CL intended for
a wider audience than the one I had at the time. [I'm open to comments
and/or criticism on this point.]

As you said, "Growing a language... [is] a hard job."


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <zmszjy2j.fsf@comcast.net>
····@rpw3.org (Rob Warnock) writes:

> Case in point: As in Tcl, a "word" by itself was a function call, e.g.:
>
>     plite> display "a string" ; newline ; display "another string" ; newline
>     a string
>     another string
>     plite> 
>
> But like Scheme, Lisp, ML, and BLISS, it was also an expression language,
> so the value of the last expression in a BEGIN block (PROGN) should be
> returned as the value of the block:
>
>     plite> let x = 3 in { set x = x + 1 ; x }
>     4
>     plite> 
>
> But this leads to a problem [and Lisp-1 vs. Lisp-2 doesn't really help],
> namely, what should one return when the last expression in a block is a
> single symbol *and* the value of the symbol is a function [in a Lisp-1
> such as MzScheme, or has both a function- and symbol-binding in a Lisp-2]?
> E.g.:
>
>     plite> let x = 3 in { display x ; newline }
>     3
>     ???
>     plite> 
>
> What should get printed where the question marks are?  #<void>, which
> "newline" returns as a value?  Or #<primitive:newline>, the value of
> (eval newline)?  [Note: Please ignore for a moment the fact that the
> MzScheme REPL suppresses printing #<void> as a value -- that's an
> irrelevant side-issue.]

REBOL has a similar problem.  The guy in charge (Carl Sassenrath) said
that you always call the function.  He used a special syntax (leading
colon) to indicate that you wanted to evaluate the name, but not call
if it was a function.

The *real* issue is when you write higher order procedures where some
variable `foo' might be bound to a procedure.  If you find that it
is, do you call it?  REBOL had the rule of `yes, you do'.  That was a
very bad choice.  Suppose you had a simple data abstraction (in
pseudo-rebol):

get-name: func [x] [car x]

Which would be

  (defun get-name (x) (car x))

in lisp.


Now suppose you called get-name on a function of no arguments.  That
function would be invoked as X was evaluated.

The only reasonable thing to do was to use the special quotation on
*every* variable just in case someone accidentally or maliciously
bound it to a procedure.

> Consistency with the Tcl-like imperative command style says that one
> should call the function and return whatever the function returns.
> Consistency with expression-language (Scheme/Lisp/ML/BLISS) style
> says that one should return the symbol's variable binding as the
> value. What to do? What to do?
>
> For expedience and "least astonishment" with the particular set of
> users at the time, I chose to implement the Tcl-like imperative command
> style *iff* the symbol had a variable binding whose value was some sort
> of function, e.g., "procedure?" (Scheme) or "fboundp" (CL). If script-
> writing users wanted the other result, I told them to use "values {symbol}"
> instead of just "{symbol}", which was an unambiguous way to request the
> symbol's variable binding as the value.
>
> But almost a decade later, I'm still not sure that was necessarily
> the "right" answer for an infix dialect of Scheme or CL intended for
> a wider audience than the one I had at the time. [I'm open to comments
> and/or criticism on this point.]

It looks like you chose an analogous option.  It works ok for small
scripts, but it can lead to *really* obscure errors in larger ones.
It is also a major security hole if you want to pass around
non-trusted function objects.

-- 
~jrm
From: Jens Axel Søgaard
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42cc5764$0$261$edfadb0f@dread12.news.tele.dk>
Joerg Hoehle wrote:

> Sorry, I do not (yet) find this more readable than
> (mapcar #L(f (g (h !1))) ...)
> 
> Incidentally, these remind me of Scheme's SRFI cut operator, which
> annoyed me everytime I thought I would use it because that's exactly
> the thing it cannot do:
> (mapcar (cut f (g (h <>))) #);wrong, no nesting
> http://srfi.schemers.org/srfi-26/

Yeah - I too was reminded of srfi 26, and when I tried
to use it for the example, I needed nested <>s too.

I think the reason is that it is cumbersome to write
a syntax-rules macro that handles nesting (after all,
most srfi implementations are written in pure R5RS).
It wouldn't be hard to use syntax-case to extend cut
to the nested case, though.

-- 
Jens Axel Søgaard
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87u0ivj6th.fsf@qrnik.zagroda>
Kent M Pitman <······@nhplace.com> writes:

> That is, the word LAMBDA is useful in a non-idiomatic place because
> it identifies an anonymous function. But in an idiom, where I
> already know what's going on, it's just clutter.

Lisp has an ugly syntax of an anonymous functions, and then macros are
used to hide that, instead of having a nice syntax in the first place.

For example, in my language it's ?params {body} or ?params => body
and I don't feel tempted to wrap higher order functions in macros
just because the body of their parameter is usually written inline.

I know you could write a reader macro for that. Besides reader macros
being unmodular, it's too late to undo habits of other programmers.
And MAPCAR should take the function as the last argument.

> There exists no formal formulation of expressiveness distinct from
> mere turing computability, but I think if there were, it would have
> to take into account, as at least one of probably several useful
> metrics, this notion of comparing numbers of tokens. Neither
> shortness nor fewer words is an absolute measure of good, but each
> is a potentially legitimate measure that bears study.

I agree. I'm not convinced that CL would have a very high score, however.
Higher than average, but it could be better; it usually looks too
verbose for my taste.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u8y05pime.fsf@nhplace.com>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > That is, the word LAMBDA is useful in a non-idiomatic place because
> > it identifies an anonymous function. But in an idiom, where I
> > already know what's going on, it's just clutter.
> 
> Lisp has an ugly syntax of an anonymous functions, and then macros are
> used to hide that, instead of having a nice syntax in the first place.
> 
> For example In my language it's ?params {body} or ?params => body
> and I don't feel tempted to wrap higher order functions in macros
> just because the body of their parameter is usually written inline.

Lisp traditionally avoids this kind of thing because of the difficulty of
tree-walking it. Perhaps you also make that less painful.  But my real
point is that goodness is not the presence or absence of one word.  It is
an ecology of terms and tools that work together.

LOOP provides things more like what you say but is notoriously harder
to tree-walk.

> I know you could write a reader macro for that. Besides reader macros
> being unmodular, it's too late to undo habits of other programmers.
> And MAPCAR should take the function as the last argument.

This is done exactly to satisfy &REST.  The complication, which I believe
to be a general problem with pattern matchers though I'm not familiar with
the literature on this in general, is that if you start doing 
multi-wildcard-matching in your syntax graph in more than one place, 
you can get clashes like foo* bar* where you don't know how to draw the
boundary.  Yes, you can write a parser to draw the boundary on the basis
of semantic information, but then again you have potentially complicated
tree-walking.  At minimum you need more operators than Lisp does.  So the
savings in one place is paid for in pain elsewhere.

> > There exists no formal formulation of expressiveness distinct from
> > mere turing computability, but I think if there were, it would have
> > to take into account, as at least one of probably several useful
> > metrics, this notion of comparing numbers of tokens. Neither
> > shortness nor fewer words is an absolute measure of good, but each
> > is a potentially legitimate measure that bears study.
> 
> I agree. I'm not convinced that CL would have a very high score however.
> Higher than average but it could be better, it usually looks too
> verbose for my taste.

The lowlevel is easily mapped over because the atomic tokens are the things
you want to test.  When you have to start doing

 (if (and (symbolp x)
          (> (length (symbol-name x)) 0)
          (char= (char (symbol-name x) 0) #\?))
     ...)

you're not only having to do lots of extra work, but people will often
decide not to do it (as they already mis-handle &keywords sometimes).
I wanted &keywords and :keywords combined, personally, and for them to
be a different datatype than symbols.  [Since that time, I've come to
think :keywords should be a specialization of string, being
effectively a canonical (eq-testable, uniquified) string, so that
:keywords would be the symbol-name of every symbol with the same name.
But I live with what we have.]

I don't agree with your assessment that the underlying system is ugly, btw.
That's a subjective judgment.  It's more cumbersome than some macros.
But not without reason.  And the reason satisfies me as having good purpose.
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87d5ph2jxz.fsf@qrnik.zagroda>
Kent M Pitman <······@nhplace.com> writes:

>> For example In my language it's ?params {body} or ?params => body
>> and I don't feel tempted to wrap higher order functions in macros
>> just because the body of their parameter is usually written inline.
>
> Lisp traditionally avoids this kind of thing because of the
> difficulty of tree-walking it.

It could expand to sexprs, like ' and #' (not like ` , ,@ whose
expansion is implementation-defined).

>> And MAPCAR should take the function as the last argument.
>
> This is done exactly to satisfy &REST.

The fault is in &rest which must be last, trading expressiveness for
efficiency.

> The complication, which I believe to be a general problem with
> pattern matchers though I'm not familiar with the literature on this
> in general, is that if you start doing multi-wildcard-matching in
> your syntax graph in more than one place, you can get clashes like
> foo* bar* where you don't know how to draw the boundary.

Right, but there is no conceptual problem with having a single
multi-wildcard pattern anywhere in a list of patterns.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: M Jared Finder
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <Kd2dnT3nbIFocEffRVn-qA@speakeasy.net>
Marcin 'Qrczak' Kowalczyk wrote:
> Kent M Pitman <······@nhplace.com> writes:
> 
> 
>>>For example In my language it's ?params {body} or ?params => body
>>>and I don't feel tempted to wrap higher order functions in macros
>>>just because the body of their parameter is usually written inline.
>>
>>Lisp traditionally avoids this kind of thing because of the
>>difficulty of tree-walking it.
> 
> It could expand to sexprs, like ' and #' (not like ` , ,@ whose
> expansion is implementation-defined).
> 
>>>And MAPCAR should take the function as the last argument.
>>
>>This is done exactly to satisfy &REST.
> 
> The fault is in &rest which must be last, trading expressiveness for
> efficiency.
> 
>>The complication, which I believe to be a general problem with
>>pattern matchers though I'm not familiar with the literature on this
>>in general, is that if you start doing multi-wildcard-matching in
>>your syntax graph in more than one place, you can get clashes like
>>foo* bar* where you don't know how to draw the boundary.
> 
> Right, but there is no conceptual problem with having a single
> multi-wildcard pattern anywhere in a list of patterns.

How would &rest in the middle interact with &optional?  In other words, 
given the function definition:

(defun pattern-matcher (&optional before &rest middle &optional after)
   (values before middle after))

What would (pattern-matcher 1) evaluate to?   How about (pattern-matcher 
1 2)?  You could define rules that would work, but I think it would be 
very confusing to use.

   -- MJF
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87oe90gz9n.fsf@qrnik.zagroda>
M Jared Finder <·····@hpalace.com> writes:

>> Right, but there is no conceptual problem with having a single
>> multi-wildcard pattern anywhere in a list of patterns.
>
> How would &rest in the middle interact with &optional?

Mixing &optional and &rest is a poor idea in itself, but it's easy to
design a semantics which is consistent with current rules and allows
&rest anywhere. The following constraints yield a unique solution:

- Arguments are bound to parameters in order, including &optional
  and &rest. A required parameter consumes one argument; an optional
  parameter consumes one argument or nothing; a rest parameter
  consumes any number of arguments.

- If any optional parameter doesn't consume an argument, then the
  following optional parameters don't either, and a rest parameter
  is nil.

This design doesn't require optionals to be adjacent, but the CL
syntax can only express adjacent optionals. A backward compatible
extension would be to add &required which turns off the effect of
&optional. &rest only applies to the following parameter.
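Those constraints can be simulated in CL for one concrete shape; here a
hypothetical lambda list (before &optional opt &rest middle &required
after), which is illegal in standard CL, with BIND-MIDDLE-REST as an
illustrative name rather than a proposal:

```lisp
(defun bind-middle-rest (args)
  "Return (values before opt middle after) for ARGS, following the
constraints above: parameters bind in order, the optional consumes an
argument only if one is available, and &rest takes what is left."
  (let ((n (length args)))
    (assert (>= n 2) (args) "both required parameters need arguments")
    (let* ((before (first args))
           (after  (car (last args)))
           ;; the optional consumes an argument only if one remains
           ;; beyond the two required parameters ...
           (opt    (when (>= n 3) (second args)))
           ;; ... and when it consumes nothing, the rest parameter
           ;; is nil as well
           (middle (when (>= n 3) (subseq args 2 (1- n)))))
      (values before opt middle after))))
```

So (bind-middle-rest '(1 2)) binds only the required parameters,
(bind-middle-rest '(1 2 3)) fills the optional, and any further
arguments go to the middle rest parameter.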

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7ioe8z6kji.fsf@lanthane.pps.jussieu.fr>
>> And MAPCAR should take the function as the last argument.

> This is done exactly to satisfy &REST.

It's also for consistency with APPLY.  Which itself is consistent with
plain function application.

For what it's worth, languages in which the mapping function only
takes one argument (ML, Haskell) still put the function in the first
position.  (But the reason might also be to make partial application
more convenient.)

                                        Juliusz
From: Hannah Schroeter
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <dbu1kg$v83$7@c3po.use.schlund.de>
Hello!

Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
>For what it's worth, languages in which the mapping function only
>takes one argument (ML, Haskell) still put the function in the first
>position.  (But the reason might also be to make partial application
>more convenient.)

Ditto for zipWith in Haskell (which would be equivalent to
mapcar with 2 list arguments). So it's
  zipWith function list1 list2
instead of
  zipWith' list1 function list2

Kind regards,

Hannah.
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121782266.167877.67180@g47g2000cwa.googlegroups.com>
I apologize for missing this at the time.  I just now
noticed it when it was quoted in a newer message.

Kent M Pitman wrote:
> There exists no formal formulation of expressiveness distinct from mere
> turing computability....

Untrue.  The first major results in this area were
Plotkin's proof that PCF could not express parallel
or (equivalently McCarthy's amb)---this work is better
known as Plotkin's work on full abstraction---and the
Patterson-Hewitt theorem, which showed that recursion
is more expressive than flowcharts (i.e. gotos).

Matthias Felleisen then showed that continuations are
more expressive than recursion.  (I had suggested this
as a conjecture while covering the Patterson-Hewitt
theorem in a guest lecture for Mitch Wand's semantics
course, which Felleisen was taking as a PhD student at
Indiana.)

Felleisen later showed that first class prompts are
more expressive than call-with-current-continuation.
There are several other important theorems of this
kind.

Will
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7jfmn4pl.fsf@nhplace.com>
"William D Clinger" <··········@verizon.net> writes:

> I apologize for missing this at the time.  I just now
> noticed it when it was quoted in a newer message.
> 
> Kent M Pitman wrote:
> > There exists no formal formulation of expressiveness distinct from mere
> > turing computability....
> 
> Untrue.  The first major results in this area were
> Plotkin's proof that PCF could not express parallel
> or (equivalently McCarthy's amb)---this work is better
> known as Plotkin's work on full abstraction---and the
> Patterson-Hewitt theorem, which showed that recursion
> is more expressive than flowcharts (i.e. gotos).

Thanks for pointers to this important and useful information, but what
I meant was different than that.  I meant a sort of theory of
aesthetic that explained why people find certain programs pretty and
others not, to include even issues of dispute such as whether mixed
case or uppercase or lowercase is better, to explain the choice of
when and whether to macrofy certain things, etc.

A proper theory of these would, for example, not only address
computability but would discuss the peculiarities of the optical
system (what kinds of things we can search for quickly visually), of
text retrieval and editing systems (what kinds of operations are
typically available), of memory systems (what things are hard and easy
to remember), of human reasoning (what kinds of problems our brains
are good and bad at solving, and what pattern classes of errors we
make that certain coding or even just syntactic styles can avoid or
encourage), etc.

For example, while recursion may be Mathematically Good (if there's
even a well-formed notion of Goodness or Betterness in Mathematics),
it doesn't automatically follow that the brain reasons in terms of
recursion; and even if it's possible to show that people take well
to recursion, that's not something that follows merely from proving
that it is mathematically elegant.  I think there is a separate proof
step required to show that it maps neatly onto brain structures.

It is often assumed, for example, that people are good at judgment and
computers at repetition.  (This may change with time, but probably
mostly with computers getting better at other things and people
staying still.  Sigh...)  But the presumed reason for this is that
people think about things in a way that is different than machines do.
So a great deal of the art of programming language design is not just
figuring out how to make things good to compile but how to make them
be expressed in a way that is conceptually compatible with an
architecture that one ought to at least (absent proof to the contrary)
presume is different in structure than the machine.

That doesn't mean these other areas you cite are uninteresting.  It
just means that I was talking about something different than that.

Still, I'm glad you misunderstood me since otherwise perhaps you would
have let the comment go and I and others wouldn't have gotten pointers
to this other work. ;)
From: Frode Vatvedt Fjeld
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <2hll42c0kn.fsf@vserver.cs.uit.no>
"William D Clinger" <··········@verizon.net> writes:

> Untrue.  The first major results in this area were Plotkin's proof
> that PCF could not express parallel or (equivalently McCarthy's
> amb)---this work is better known as Plotkin's work on full
> abstraction---and the Patterson-Hewitt theorem, which showed that
> recursion is more expressive than flowcharts (i.e. gotos).
>
> Matthias Felleisen then showed that continuations are more
> expressive than recursion.  (I had suggested this as a conjecture
> while covering the Patterson-Hewitt theorem in a guest lecture for
> Mitch Wand's semantics course, which Felleisen was taking as a PhD
> student at Indiana.) [..]

Would it be possible to explain in general terms what "expressive"
means in this context? My own somewhat vague understanding of this
term has been that once you're beyond the mere ability to do something
(which is what I understood Kent Pitman to refer to by
Turing-equivalence, which is the ability to do anything a typical
machine can do), expressiveness is a rather subjective quality.

-- 
Frode Vatvedt Fjeld
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121789523.732037.43970@g47g2000cwa.googlegroups.com>
> Would it be possible to explain in general terms what "expressive"
> means in this context? My own somewhat vague understanding of this
> term has been that once you're beyond the mere ability to do something
> (which is what I understood Kent Pitman to refer to by
> Turing-equivalence, which is the ability to do anything a typical
> machine can do), expressiveness is a rather subjective quality.
>
> --
> Frode Vatvedt Fjeld

You have to state the rules carefully.  For example,
any program that can be written using first class
continuations can also be written using recursion
and higher order functions, by writing the entire
program in continuation-passing style.
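To make the CPS idea concrete, here is a small hypothetical sketch (in
Python rather than Scheme, but the idea is language-independent): every
function takes an extra continuation argument k and hands its result to
k instead of returning it.  The factorial example is invented, not from
the thread.

```python
def fact(n):
    # direct style: the result is simply returned
    return 1 if n == 0 else n * fact(n - 1)

def fact_cps(n, k):
    # continuation-passing style: the extra argument k receives the
    # result; note that every call site had to be rewritten, which is
    # what makes the transformation non-local.
    if n == 0:
        return k(1)
    return fact_cps(n - 1, lambda r: k(n * r))

result = fact_cps(5, lambda r: r)   # pass the identity continuation
```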

That transformation is horribly non-local, however,
so it isn't practical.  The expressiveness game
depends on finding a way to define what we mean
by a local transformation.

There are several ways to do that.  I'm going to
do it the way Patterson and Hewitt did it, and I'll
use their theorem for my examples.  Consider the
following program (in pure Scheme, without side
effects):

(define (pgm x)
  (if (d x)
      (i x)
      (pgm (b x))))

Here is an equivalent flowchart program:

pgm:
  if (d x) goto rtn else goto rpt
rtn:
  x := (i x)
  stop
rpt:
  x := (b x)
  goto pgm

Even though you don't know the interpretation of
the function symbols d, i, and b, you can see that
these two programs are equivalent.  That's the key
idea: we say that two programs are equivalent if
and only if they compute the same results for all
possible interpretations of their free variables.

Furthermore we regard all of the standard library
functions as free variables also, so you can't
cheat by using library functions to simulate things
that aren't in the core language.
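Both versions can be run side by side; the following hypothetical
Python rendering (the names d, i, and b follow the post) takes the
interpretation of the free symbols as parameters, so any interpretation
can be plugged in.

```python
def pgm_recursive(d, i, b, x):
    # (define (pgm x) (if (d x) (i x) (pgm (b x))))
    if d(x):
        return i(x)
    return pgm_recursive(d, i, b, b(x))

def pgm_flowchart(d, i, b, x):
    # the flowchart version: test, assign, goto, expressed as a loop
    while not d(x):
        x = b(x)        # rpt:  x := (b x); goto pgm
    return i(x)         # rtn:  x := (i x); stop

# one arbitrary interpretation of the free symbols
d = lambda x: x <= 0
i = lambda x: -x
b = lambda x: x - 3
```

Equivalence means the two agree under every interpretation of d, i,
and b; running them under one interpretation can of course only fail to
refute that.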

Now consider the following recursive program:

(define (cpy x)
  (if (d x)
      (i x)
      (c (cpy (a x))
         (cpy (b x)))))

Patterson and Hewitt showed that no flowchart
program is equivalent to the above for all possible
interpretations of the function symbols d, i, a, b,
and c.

I'm not going to give a proof, but I'll give you the
idea of the proof.  Suppose there is some fixed flowchart
program P equivalent to the above for all interpretations
of the function symbols, and consider the free (Herbrand)
interpretation over the terms constructed from the
function symbols and some atomic constant; in other
words, think of a, b, and c as CAR, CDR, and CONS.
Then the flowchart program P looks like an automaton.
It isn't a finite state automaton, because its variables
can store arbitrarily complex terms, but the predicates
that P uses to determine its control flow can only look
at a finite portion of its state.  That means that, for
the free interpretation, the control flow of P is based
on a finite state machine.  There are infinitely many
possible values for x in the free interpretation, so
there exists a pair of distinct inputs x1 and x2 that
must go through exactly the same sequence of steps in
the flowchart program, which means the flowchart program
produces exactly the same output for both x1 and x2.

But the original recursive program produces different
outputs for x1 and x2 when d is interpreted as the
predicate that recognizes the atomic constant and i
is interpreted as the identity function.  (Indeed, with
that interpretation, the original program performs a
deep list copy.)  Therefore the flowchart program cannot
be equivalent to the original program under all
interpretations of its free variables.  QED

At this point, you're probably saying this can't be,
that you can simulate recursion using a stack and a
finite state machine.  That's true.  The problem is,
your flowchart program doesn't have access to a stack
data type, because we took away all of its library
functions when we insisted on arbitrary interpretations
of the free variables.
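That simulation is easy to write down once a stack is available; here
is a hypothetical Python sketch of cpy in both its recursive form and
its flowchart-plus-explicit-stack form.  (The pair interpretation at
the bottom is just one interpretation among the arbitrarily many the
theorem quantifies over; it is there only so the code can be run.)

```python
def cpy(d, i, a, b, c, x):
    # (define (cpy x) (if (d x) (i x) (c (cpy (a x)) (cpy (b x)))))
    if d(x):
        return i(x)
    return c(cpy(d, i, a, b, c, a(x)),
             cpy(d, i, a, b, c, b(x)))

def cpy_stack(d, i, a, b, c, x):
    # a finite-state control loop plus an explicit stack; the point of
    # the theorem is that the stack is an extra data type the pure
    # flowchart model is not given.
    done, todo = [], [("go", x)]
    while todo:
        tag, v = todo.pop()
        if tag == "go":
            if d(v):
                done.append(i(v))
            else:
                todo.append(("combine", None))  # runs after both halves
                todo.append(("go", b(v)))
                todo.append(("go", a(v)))
        else:
            right = done.pop()
            left = done.pop()
            done.append(c(left, right))
    return done[0]

# one interpretation: nested pairs, under which cpy is a deep copy
d = lambda v: not isinstance(v, tuple)
i = lambda v: v
a = lambda v: v[0]
b = lambda v: v[1]
c = lambda l, r: (l, r)
```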

The other half of the Patterson-Hewitt theorem involves
showing that every flowchart program is equivalent to a
recursive program.  (That's an undergraduate exercise.)
Put these two results together, and you have the result
that recursion is strictly more expressive than flowcharts
in this very precise formal sense.

Will
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121796550.850179.79850@g47g2000cwa.googlegroups.com>
Sorry for responding to myself...I just want to correct
a mistake in my presentation.

I wrote:
> ....and consider the free (Herbrand)
> interpretation over the terms constructed from the
> function symbols and some atomic constant; in other
> words, think of a, b, and c as CAR, CDR, and CONS.

Untrue.  In the free interpretation, c corresponds to
CONS all right, but a and b correspond to tagged variations
of CONS.  For example, (a (c x y)) is not equal to x or
to y or to (c x y) or to (c (c x y)) or to (b (c x y))
in the free interpretation; it is equal only to itself.
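A hypothetical sketch of what "equal only to itself" means: model the
free interpretation by letting every function symbol build a tagged
term, so no algebraic identities hold at all.

```python
def symbol(name):
    # each function symbol just wraps its arguments in a tagged tuple;
    # nothing ever simplifies, terms are equal only when syntactically
    # identical
    return lambda *args: (name,) + args

a, b, c = symbol("a"), symbol("b"), symbol("c")
x, y = "x", "y"

t = a(c(x, y))   # the term (a (c x y))
```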

I also wrote:

> (Indeed, with that interpretation, the original program
> performs a deep list copy.)

That is wrong also, for the same reason.

Both of these incorrect remarks were side comments that
had nothing to do with the idea of the correct proof.
I apologize for not thinking them through.

Will
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <irz5t41w.fsf@comcast.net>
"William D Clinger" <··········@verizon.net> writes:

> Sorry for responding to myself...I just want to correct
> a mistake in my presentation.
>
> I wrote:
>> ....and consider the free (Herbrand)
>> interpretation over the terms constructed from the
>> function symbols and some atomic constant; in other
>> words, think of a, b, and c as CAR, CDR, and CONS.
>
> Untrue.  

Will, you're setting bad precedent.  Usenet is not the place for
mea culpas or corrections.

> I also wrote:
>
>> (Indeed, with that interpretation, the original program
>> performs a deep list copy.)
>
> That is wrong also, for the same reason.
>
> Both of these incorrect remarks were side comments that
> had nothing to do with the idea of the correct proof.
> I apologize for not thinking them through.

Apologies are right out.  What were you thinking?

-- 
~jrm
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121908177.418510.231800@f14g2000cwb.googlegroups.com>
Joe Marshall wrote:
> Will, you're setting bad precedent.

Oh no, not again!

> Usenet is not the place for mea culpas or corrections.

Mea culpa.

> Apologies are right out.  What were you thinking?

Sorry, I just wasn't thinking.

Will
From: Marcus Breiing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <c7zarbmvigx85@breiing.com>
* William D Clinger

> [...] and you have the result that recursion is strictly more
> expressive than flowcharts in this very precise formal sense.

Thanks for the explanation.  Although... I'm not quite sure I
understand how setting things up in such a way that "iteration"
implies bounded working memory while "recursion" requires unbounded
memory proves much about relative power of the _control structures_ as
such -- as I usually understand claims of that kind to include an
implied "ceteris paribus".

So, unless I didn't correctly get your sketch of this proof, isn't it
slightly misleading to state that "recursion is strictly more
expressive than flowcharts" when a substantial part of that claim
really is that you can compute strictly more with unlimited memory
than with limited memory?

Marcus
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121804460.477651.84110@z14g2000cwz.googlegroups.com>
Marcus Breiing wrote:
> So, unless I didn't correctly get your sketch of this proof, isn't it
> slightly misleading to state that "recursion is strictly more
> expressive than flowcharts" when a substantial part of that claim
> really is that you can compute strictly more with unlimited memory
> than with limited memory?

Neither the theorem nor its proof mention memory at all.

In fact, one way of looking at the theorem is that it
shows flowchart programs cannot take full advantage of
unlimited memory even when unlimited memory is offered
to them.  This has more to do with abstract data types
than with memory: think of each function symbol as a type,
and think of each call to a function symbol f with arguments
x1, ..., xn as allocating an object of type f that
encapsulates objects x1, ..., xn.  Because the function
symbols are uninterpreted, you can't count on any
identities that would allow you to substitute a different
term for f(x1, ..., xn).  That is why equivalence of
programs implies essentially the same compositions of
function symbols---and no others, because you can't
count on the behavior of any composition of function
symbols that doesn't come up in the computation you're
trying to simulate.

Your intuition correctly identifies recursion with
implicit allocation of memory, but that is not the main
thing that's going on here.  If you focus on allocation
of memory, you probably won't understand why first class
continuations are more expressive than recursion, or why
first class prompts are more expressive than continuations,
or why PCF plus McCarthy's amb is more expressive than PCF
alone.

Will
From: Marcus Breiing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <mcqz6q19rgxzp@breiing.com>
* William D Clinger

> Your intuition correctly identifies recursion with implicit
> allocation of memory, but that is not the main thing that's going on
> here.

If it isn't the main thing, then taking unbounded memory away from the
recursive program shouldn't change the result.

So introduce an arbitrary but fixed maximal recursion depth D for the
recursive program.  Let's say the program will "stop-with-error" if D
is exceeded, and add an equivalent "stop-with-error" instruction to
the flowchart model.

Does the result still hold?  If it doesn't, and since what we changed
was to take away implicit allocation of unbounded storage from the
recursive model, would it be absurd to say that this _was_ the main
thing that's going on here?

> If you focus on allocation of memory, you probably won't understand
> why first class continuations are more expressive than recursion, or
> why first class prompts are more expressive than continuations, or
> why PCF plus McCarthy's amb is more expressive than PCF alone.

Yes, quite possibly... and that would be because memory might not have
anything to do with what's claimed for those cases. :-) And I admit I
was really mainly thinking of the "gotos vs recursion" claim, because
that's the claim that didn't resonate with the context of the
discussion, which I took to be expressiveness of general programming
languages.

But the model of computation chosen to represent "flowcharts (ie
gotos)" in the sketched proof is not a general (turing-equivalent)
model of computation, which of course explains the "counter-intuitive"
result.

So to sum it up, I think it is confusing to use that result in the
context of a discussion about general programming languages.  But
looking back, it may be that that context was lost when you answered
to Kent.

Marcus
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121862867.525866.176190@g44g2000cwa.googlegroups.com>
Marcus Breiing wrote:
> But the model of computation chosen to represent "flowcharts (ie
> gotos)" in the sketched proof is not a general (turing-equivalent)
> model of computation, which of course explains the "counter-intuitive"
> result.

You're wrong about that.  Flowcharts are Turing-complete
if you give them a standard library consisting of the
successor function and tests for equality.

First-order recursive programs are Turing-complete with
the same standard library.

Neither flowcharts nor first-order recursive programs
are Turing-complete if you don't let them have any
library functions.

So the comparison is a fair one, and your impression
to the contrary appears to have rested upon a false
premise.

Will
From: Marcus Breiing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <o8kbmz53qens0@breiing.com>
* William D Clinger

> So the comparison is a fair one

Fair is fine. What I still don't know is whether you want to say that
by fairly comparing the computational power of these two "sub-Turing"
models you simultaneously made a point about relative expressive power
of Turing-complete programming languages, or whether that's just
something I hallucinated into your postings, based on context.

Marcus
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121870090.821796.124410@g43g2000cwa.googlegroups.com>
Marcus Breiing wrote:
> Fair is fine. What I still don't know is whether you want to say that
> by fairly comparing the computational power of these two "sub-Turing"
> models you simultaneously made a point about relative expressive power
> of Turing-complete programming languages, or whether that's just
> something I hallucinated into your postings, based on context.

You weren't hallucinating.  As I tried to explain in my first
post on this subject, you have to define the rules carefully
in order to play the expressiveness game.  Perhaps I should
have gone on to say that the results are often subtle, and
easy to misinterpret.  For example, there are many people who
work under the delusion that "expressive" is always a good
thing.

As for why these results are relevant to real languages, and
why they matter:  Most modern languages are indeed parameterized
by their libraries, and the facilities they provide for composing
those libraries are important.  The results I have cited really
do get at the relative expressive power of these compositional
facilities, and that really does matter when a programmer is
performing local transformations of the kind that are routinely
performed during the extended life of software.

The previous paragraph is mere opinion, of course.  I could try
to support it, but I won't.  It's fine with me if you don't
believe it.

Will
From: [Invalid-From-Line]
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <dbltlo$het$1@news.sap-ag.de>
"William D Clinger" <··········@verizon.net> wrote in message
·····························@g43g2000cwa.googlegroups.com...
> As for why these results are relevant to real languages, and
> why they matter:  Most modern languages are indeed parameterized
> by their libraries, and the facilities they provide for composing
> those libraries are important.  The results I have cited really
> do get at the relative expressive power of these compositional
> facilities, and that really does matter when a programmer is
> performing local transformations of the kind that are routinely
> performed during the extended life of software.

Are there any results for Glasgow Haskell in comparison to 'normal'
imperative languages?

i.e.

Does lazy buy you extra expressivity?

The combination of laziness and purity?

Type classes?

Rene.
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121877857.134678.144860@g47g2000cwa.googlegroups.com>
··············@hotmail.com asked:
> Are there any results for Glasgow Haskell in comparision to 'normal'
> imperative languages?
>
> i.e.
>
> Does lazy buy you extra expressivity?

That's an interesting one.  Expressiveness is a partial order,
not a total order.  It turns out that call-by-value and
call-by-name are incomparable: each can express things the
other cannot.  See

    Matthias Felleisen.  On the Expressive Power of Programming
    Languages.  Science of Computer Programming, 1991.  Online at
    http://www.ccs.neu.edu/scheme/pubs/scp91-felleisen.ps.gz
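One half of that incomparability is familiar in practice: call-by-name
can be simulated under call-by-value only by thunking every argument, a
global rewrite of all call sites.  A hypothetical Python sketch:

```python
def if_then_else(cond, then_thunk, else_thunk):
    # arguments arrive as zero-argument functions (thunks) and are
    # forced only if taken; under eager evaluation, passing 1 // 0
    # directly would raise before the choice was ever made.
    return then_thunk() if cond else else_thunk()

value = if_then_else(True, lambda: 42, lambda: 1 // 0)
```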

> Type classes?

I don't know any results about that, but I'm not an expert on
this stuff.

Will
From: ···········@gmail.com
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121901091.660839.43710@g47g2000cwa.googlegroups.com>
Sorry to pose several questions at once, but I am catching up on the
thread:
Isn't the difference between lazy and eager evaluation orthogonal to
that between calling by name versus by value?

Also, what are prompts?  I have found a number of papers on them, but
they were all on acm.org, and I am not a member.  Is there a relatively
comprehensible proof for the expressivity of continuations vs.
recursion, or vs. prompts?
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1121905281.474644.315150@g49g2000cwa.googlegroups.com>
···········@gmail.com wrote:
> Sorry to pose several questions at once, but I am catching up on the
> thread:
> Isn't the difference between lazy and eager evaluation orthogonal to
> that between calling by name versus by value?

No.  I wouldn't say they're the same, but they certainly aren't
orthogonal.  If you want to follow up on this, I'd suggest the
comp.lang.functional newsgroup, which has more people with
stronger and more informed opinions on that subject than you'd
find here.

> Also, what are prompts?  I have found a number of papers on them, but
> they were all on acm.org, and I am not a member.  Is there a relatively
> comprehensible proof for the expressivity of continuations vs.
> recursion, or vs. prompts?

I don't remember whether the following paper contains that
proof, but it will surely provide some background and pointers
to other relevant papers:

Matthias Felleisen and Robert Hieb.
The revised report on the syntactic theories of sequential control
and state.  Theoretical Computer Science, 1992.
http://www.ccs.neu.edu/scheme/pubs/tcs92-fh.ps.gz

You might also use Google Scholar to search for "delimited
continuations", that being the more modern and fashionable
name for first class prompts.  There are dozens of papers
on that subject, which (alluding to part of the subject line)
is closely related to the hilarious brouhaha we enjoyed some
time ago in comp.lang.lisp with regard to the interactions
between continuations and dynamic-wind-like thingies.

Will
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uvf36hhfa.fsf@nhplace.com>
Frode Vatvedt Fjeld <······@cs.uit.no> writes:

> "William D Clinger" <··········@verizon.net> writes:
> 
> > Untrue.  The first major results in this area were Plotkin's proof
> > that PCF could not express parallel or (equivalently McCarthy's
> > amb)---this work is better known as Plotkin's work on full
> > abstraction---and the Patterson-Hewitt theorem, which showed that
> > recursion is more expressive than flowcharts (i.e. gotos).
> >
> > Matthias Felleisen then showed that continuations are more
> > expressive than recursion.  (I had suggested this as a conjecture
> > while covering the Patterson-Hewitt theorem in a guest lecture for
> > Mitch Wand's semantics course, which Felleisen was taking as a PhD
> > student at Indiana.) [..]
> 
> Would it be possible to explain in general terms what "expressive"
> means in this context? My own somewhat vague understanding of this
> term has been that once you're beyond the mere ability to do something
> (which is what I understood Kent Pitman to refer to by
> Turing-equivalence, which is the ability to do anything a typical
> machine can do), expressiveness is a rather subjective quality.

Right.  To make my point a different way:

Why not teach all programming in assembly language?  You can teach
continuations and then you can tell people how to construct patterns
of assembly language into continuations.  Or literal Turing machines
full of write a zero, write a 1, branch if zero, move the tape, etc.
instructions.

The answer is not simply computability.  There is a sense of
quantifiable speed that comes with using higher order languages which
is always assumed, but rarely documented.  People believe that the
choice of computer language matters, and if they did not believe this,
they would not keep moving from one language to another.  Pascal would
have been enough.  Why move to Java?  C would have been enough, why
C++?  Lisp 1.5 enough, why INTERLISP or Symbolics Lisp or CL?

Some of this can be explained by subroutine libraries and some by
operations like call/cc that are uniquely available in this or that
language.

But I think most people believe it's more than that, they simply
haven't quantified what the cost/benefit trade-off is of choosing
infix over lispy notation, of choosing typed over untyped, of
abstracting vs exposing implementations, of large languages vs
small languages, etc.  The meta-design points that peg how you'll 
make the small decisions that make up the fabric of a language.

We worry greatly about the cost of marshalling/unmarshalling data in
and out of programs--I'm told that's one of the things that made CORBA
catch on less than some had anticipated.  Marshalling and unmarshalling
data into the brain seems of equal consequence; it's just hard to
measure what's going on, and so hard to know how to fix.  Also,
structurally, we do this at design time, not at run time, and we pay
less attention to that cost as a culture of programmers (even though
it's of great concern to the culture of "businesses" and even somewhat
to those of us who have a lot of "things we want to think about before
we die" and want to pick a good set of tools for enabling as much
"extended thought" as possible...)
From: Marcus Breiing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <usgas6pmfgpst@breiing.com>
* Frode Vatvedt Fjeld

> Would it be possible to explain in general terms what "expressive"
> means in this context? My own somewhat vague understanding of this
> term has been that once you're beyond the mere ability to do
> something (which is what I understood Kent Pitman to refer to by
> Turing-equivalence, which is the ability to do anything a typical
> machine can do), expressiveness is a rather subjective quality.

Hmm. Haven't read any of it, but my hunch is that a major difference
is in cardinality of working memory sizes in the computing models
under consideration.

Marcus
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7izmshcxr2.fsf@lanthane.pps.jussieu.fr>
>> There exists no formal formulation of expressiveness distinct from mere
>> turing computability....

"William D Clinger" <··········@verizon.net>:

> Untrue.  The first major results in this area were Plotkin's proof
> that PCF could not express parallel or [...]

If anyone wants a more intuitive grasp of what William means: if your
language doesn't provide threads, non-blocking I/O, poll, select,
guarded sums or something similar, there is no way you can wait for an
input on one of two channels.
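A hypothetical Python sketch of exactly that situation, waiting on
whichever of two pipes produces data first via the OS-level select()
primitive (POSIX assumed):

```python
import os
import select

r1, w1 = os.pipe()
r2, w2 = os.pipe()

os.write(w2, b"hello")          # data shows up on channel 2 only

# Block until at least one of the two read ends is readable; without
# select (or threads, poll, guarded sums, ...) a plain blocking read
# on r1 would hang forever here.
ready, _, _ = select.select([r1, r2], [], [])
msg = os.read(ready[0], 5)      # read from whichever channel is ready
```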

                                        Juliusz
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uekalasik.fsf@news.dtpq.com>
Kent M Pitman <······@nhplace.com> writes:

> Rainer Joswig <······@lisp.de> writes:
> 
> > Why wasn't it possible to migrate older code to the new dialects?
> 
> What makes you think it wasn't? ;)
> 
> I think in a lot of cases, it was copyright that kept it from happening.
> 
> MACSYMA was, for example, quickly upgraded for porting.  It was written
> in Maclisp/Franz/Zetalisp and I ported the bulk of it (100,000 lines of
> code) to Common Lisp in about 3 months.  So there's one data point about
> code durability.

And using the tools that you developed for that effort,
I ported a large application (I forget, but probably at
least 40 KLOC) one weekend.   (That is, I started hacking
on Friday after people went home, and on Monday morning the 
system was back _in production_, in Common Lisp.)
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-667312.19195829062005@news-europe.giganews.com>
In article <·············@nhplace.com>,
 Kent M Pitman <······@nhplace.com> wrote:

Thanks for your very interesting answer. I'm not going to go much into
detail but want to follow up on some points. Obviously I'm a non-native
speaker, so it is a bit more effort to craft an understandable
answer. ;-) And I wanted to have some training today, too. ;-)

> Rainer Joswig <······@lisp.de> writes:
> 
> > Why wasn't it possible to migrate older code to the new dialects?
> 
> What makes you think it wasn't? ;)

Well, it was possible, but I don't think the results were always
satisfying.

The examples I'm thinking of were DW to CLIM,
Statice to Portable Statice, and some others.

> I think in a lot of cases, it was copyright that kept it from happening.
> 
> MACSYMA was, for example, quickly upgraded for porting.  It was written
> in Maclisp/Franz/Zetalisp and I ported the bulk of it (100,000 lines of
> code) to Common Lisp in about 3 months.  So there's one data point about
> code durability.
> 
> But in other cases, there simply wasn't a will for it to happen.  The
> issue was not technical.  David Moon told me at one point that Zmacs
> couldn't run in Common Lisp because of problems with array-leaders and
> special instance variables.  In my "free time" (not a sponsored
> project, just something I squeezed in between my other
> responsibilities, so it couldn't have been a huge effort) in my last
> few months at symbolics (1992), I managed to use macrology to hide
> these issues and to port Zmacs successfully to Common Lisp (in that
> case, to Symbolics Common Lisp, not raw Common Lisp, but that
> difference is minor compared to the gap between Zetalisp+Flavors and
> Common Lisp with CLOS, I think) and had it running internally at
> Symbolics. [My project, incidentally, was called TRES, for TRES
> Replaces Eine's Successor... you always have to have a good name, and
> I needed a proper title for the series that began with Eine (Eine Is
> Not Emacs) and Zwei (Zwei Was Eine Initially).]  It still used TV
> windows, and needed to use CLIM instead, but I considered that a
> separate task.  I'm confident the task could have been completed.  Why
> wasn't it?  Symbolics was then focused on the VLM (Virtual Lisp
> Machine, the emulator product rolled out later on the DEC Alpha) and
> didn't care any more about other things.  I considered my project a
> "backup" in case the VLM didn't work and they needed to toss the
> hardware, but of course I didn't realize I'd be laid off and that my
> project would get lost in the dust.  

There are also not enough Kent Pitmans. Unfortunately.

> Nevertheless, these two projects lead me to believe that porting of
> large amounts of the Lisp Machine was largely an issue of bad business
> planning by Symbolics more than a technical artifact.  Experience with
> the non-acceptance of CLOE at Symbolics (a project to take a native Lisp,
> originally Robertson Common Lisp, RCL, developed by Paul Robertson and
> to morph it into a native Lisp on the 386 running Symbolics-like 
> functionality) showed up the problem.  The project had pretty good success
> given the resources thrown at it (minimal), but was not pushed very hard
> by Symbolics because at the time it was thought by the controlling forces
> there to be The Wrong Thing and mostly heretical.  (Incidentally, CLOE was
> another brainchild of Howard Cannon, who once again showed that he had
> a very good theory of what was needed in Lisp, this time not just 
> architecturally but business-wise.)  CLOE was picked up by Macsyma Inc.,
> which bought MACSYMA as well, and rolled it out on native PC to
> escape the demise of Symbolics and to continue on with technology.
> 
> At this level, we're talking pretty coarse data.  A few anecdotes, not
> a widespread phenomenon, so my observations are perhaps less
> systematic.  But here my impression is that what was happening
> systematically was a lack of will by Symbolics to invest in trying these
> things.  There was a certain hubris that made people believe that only the
> hardware was enabling all this magic, and there were some important assists
> from the hardware, but ultimately the code was not dependent on the hardware.
> And at every step where it was tried, macrology hid the porting issues
> and enabled platform-hopping in a natural way.  
> 
> So, respectfully, I dispute your apparent assumption that it was not
> possible, and I suggest instead that what you infer as lack of possibility
> might be equally well explained by business complications.

Might it also be the case that the management level had reasons not to
follow this route, based on experience with past projects? Too costly,
too late, too complicated, ...? Just asking.

> > If that happened it always looked like a major effort or even
> > a reimplementation. Were macros part of the problem?
> 
> I think they'd have been part of the solution if it had been tried.
> 
> The parts that are often hardest, if you examine the code, are the
> function call parts, not the macros.  Rewriting SEND is way harder
> than rewriting DEFFLAVOR.  DEFFLAVOR can be rewritten once as a macro
> to expand differently and it fixes all the uses of DEFFLAVOR.  Just to
> take an example: SEND, by contrast, computes its args oddly and
> arguably inefficiently; if you rewrite just the function, you still
> have the problem that at the source end, you're computing the second
> arg (at runtime) as a keyword, and that (again at runtime) you must
> re-convert it to a generic function name.  That's more painful, all in
> all, than dealing with the macrology because by their nature, macros are
> resolvable at compile time (if our friends in the "runtime macros" thread
> going on now at c.l.l. don't accidentally get their way - heh) and
> functions are not.
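
To make the contrast concrete, here is a rough sketch (hypothetical
names and simplifications, not actual Symbolics code): the macro can be
re-targeted once at its definition site, while the function must
translate its runtime keyword argument on every call, at every call site.

```lisp
;; Sketch only: hypothetical port shims, not actual Symbolics code.

;; Macro case: redefine DEFFLAVOR once and every use site picks up the
;; new expansion at its next compile.
(defmacro defflavor (name slots mixins &rest options)
  (declare (ignore options))  ; real flavor options would need real handling
  `(defclass ,name ,mixins
     ,(mapcar (lambda (slot)
                `(,slot :initarg ,(intern (symbol-name slot) :keyword)))
              slots)))

;; Function case: SEND must map a runtime keyword to a generic-function
;; name on every call; rewriting the function fixes nothing at the
;; source end, where the keyword is still computed at runtime.
(defun send (object message &rest args)
  (let ((gf (find-symbol (symbol-name message))))
    (apply (fdefinition gf) object args)))
```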
> 
> > If I look how the software looks like when you see at them now
> > (not from within the surroundings of a live software project
> > developing this stuff), I have the feeling that there was too
> > much hackery at work and that the code was not enough written
> > to be ported, to be understandable (by non-original authors),
> > to be debuggable and so on.
> 
> Orthogonally to the issue of macrology per se, there was also a belief
> by some people (for some reason I attribute it, correctly or not, to
> Mike McMahon, but I think it was more widespread) that if you followed
> all the abstractions, you'd lose efficiency.  I personally felt that 
> efficiency lost by abstractions should be gained back in other ways and
> that code should be kept clean for longevity's sake.  But there are 
> certainly abstraction violations in the name of "speed bums" all over
> the place, and those are inhibitors to easy transformation in some cases.
> All systems probably suffer from this to some degree, but all the more so
> if it's part of the design strategy.  You may be somewhat up against
> that as you examine things.  I think you're right that it wasn't written
> to be ported, but I don't know why it couldn't have been.  I don't think
> it was essential.  I came across some of this in the Zmacs->Tres port, 
> and just added abstractions where I saw fit.

I think you need extremely good developers to be able to do that.
I can imagine, for example, that it would be easy for you
to let the documentation substrate render its data on a web
stream. But I feel uncomfortable looking at the code or
even seeing things in the debugger. At runtime, in the
debugger, I don't find enough of what I see in the source code.

> I once had a discussion with Dave Moon where he alleged to me that
> complex systems were inherently messy and couldn't be coded cleanly.
> I alleged otherwise.  The problem here is that complex systems are
> hard to write and who has the ability to go test this claim?  Dave had
> written many of them, and they were often messy internally here and there.
> But he is a prolific creator of robust large systems.  I haven't created
> the systems he has, and so I can't say he's wrong.  But I simply believe
> he is.  New Flavors had all sorts of messiness in it, and I had occasion
> to re-implement it for CLOE and it didn't come out nearly as messy. That's
> just one datapoint, and my implementation was not as comprehensive as his,
> nor did it have all the error-checking bells and whistles since we were
> using it to port already-debugged code, but it supported correct code
> efficiently and with clean abstractions.  Does that disprove Dave's claim?
> No.  But it gives at least a little credence to the idea that I might
> not just be some annoyed loser who's jealous of Moon's ability to create
> cool systems and looking for ways to nitpick him.  (I also don't know
> if Moon still holds to these beliefs either--it's been quite a while since
> we talked.)
> 
> > For me macros are violating principles like late-binding. They
> > are creating static structures that are no longer evolvable. They
> > make too much happen behind the scenes in ways that are non-obvious.
> 
> I understand what you're saying here, but consider that the static
> structure they create is late-bound.  That is, the bulk of macrology
> is often in the use... most macros are not like LOOP or DEFCLASS that
> has a whole huge library supporting it.  And even for those that are,
> a lot of the work they're doing is lexical analysis (portable stuff)
> not implementation, so even that is probably more robust than you're
> giving it credit for.
> 
> What I allege most macros do is solve what I'll call the "CORBA problem".
> They shield you from idiotic little idioms where every program that wants
> to make an object first makes an object factory, then asks it to crank out
> a singleton object of a generic type, then tests that it succeeds, then
> finally casts it to something that the program can use.  That kind of 
> idiomatic stuff, in line in regular code, is a disaster for programming
> intelligibility except by idiom recognition.  It's like in old Maclisp 
> having to know that (APPEND X '()) meant what we now call (COPY-LIST X),
> only worse because it's strung over separate "statements" and hence it's
> something where a naive programmer, not realizing that three statements
> are working in lockstep, might interpose another statement between them, making
> the thing harder to read.  At least there's less likelihood that someone 
> will turn (APPEND X '()) into (APPEND X (PROGN (PRINT Y) '())) when they
> can instead write (APPEND X '()) (PRINT Y) so mostly that kind of idiom
> doesn't get wrecked.  But macros serve the accidental purpose of making
> abstract operations atomic, and easier to manage.
> 
> And in the typical case, I think the bulk of macrology is in the use, not
> the definition, so macros also tighten code and move a complex set of
> operations to a central spot, collapsing lots of 
>  (LET ((X NIL))
>    (UNWIND-PROTECT (PROGN (SETQ X (OPEN ...))  ... (CLOSE X) (SETQ X NIL))
>       (IF X (CLOSE X))))
> kinds of things to a single instance of that in one place that is easily
> updated when a port occurs, rather than leaving large numbers of them to
> update.

I agree that this on the code-writing side helps a lot. But
if you don't leave something recognizable at runtime (something
you can inspect and identify in a debugger), I have a problem
with that. The single macro is innocent, but if you cascade
macros, have macros inside macros, macro generating macros,
the code that runs is so completely different from what you
write that debugging it will be a pain. You can work against
it, but you have to make some active design choices about how to
achieve that. You have to do something to get debuggability
back.

>  So I suggest the opposite of your claim is true.  "Early binding"
> would be coding this stuff inline.  Sure, you can turn this into
>  (CALL-WITH-OPEN-FILE #'(LAMBDA ...) file)
> and then you've hidden the portability, but you've still left cumbersome
> syntax that I can't for the life of me find aesthetic.

If it gets me other things I can live with it. Personally
I don't find it worthwhile to write WITH-OPEN-FILE instead. The
readability I get from that is almost zero.

> If there is one 
> thing I don't understand in the Scheme community, it's the passion for
> tolerating this junk.  We used to write tons of it in the late 1970's before
> there were macros, but instantly outgrew it when we came up with the idea
> of adding:
>  (DEFMACRO WITH-OPEN-FILE ((name file) &body forms)
>    `(CALL-WITH-OPEN-FILE #'(lambda (,name) ,@forms) ,file))
> ... and then you're back to macrology, this time not hiding implementation
> but hiding syntax tedium.  And what difference does it make at that point
> whether it's early or late binding?  One doesn't worry when discussing 
> early/late binding in other languages like C or C++ or Pascal that the 
> syntax of assignment might change from '=' to ':=' or vice versa.
> Why should I worry about the syntax of a macro that uses only
> tools that are well understood?
> 
> So maybe lumping all macrology into one discussion is bad because there
> are several distinct activities (and associated risks) going on here.
> Good style in macrology reduces the risk of "change" to as small a place
> as possible.  But always the risk is there, macro or function.  And
> surely it's better to have syntax abstraction than not?
> 
> > Take for example an interface builder type of application. If you
> > represent the interface objects as first class objects you can
> > modify the interface with an interface builder, since you have
> > an explicit representation of it and you can work on top of that.
> 
> The interface representations that Symbolics Frame-Up used were much easier
> to manipulate than the interface representations that I see Visual
> Cafe or Qt or other such tools using these days.

Probably one of the best is Apple's Interface Builder, which
is pretty cool. It is way cooler than Frame-Up and does
a lot more. It also reaches into the implementation in a
way Frame-Up never did. For me Frame-Up is nothing more
than a glorified structure editor for the frame-defining
macro definition, and it would be hard to imagine how to
make it more dynamic, incremental, useful, ... given the
way it works.

>  How could
> it possibly be easier for a programming system to interpret imperative
> commands than declarative commands, since the former has an associated
> halting problem in the worst case and the latter can be restricted to
> a subset that is reliably syntactically analyzable?
> 
> > Now take DW or CLIM. One of the main drivers for the interface is the
> > frame definition macro.
> 
> Ah, now we get to the meat of the matter.

It was just an example to illustrate what I mean.

> I'm going to guess ahead of time, before reading further, that the answer  
> is going to be "it is possible to do bad design in any system, whether
> macro based or not".   Where by "good" and "bad" I mean here "suitable to
> the needs of your project", not "God will take you into heaven for doing
> good or strike you down for doing bad, independent of context".

Sure, my theory was that macros make it easy to end up with bad design,
since there isn't enough documented lore about how to use them
in non-trivial ways.

> > It has several drawbacks:
> > 
> > - it does have extremely poor checking of the input and does not provide
> >   you with useful ideas why a certain description produces results
> >   that are not desirable (layout problems, description grammar violations,
> >   ...).
> 
> And a functional interface to the same kind of magic that CLIM tries to do
> would fix that?  I don't think so.  It would just make it harder to analyze
> the connectivity.

I want some explicit machinery to describe and check the structure of the
input. Macros allow you to use some of this machinery to
process that structure - unfortunately there is no useful
mechanism for adding domain-related error messages to
the code in a good way. If I want a mini-language I should not
use the macro call interface to implement it.

> But what is getting you the trouble is not how you assemble the connectivity,
> it is the fact that CLIM's job is to take input in one shape and to give 
> output in another shape, as if by magic, since part of its design (its
> "charm" if you will) is that it is allowed a free hand in how to do this
> in many cases.

A layout description in a frame description should be properly
checked. Checked by a domain-specific method that gives sensible
answers. I never found a CLIM implementation where you really
get an idea why a layout description of some complexity
didn't 'work'. It was always guessing, looking at the documentation,
and finding it implemented differently.

Just generating code for the good case is often a problem
with macros. They often tend not to check their input
enough and they often don't give useful debugging/diagnostic
messages at expansion time. Implementors just implement
the obvious expansion and think they are done.

> But surely the problem is in the lack of clear mapping between input and
> output, and surely this is an intentional part of the design, not an 
> accidental artifact of the use of macros to implement it.
> 
> > - it is non-reversible. You cannot go from an interface back to an editable
> >   description (-> a macro form that can be changed and evaluated again).
> 
> That's because you're asking it to undo a full set of  design choices.
> CLIM's charter is not to manage a well-known design, it is to create 
> a design and then manage it.  Of course, in general, that process is
> not undoable. If this is your desire, you don't want CLIM.  And it doesn't
> matter if CLIM is expressed in macros or not.

If I do something procedurally with objects:

(create-window ...)
(add-button ...)
(find-button ...)
(remove-button ...)

I can manipulate these objects.

If I do

(defwindow ...
  (layout (button ...))

And later I do it again with a different definition, I want the
UIMS to be able to stay in a clean state ... having just the
variables/methods/classes/etc. that are needed then.
Otherwise I'm polluting my Lisp system with all kinds of
no-longer-needed artifacts that will potentially be a problem
later. Solution: if you have some unexplainable effects,
restart the Lisp. That's hackery.
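
A hedged sketch of what such bookkeeping could look like (all names
here are hypothetical, not from any real UIMS): the defining macro
records what it generated, so a redefinition can clean up first.

```lisp
;; Sketch: a defining macro that tracks generated artifacts so a
;; redefinition removes stale ones instead of leaving them around.
(defvar *window-artifacts* (make-hash-table))  ; window name -> function names

(defmacro defwindow (name &body layout)
  (declare (ignore layout))  ; a real UIMS would process the layout
  (let ((maker (intern (format nil "MAKE-~A" name))))
    `(progn
       ;; Undo whatever a previous DEFWINDOW of this name generated.
       (dolist (fn (gethash ',name *window-artifacts*))
         (when (fboundp fn) (fmakunbound fn)))
       (setf (gethash ',name *window-artifacts*) (list ',maker))
       (defun ,maker () (list :window ',name))
       ',name)))
```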

> > - it introduces all kinds of side-effects that are non-reversible or where
> >   there is no dependency tracking. Example: you define a frame and several
> >   methods will get generated. Later you change that frame definition
> >   and all kinds of no longer needed methods will NOT be removed.
> 
> These are criticisms of the design of CLIM and/or statements that you are
> not CLIM's proposed client.
> 
> The same criticisms would be true of a consulting organization that you hire 
> to maintain your web pages and that use Javascript as a solution when you
> only know PHP, or that use server-side when you might want client-side.
> When you don't specify how the organization does something, it uses its 
> choice, not yours.  If you have a design constraint you want to introduce,
> you can indeed do so, but CLIM is a consulting organization that doesn't
> offer you that choice.  Different CLIMs may solve the problem differently,
> but the whole idea is to make it easy for people who don't want to do certain
> kinds of things, not for people who do.  I know this because I've done some
> web hosting, and what I offer to people who don't understand HTML is VERY
> different than I would offer to people who do... in fact, I just tell people
> who do want their own maintenance that I won't support them because I'm not
> set up to do that kind of hands-on stuff efficiently--it means someone else
> could be introducing bugs I'm not aware of and that I'm not billing for, and
> I just don't want the headache.  CLIM has this identical problem.  It gets
> power out of not cooperating with you, and offers you something in return.
> If you know how to offer that power without that cost, make a new window
> system that does both. I think that's great.  But I don't think the use or
> non-use of macros will make the difference in your ability to succeed.
> What will matter is your coming up with an intelligible collaboration/design
> model.
> 
> > - the compile-time debugging of such a frame definition is black art.
> 
> When you don't expose how something works, indeed debugging it at the
> frame level is hard.  
> 
> I'm not 100% sure this is made better by not using macros since all that
> does is expose implementation.  If you don't expose all the functions you
> use, then you still have unexplained stack frames.  If you do expose them,
> then the value you have offered the user is completely documenting your
> implementation, not your use of macros.
> 
> > - There is no idea what the frame definition macro does and so it is
> >   hard to replicate it by other people in their implementation.
> >   The various CLIM implementations had several incompatible versions
> >   of the frame definition macro, where you can get to a working
> >   usage of it only by trial and error.
> 
> Already answered.
>  
> > - it wires dependency into the resulting code. You can redefine a
> >   function in a patch and the effect will be visible. You can redefine
> >   a macro in a patch and the change will NOT be visible in
> >   user level code, since the user then needs to compile his code
> >   and it is non-trivial to track dependency and undo/redo
> >   side-effects.
> 
> It depends on how you design the system. This is not intrinsic to macros.
> That is, you can write macros that don't have this problem (e.g., if
> all the macros can be expressed as functions, you can use the CALL-WITH-
> technology I mentioned above).

Right, and then you double your macros into functions, which I don't
find too useful, since I often could live with just the functions
alone.
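
The doubling can be kept cheap, though: put all patchable behavior in
the function and leave the macro as a one-line veneer, in the spirit of
Kent's earlier WITH-OPEN-FILE example (sketched here with a starred
name so as not to shadow the standard macro):

```lisp
;; All behavior lives in the function: patching CALL-WITH-OPEN-FILE
;; takes effect without recompiling callers of the macro.
(defun call-with-open-file (thunk file)
  (let ((stream nil))
    (unwind-protect
         (progn (setq stream (open file))
                (funcall thunk stream))
      (when stream (close stream)))))

;; The macro is only syntax; it never needs patching for behavior.
(defmacro with-open-file* ((var file) &body body)
  `(call-with-open-file #'(lambda (,var) ,@body) ,file))
```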

> If you can't convert solely to that, and if
> macros are doing something truly syntactic, I don't see how open-coding that
> without macros is going to make your code cleaner.  If everything a macro
> collapses into one form has to be done as three consecutive magic words,
> I don't see how inlining those three magic words makes things better. You
> still have to patch all the uses whether you write
>  (FOO) (BAR) (BAZ)
> or 
>  (MACROLOGICAL-FOO-BAR-BAZ)
> If you have made your code depend on foo, bar, and baz happening in order
> and you change it to (FOO) (BAZ) (BAR) or (FOO) (QUUX) (BAZ), either way you
> still have to recompile the same number of forms.  The mention of macros
> seems a red herring.

Why not (functional-foo-bar-baz)?
> 
> > - the macro definition is not tool friendly. Frame-Up for example
> >   could easily generate such a frame description, but you couldn't
> >   take one and edit it in Frame-Up.
> 
> That's just a weakness in Frame-Up, but if you believe that you CAN edit 
> a procedural description then
>  (a) you're not realizing how much users can muck that up 
> or
>  (b) you plan to not 'understand' the user's description but just execute
>      it blindly.
> The latter strategy has the minor problem that if there are side-effects
> (file deletions, and worse) in the setup code, you can't detect them, and
> you might really injure the user by executing them in your editor.
> 
> I just don't see that you're offering anything better, nor do I think 
> criticizing an implementation as if it were the representative of all
> implementations is well-founded.
> 
> > So from a Lisp hacking perspective the frame definition macro
> > is very cool, but from a software engineering perspective (taking
> > into account maintainability) it is very questionable, since it does
> > only barely the job and is not developed for robustness and other
> > demands. Unfortunately there is no established 
> 
> documented

This is the most important word; it's the one that made me answer this post. ;-)

How can we get it documented? I'd like to see wizards' best practices
and anti-patterns documented for the Lisp community, so that
we can point Lisp developers to it and say: if you are writing
macros, read this first. And no, Paul Graham's book isn't it.
I'm talking about other code styles that are barely touched
in his book.

> 
> > best practice how
> > to write maintainable macros at that scope and level. I'm not
> > talking about the kind of macros that transform a LAMBDA into a LET
> > expression.
> > I'm talking about the stuff that has been used in the Lisp machine software
> > like DEFWHOPPERs/WRAPPERs or like the complex macros to define
> > application frames (or what it was called) and similar.
> 
> I stand by my earlier guess that some design is good and bad, independent
> of the use of macros.
>
> DEFWHOPPER and DEFWRAPPER have pretty simple definitions that are probably
> in many cases as conceptually simple as LET when it comes to the issue of
> syntax.  Most of what they do is functional.  If one wrote
>  (ADD-WHOPPER #'(LAMBDA (X) ...) 'TV:FOO-FLAVOR)
> would that make there suddenly not be an issue?  I think not.
> 
> Hence I am inclined to exonerate macros here.
> 
> > The stuff that
> > has been written with heavy macrology has not been able to evolve
> > (that's what I see). Why was that?
> 
> Business priorities.  And in that I include both:
> 
>  * lack of will to do what is possible (business planner is confused)
> 
>  * lack of desire to do what is possible (business planner has a 
>      legitimately different goal and so doesn't want this as starting point)
> 
> > Are there other reasons, or is
> > there some misuse of macros, some anti-patterns implemented?
> 
> A reasonable summary question for you to end on, but I'm not going to
> summarize here. Hopefully the above counts as my answer to this question.
> 
> I appreciate your taking the time to detail an answer.  Even if we
> disagree, I hope you've found my response to have enough substance
> that it was worthwhile for you to do all this detail.  It's certainly
> a matter about which reasonable people could disagree, but I'll be
> interested to see if you find any of my arguments above to be
> persuasive.

Thanks for your time Kent.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uslz1yo0u.fsf@nhplace.com>
Rainer Joswig <······@lisp.de> writes:

> I'm not going to much into detail but want to follow-up to some
> points. Obviously I'm a non-native speaker so it is a bit more
> effort to craft an understandable answer. ;-)

I've sometimes myself tried to respond to email from people in
languages that are not my native language (Spanish or Portuguese
mostly, which I read better than I speak these days since those
languages aren't used much in my geographical area) and I can fully
appreciate the difficulty.

Let me take this opportunity to thank you and others who daily put up
with this need for speaking in "Federation Standard" (Star Trek's name
for English, for those who aren't fans of the show).  I always count
myself lucky that I get to use my native dialect in discussion without
the need for translation, and sometimes feel almost embarrassed about
how little effort that is.  But I try never to forget that others
don't have it so easy, and never to leave them feeling like they must
apologize for what's already an amazing collective effort.

Now, back to topic, I just wanted to reply to one detail of your post
right now while I'm thinking of it:

> > If you can't convert solely to that, and if macros are doing
> > something truly syntactic, I don't see how open-coding that
> > without macros is going to make your code cleaner.  If everything
> > a macro collapses into one form has to be done as three
> > consecutive magic words, I don't see how inlining those three
> > magic words makes things better. You still have to patch all the
> > uses whether you write
> >  (FOO) (BAR) (BAZ)
> > or 
> >  (MACROLOGICAL-FOO-BAR-BAZ)
> > If you have made your code depend on foo, bar, and baz happening in order
> > and you change it to (FOO) (BAZ) (BAR) or (FOO) (QUUX) (BAZ), either way you
> > still have to recompile the same number of forms.  The mention of macros
> > seems a red herring.
> 
> Why not (functional-foo-bar-baz)?

Because the things I'm talking about often depend on having access to
lexically apparent variables.  You can sometimes do it if you use
call-by-reference or pointer-passing, but even then you have some
complicated (data)typing issues.  C++ gets around this with the
complexity of templates, if you don't mind programming in a wholly
different language and you don't mind the overhead that templates
create.  In Java, handling CORBA footholds is very hard to abstract.
In Lisp, it's a lot more natural, IMO.
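
A small illustration of the lexical-access point: a macro expands in
the caller's scope and can assign the caller's variables, which no
function call can do (a trivial sketch, not a real-world macro):

```lisp
;; SWAPF expands into code that names the caller's places directly;
;; a function in its position would receive only the values 1 and 2.
(defmacro swapf (a b)
  `(rotatef ,a ,b))

(let ((x 1) (y 2))
  (swapf x y)
  (list x y))   ; => (2 1)
```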
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uacl9as89.fsf@news.dtpq.com>
Rainer Joswig <······@lisp.de> writes:
> > So, respectfully, I dispute your apparent assumption that it was not
> > possible, and I suggest instead that what you infer as lack of possibility
> > might be equally well explained by business complications.
> 
> Might it also be the case that the management level was having
> reasons to not follow this route that are based on
> experience with done projects? Too costly, too late,
> too complicated, ...? Just asking.

I'll take this one.

"No."
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uirzxymwi.fsf@nhplace.com>
······@news.dtpq.com (Christopher C. Stacy) writes:

> Rainer Joswig <······@lisp.de> writes:
> > > So, respectfully, I dispute your apparent assumption that it was not
> > > possible, and I suggest instead that what you infer as lack of possibility
> > > might be equally well explained by business complications.
> > 
> > Might it also be the case that the management level was having
> > reasons to not follow this route that are based on
> > experience with done projects? Too costly, too late,
> > too complicated, ...? Just asking.
> 
> I'll take this one.
> 
> "No."

I second Chris's analysis.

You might wish the management was well-informed in the issues of language
and whatnot, but they weren't.  In some cases, they were just fans of 
the hardware, or as the Republican Right Wing likes to put it, had been
drinking too much of the kool-aid.  They were just caught up in their own
belief that the hardware was the thing.  

Even when it started to be obvious that it wasn't, they didn't have
the cash flow to change.  The company was built on selling iron, and
they feared that as soon as it started selling software, 80% of the
company would be useless. The problem was that the 20% that would remain
needed the cash flow from the continued sale of iron to get to a next
phase.  So they were
afraid to hint at software-only because they feared they would not
live to the next quarter.  Similar problems happen if people with a
genuine desire to lose weight try to do so too quickly--the body can
only adapt so quickly and must move slowly if it is to survive.
That's one reason (of probably several) why liposuction is so dangerous.
It takes more than a good end goal to transition a company to a new product
or a new product positioning.

And the company was founded on hardware, so the people in control were
people who were experts in that.  Getting a new regime in place was
hard.

Appealing to detailed issues of macrology's presence or absence is
going about 2 or 3 abstraction levels below where company funding was
flowing and projects were being decided.  I doubt that the people
making the decisions about these projects ever saw a macro or a
function or could tell you the difference.  There were probably a few
advisors that could have helped a little in this.

I mean, geez, a lot of what brought Symbolics down was bad deals on
real estate.  They took a ten-year lease on office space for 1000
people just before dropping to 200 people.  The remaining commitment
on high-rent office space was a problem of catastrophic proportions,
and that DEFINITELY had nothing to do with macrology.
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87br5of9n6.fsf@plato.moon.paoloamoroso.it>
Kent M Pitman <······@nhplace.com> writes:

> Not Emacs) and Zwei (Zwei Was Eine Initially).]  It still used TV
> windows, and needed to use CLIM instead, but I considered that a

By the way, I always wondered what "TV" stands for in the Symbolics TV
package name.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7jgc2l6h.fsf@news.dtpq.com>
Paolo Amoroso <·······@mclink.it> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > Not Emacs) and Zwei (Zwei Was Eine Initially).]  It still used TV
> > windows, and needed to use CLIM instead, but I considered that a
> 
> By the way, I always wondered what "TV" stands for in 
> the Symbolics TV package name.

Not that long ago, there were no bitmapped displays.
Stanford, and then MIT (Tom Knight), built some of
the very first such systems, which were called "TV"
(as in "television") systems.   The Lisp Machine
first came along only a little while after that.
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87zmt8dt86.fsf@plato.moon.paoloamoroso.it>
······@news.dtpq.com (Christopher C. Stacy) writes:

> Not that long ago, there were no bitmapped displays.
> Stanford, and then MIT (Tom Knight), built some of
> the very first such systems, which were called "TV"
> (as in "television") systems.   The Lisp Machine
> first came along only a little while after that.

Cool, I suspected that :)


Paolo
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <873br1kzjs.fsf@thalassa.informatimago.com>
Rainer Joswig <······@lisp.de> writes:
> For me macros are violating principles like late-binding. They
> are creating static structures that are no longer evolvable. They
> make too much happen behind the scenes in ways that are non-obvious.

Not necessarily.  There are good uses of macros and bad uses of macros.
ISTM that macros allow programmers to write in a more declarative
style.  One should not be too concerned with the implementation of
macros, but should understand the meaning of the macro "calls".  If
you do this, you'll see that code full of macro "calls" becomes very
readable: you're reading the abstractions instead of trying to see
low-level stuff.


> Take for example an interface builder type of application. If you
> represent the interface objects as first class objects you can
> modify the interface with an interface builder, since you have
> an explicit representation of it and you can work on top of that.
>
> Now take DW or CLIM. One of the main drivers for the interface is the
> frame definition macro. It has several drawbacks:
>
> - it does have extremely poor checking of the input and does not provide
>   you with useful ideas why a certain description produces results
>   that are not desirable (layout problems, description grammar violations,
>   ...).
>
> - it is non-reversible. You cannot go from an interface back to an editable
>   description (-> a macro form that can be changed and evaluated again).
>
> - it introduces all kinds of side-effects that are non-reversible or where
>   there is no dependency tracking. Example: you define a frame and several
>   methods will get generated. Later you change that frame definition
>   and all kinds of no longer needed methods will NOT be removed.
>
> - the compile-time debugging of such a frame definition is black art.
>
> - There is no idea what the frame definition macro does and so it is
>   hard to replicate it by other people in their implementation.
>   The various CLIM implementations had several incompatible versions
>   of the frame definition macro, where you can get to a working
>   usage of it only by trial and error.
>
> - it wires dependency into the resulting code. You can redefine a
>   function in a patch and the effect will be visible. You can redefine
>   a macro in a patch and the change will NOT be visible in
>   user level code, since the user then needs to compile his code
>   and it is non-trivial to track dependency and undo/redo
>   side-effects.
>
> - the macro definition is not tool friendly. Frame-Up for example
>   could easily generate such a frame description, but you couldn't
>   take one and edit it in Frame-Up.

All this might be true.  But I think the important criterion is
whether the user interfaces written with these macros are expressed
in a more abstract, higher-level language.  If this language is well
defined, you should be able to write a new implementation of these
macros that removes all these defects, generating objects instead of
"hard-coded" side effects.


I've got this html401.lisp file full of macro "calls":

(DEFELEMENT A          ()  "anchor")
(DEFELEMENT ABBR       ()  "abbreviated form (e.g., WWW, HTTP, etc.)")
(DEFELEMENT ACRONYM    ())
(DEFELEMENT ADDRESS    ()                        "information on author")
(DEFELEMENT APPLET     (:DEPRECATED :LOOSE-DTD)  "Java applet")
(DEFELEMENT AREA       (:END-FORBIDDEN :EMPTY)   "client-side image map area")
(DEFELEMENT B          ()                        "bold text style")
(DEFELEMENT BASE       (:END-FORBIDDEN :EMPTY)   "document base URI")
(DEFELEMENT BASEFONT   (:END-FORBIDDEN :EMPTY :DEPRECATED :LOOSE-DTD)
  "base font size");;BASEFONT
...

(DEFATTRIBUTE ABBR 
  (TD TH)
  (%TEXT)  :IMPLIED
  ()  "abbreviation for header cell"
  );;ABBR

(DEFATTRIBUTE ACCEPT-CHARSET
  (FORM)
  (%CHARSETS)  :IMPLIED
  ()  "list of supported charsets"
  );;ACCEPT-CHARSET

(DEFATTRIBUTE ACCEPT 
  (FORM INPUT)
  (%CONTENTTYPES)  :IMPLIED
  ()  "list of MIME types for file upload"
  );;ACCEPT
...

Actually these macros are not defined in this file.  Two other lisp
files contain different definitions of these macros, then load this
html401 file to let the macros generate (at compilation time)
different things. (One is an HTML parser, the other an HTML generator.)

If I needed run-time processing, I could also just READ this file at
run-time and process these s-expressions, building any run-time object
I'd need from them.

The focus should not be on the defmacro side, but on the macro "call" side.

If the macro "call" side is well defined, that is if you're defining a
good language, then the defmacro side is no more interesting than the
internals of your Common Lisp implementation. 
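A minimal sketch of that two-consumer pattern (file names, variables, and the expansions themselves are hypothetical, not the actual parser/generator code):

```lisp
;; html-parser.lisp -- one consumer of html401.lisp
(defparameter *empty-elements* '())

(defmacro defelement (name options &optional documentation)
  (declare (ignore documentation))
  `(when (member :empty ',options)
     (pushnew ',name *empty-elements*)))

;; html-generator.lisp -- a second consumer redefines the same macro:
;;
;; (defmacro defelement (name options &optional documentation)
;;   (declare (ignore options documentation))
;;   `(defun ,(intern (format nil "EMIT-~A" name)) (stream)
;;      (format stream "<~(~A~)>" ',name)))
;;
;; Each file then does (load "html401.lisp"), so the same macro "calls"
;; expand into parser tables in one image and emitter functions in the other.
```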


> So from a Lisp hacking perspective there the frame definition macro
> is very cool, but from a software engineering perspective (taking
> into account maintainability) it is very questionable, since it does
> only barely the job and is not developed for robustness and other
> demands. Unfortunately there is no established best practice how
> to write maintainable macros at that scope and level. I'm not
> talking about the kind of macros that transform a LAMBDA into a LET
> expression.
> I'm talking about the stuff that has been used in the Lisp machine software
> like DEFWHOPPERs/WRAPPERs or like the complex macros to define
> application frames (or what it was called) and similar. The stuff that
> has been written with heavy macrology has not been able to evolve
> (that's what I see). Why was that? Are there other reasons, or is
> there some misuse of macros, some anti-patterns implemented?

It would be interesting to be able to study these sources, indeed.


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
You never feed me.
Perhaps I'll sleep on your face.
That will sure show you.
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wwtodurln.fsf@hundertwasser.ti.uni-mannheim.de>
Pascal Bourguignon <···@informatimago.com> writes:

> Rainer Joswig <······@lisp.de> writes:
> > For me macros are violating principles like late-binding. They
> > are creating static structures that are no longer evolvable. They
> > make too much go on behind the scenes that is non-obvious.
> 
> Not necessarily.  There's good uses of macros and bad uses of macros.
> ISTM that macros allow the programmer to write in a more declarative
> style.  One should not be concerned too much with the implementation of
> macros, but should understand the meaning of the macro "calls".  If
> you do this, you'll see that code full of macro "calls" becomes very
> readable: you're reading the abstractions instead of trying to see low
> level stuff.

That's true until an error occurs or something needs to be changed.

Be it a buggy or incomplete macro or be it a wrong macro call: The
macro abstraction breaks down completely.  The user gets a low-level
error message exposing all the internals of the macro call.  That's
not optimal if your user is another Lisp programmer. It's unacceptable
if your user is not a programmer.

It would be nice if macros could provide 'more complete' abstractions
in the sense that if I make a syntax error or pass arguments that
make no sense I get a meaningful, high-level error message.  Currently
this probably would be very tedious to implement.  An area for
innovation? ;-)

> The focus should not be on the defmacro side, but on the macro "call" side.
> 
> If the macro "call" side is well defined, that is if you're defining a
> good language, then the defmacro side is no more interesting than the
> internals of your Common Lisp implementation. 

If I make an error in a CL call a good implementation can provide
meaningful error messages.  Macros typically don't do that.  That's an
important difference, IMHO.

Matthias
From: Cameron MacKinnon
Subject: Macros and error handling (was McCarthy denounces...)
Date: 
Message-ID: <UYGdnWwKzsXRW1_fRVn-oQ@rogers.com>
Matthias wrote:

> Be it a buggy or incomplete macro or be it a wrong macro call: The 
> macro abstraction breaks down completely.  The user gets a low-level 
> error message exposing all the internals of the macro call.  That's 
> not optimal if your user is another Lisp programmer. It's
> unacceptable if your user is not a programmer.

I guess if the macro writer doesn't bother with any error checking, the
results of errors are likely to be propagated down to the CL level.
Garbage in, garbage out?

Sending programmer specific error messages to non programmers is never
"acceptable" to me, though it does appear to be the gold standard in web
applications. Lisp programmers have great control over what happens when
errors are encountered, and over where (if anywhere) the debugger pops
up. Do you subject your end-users to the Lisp error messages and
debugger for ordinary code that isn't infested with macros?

> It would be nice if macros could provide 'more complete' abstractions
>  in the sense that if I make a syntax error or pass arguments that 
> make no sense I get a meaningful, high-level error message.
> Currently this probably would be very tedious to implement.  An area
> for innovation? ;-)

So if I were to write a new domain specific language using macros with
no diagnostics or error checking, my Lisp implementation would add the
checks and the (domain specific, tailored to the end user) error
messages automatically? There could be a few good CS papers in that,
were you to solve it.


-- 
Cameron MacKinnon
Toronto, Canada
From: jayessay
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3vf3xukw4.fsf@rigel.goldenthreadtech.com>
Matthias <··@spam.please> writes:

> Pascal Bourguignon <···@informatimago.com> writes:
> 
> > Rainer Joswig <······@lisp.de> writes:
> > > For me macros are violating principles like late-binding. They
> > > are creating static structures that are no longer evolvable. They
> > > make too much go on behind the scenes that is non-obvious.
> > 
> > Not necessarily.  There's good uses of macros and bad uses of macros.
> > ISTM that macros allow the programmer to write in a more declarative
> > style.  One should not be concerned too much with the implementation of
> > macros, but should understand the meaning of the macro "calls".  If
> > you do this, you'll see that code full of macro "calls" becomes very
> > readable: you're reading the abstractions instead of trying to see low
> > level stuff.
> 
> That's true until an error occurs or something needs to be changed.
> 
> Be it a buggy or incomplete macro or be it a wrong macro call: The

There's no difference there to what happens if you have such a bug in
the base language implementation.  All you are really saying here is
that it is hard to do "domain languages" well.  Shrug.  Who would
claim otherwise?

> It would be nice if macros could provide 'more complete' abstractions
> in the sense that if I make a syntax error or pass arguments that
> make no sense I get a meaningful, high-level error message.  Currently
> this probably would be very tedious to implement.  An area for
> innovation? ;-)

Yes.  If you could build "auto-code-walkers" that would work across
application spaces, then you could get a lot of the futsy nutsy boltsy
stuff done once, done well, and done reusably.  This requires _at
least_ the environment type stuff that Franz has done.


> > If the macro "call" side is well defined, that is if you're defining a
> > good language, then the defmacro side is no more interesting than the
> > internals of your Common Lisp implementation. 
> 
> If I make an error in a CL call a good implementation can provide
> meaningful error messages.  Macros typically don't do that.

But that is just because the author hasn't done the work.  The _exact_
same thing can happen with a "bad" implementation of the base
language.

>  That's an important difference, IMHO.

There is basically zero difference in _kind_.  It is all about
resources to get the domain language up to a similar level of robust
implementation.  It's worth noting that there are probably plenty of
cases where such a level is neither cost effective nor worthwhile.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-CA1A61.18235129062005@news-europe.giganews.com>
In article <··············@thalassa.informatimago.com>,
 Pascal Bourguignon <···@informatimago.com> wrote:

> I've got this html401.lisp file full of macro "calls":
> 
> (DEFELEMENT A          ()  "anchor")
> (DEFELEMENT ABBR       ()  "abbreviated form (e.g., WWW, HTTP, etc.)")
> (DEFELEMENT ACRONYM    ())
> (DEFELEMENT ADDRESS    ()                        "information on author")
> (DEFELEMENT APPLET     (:DEPRECATED :LOOSE-DTD)  "Java applet")
> (DEFELEMENT AREA       (:END-FORBIDDEN :EMPTY)   "client-side image map area")
> (DEFELEMENT B          ()                        "bold text style")
> (DEFELEMENT BASE       (:END-FORBIDDEN :EMPTY)   "document base URI")
> (DEFELEMENT BASEFONT   (:END-FORBIDDEN :EMPTY :DEPRECATED :LOOSE-DTD)
>   "base font size");;BASEFONT
> ...
> 
> (DEFATTRIBUTE ABBR 
>   (TD TH)
>   (%TEXT)  :IMPLIED
>   ()  "abbreviation for header cell"
>   );;ABBR
> 
> (DEFATTRIBUTE ACCEPT-CHARSET
>   (FORM)
>   (%CHARSETS)  :IMPLIED
>   ()  "list of supported charsets"
>   );;ACCEPT-CHARSET
> 
> (DEFATTRIBUTE ACCEPT 
>   (FORM INPUT)
>   (%CONTENTTYPES)  :IMPLIED
>   ()  "list of MIME types for file upload"
>   );;ACCEPT
> ...
> 
> Actually these macros are not defined in this file.

Do you have an idea why any of these are macros?

Besides the fact that you can get rid of a quoting level?

Is there really anything that makes you need to have those
as macros? I can think of only one reason. Have you made
a conscious choice, and do you know why it is a macro?

What does it buy you to write

(DEFELEMENT A          ()  "anchor")

instead of

(define-element 'a '() '"anchor")

and have DEFINE-ELEMENT being a function?

Recently I've gone away from

(defpage foo :title "foo" :menu ...)

to just

(make-page 'foo :title "foo" :menu ...) 

and wrote a couple of functions to work over first class descriptions
and first class objects, instead of using macros to do the machinery.
For me the functional abstraction approach is much simpler and
makes debugging/tracing/... much easier.
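That functional style can be sketched like this (the hash-table registry is a hypothetical stand-in for the real machinery):

```lisp
(defvar *elements* (make-hash-table))

(defun define-element (name options documentation)
  ;; Plain function: arguments are always evaluated, it can be TRACEd,
  ;; and a redefinition takes effect without recompiling any callers.
  (setf (gethash name *elements*)
        (list :options options :doc documentation)))

(define-element 'a '() "anchor")
(define-element 'base '(:end-forbidden :empty) "document base URI")
```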
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <mzp9f3wm.fsf@ccs.neu.edu>
Rainer Joswig <······@lisp.de> writes:

> What does it buy you to write
>
> (DEFELEMENT A          ()  "anchor")
>
> instead of
>
> (define-element 'a '() '"anchor")
>
> and have DEFINE-ELEMENT being a function?

I think the first form is just a tad clearer than the second.  If I
were to give a much junior programmer the job of adding a few more
elements to the file (say I told him `add the P and HR elements'), I'd
feel a bit more comfortable with the first form because it is just
*so* damn obvious what to do.

The second form is *almost* as clear, but those little tick marks
would be easy to forget, and since some of them are superfluous, there
would be a temptation for more experienced programmers to not use
them:

(define-element 'a     ()                      "anchor")
(define-element 'base '(:end-forbidden :empty) "document base URI")

A junior programmer might not grasp where the quote is necessary
and where it is optional.  The macro form doesn't have that ambiguity.

> Recently I've gone away from
>
> (defpage foo :title "foo" :menu ...)
>
> to just
>
> (make-page 'foo :title "foo" :menu ...) 
>
> and wrote a couple of functions to work over first class descriptions
> and first class objects, instead of using macros to do the machinery.
> For me the functional abstraction approach is much simpler and
> makes debugging/tracing/... much easier.

*Much* easier?  

 (defmacro defpage (name &rest crap)
   `(make-page ',name ,@crap))

That's going to be indistinguishable from the pure functional approach
at debugging time.


I do tend to favor functions and first-class objects (closures and
thunks) to perform the computation, but I think using macros at the
top-level for `definition-like' things can greatly improve the
readability of the code.  Macros that expand into the equivalent
function calls are about the best of both worlds.
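That "best of both worlds" pattern, sketched on the DEFPAGE example (MAKE-PAGE's body here is hypothetical):

```lisp
(defvar *pages* (make-hash-table))

(defun make-page (name &rest options)
  ;; The function does all the work and is what TRACE and the
  ;; debugger will show at runtime.
  (setf (gethash name *pages*) options))

(defmacro defpage (name &rest options)
  ;; The macro is pure sugar: all it does is quote the name.
  `(make-page ',name ,@options))

;; (defpage foo :title "foo") expands into (make-page 'foo :title "foo")
```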
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8B0F6.E547%joswig@lisp.de>
Am 29.06.2005 19:50 Uhr schrieb "Joe Marshall" unter <···@ccs.neu.edu> in
············@ccs.neu.edu:

> Rainer Joswig <······@lisp.de> writes:
> 
>> What does it buy you to write
>> 
>> (DEFELEMENT A          ()  "anchor")
>> 
>> instead of
>> 
>> (define-element 'a '() '"anchor")
>> 
>> and have DEFINE-ELEMENT being a function?
> 
> I think the first form is just a tad clearer than the second.

That's hard to believe for me. ;-)

>  If I
> were to give a much junior programmer the job of adding a few more
> elements to the file (say I told him `add the P and HR elements'), I'd
> feel a bit more comfortable with the first form because it is just
> *so* damn obvious what to do.

I have the experience that the additional possibility of not
having to quote makes it for the newbie actually worse.
For a newbie it is easy to understand that a function
needs quoted (or self-evaluating) data. But where in
a macro you quote and where not can be tricky.
Again my example is the frame definition macro in
CLIM where you can declaratively specify a layout and objects
and you can also have some form of real or semi/real
calls to generate objects in some forms. Arrgh.

> The second form is *almost* as clear, but those little tick marks
> would be easy to forget, and since some of them are superfluous, there
> would be a temptation for more experienced programmers to not use
> them:
> 
> (define-element 'a     ()                      "anchor")
> (define-element 'base '(:end-forbidden :empty) "document base URI")
> 
> A junior programmer might not grasp where the quote is necessary
> and where it is optional.  The macro form doesn't have that ambiguity.

And it adds the ambiguity of what is code and what is data. Which is worse.
You simply don't see in the source just by visual inspection where
stuff gets evaluated and where not. You need to do something
to find out (or know it in advance, but the programmer's brain
is limited). With the pure functional approach it is clear:
Args are always evaluated.

>> Recently I've gone away from
>> 
>> (defpage foo :title "foo" :menu ...)
>> 
>> to just
>> 
>> (make-page 'foo :title "foo" :menu ...)
>> 
>> and wrote a couple of functions to work over first class descriptions
>> and first class objects, instead of using macros to do the machinery.
>> For me the functional abstraction approach is much simpler and
>> makes debugging/tracing/... much easier.
> 
> *Much* easier?  
> 
>  (defmacro defpage (name &rest crap)
>    `(make-page ',name ,@crap))
> 
> That's going to be indistinguishable from the pure functional approach
> at debugging time.

No, I don't see the defpage in the debugger at runtime. Additionally
I won't be easily able to find the generating macro, redefine it
and rerun the code. Which makes it bad for any non-trivial example.
I can redefine the function and rerun it with the same or different
args in most cases. I can trace the function. Tracing the macro
gives me nothing at runtime. Stepping the macro code gives
me additional expansions all the time. You are just adding
things where you simply don't need it and they only get
in the way. If I see the stuff in the debugger and I want to change it,
I find the definition, change it, compile it, done. Oops, now
I have to find the macro, change the expansion, test the expansion,
compile the places where the macros are being used, set back to some
sensible place and redo the computation. What was a simple
task just to change a function becomes a complex multistep
action with lots of possibilities to make errors. And keep
in mind that my team has no Joe Marshall and no Kent Pitman. But
a Rainer Joswig or some other middle class hacker that can hardly
implement the domain right and does not even know much about
these issues.

> I do tend to favor functions and first-class objects (closures and
> thunks) to perform the computation, but I think using macros at the
> top-level for `definition-like' things can greatly improve the
> readability of the code.  Macros that expand into the equivalent
> function calls are about the best of both worlds.

No, it is worse. Now you have to maintain two argument lists with different
argument-checking capabilities and facilities. Plus
you have an additional expansion phase, you lose the link from
the code in the debugger to the source code and so on.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <64vxezh5.fsf@ccs.neu.edu>
Rainer Joswig <······@lisp.de> writes:

> Am 29.06.2005 19:50 Uhr schrieb "Joe Marshall" unter <···@ccs.neu.edu> in
> ············@ccs.neu.edu:
>
>> Rainer Joswig <······@lisp.de> writes:
>> 
>>> What does it buy you to write
>>> 
>>> (DEFELEMENT A          ()  "anchor")
>>> 
>>> instead of
>>> 
>>> (define-element 'a '() '"anchor")
>>> 
>>> and have DEFINE-ELEMENT being a function?
>> 
>> I think the first form is just a tad clearer than the second.
>
> That's hard to believe for me. ;-)
>
>> If I were to give a much junior programmer the job of adding a few
>> more elements to the file (say I told him `add the P and HR
>> elements'), I'd feel a bit more comfortable with the first form
>> because it is just *so* damn obvious what to do.
>
> I have the experience that the additional possibility of not
> having to quote makes it for the newbie actually worse.
> For a newbie it is easy to understand that a function
> needs quoted (or self-evaluating) data. But where in
> a macro you quote and where not can be tricky.
> Again my example is the frame definition macro in
> CLIM where you can declaratively specify a layout and objects
> and you can also have some form of real or semi/real
> calls to generate objects in some forms. Arrgh.

Let me qualify this a bit.  In the *specific* example:

>>> (DEFELEMENT A          ()  "anchor")
>>> 
>>> instead of
>>> 
>>> (define-element 'a '() '"anchor")

I find the single quote in front of the string to be very odd looking,
and I don't like the quoted `A' because I prefer to think that I'm
binding some identifier in some context rather than associating a
symbol with something.  For the same reason I write

 (defun foo (x) (+ x 3))

rather than

 (setf (symbol-function 'foo) (lambda (x) (+ x 3)))

I'm not familiar with the frame definition macro in CLIM but I bet it
is *lot* more complicated than the DEFELEMENT macro.

So I'm advocating `define-foo' style macros *only* for those places
where it is a *really* simple, obvious, straightforward macro.

With the DEFELEMENT macro, where there is a block of definitions such
as this:

 (defelement a    ()                      "anchor")
 (defelement base (:end-forbidden :empty) "document base URI")

You could even get a C++ programmer to add a P or HR element without
screwing up.

>> A junior programmer might not grasp where the quote is necessary
>> and where it is optional.  The macro form doesn't have that ambiguity.
>
> And it adds the ambiguity of what is code and what is data. Which is worse.
> You simply don't see in the source just by visual inspection where
> stuff gets evaluated and where not. You need to do something
> to find out (or know it in advance, but the programmer's brain
> is limited). With the pure functional approach it is clear:
> Args are always evaluated.

I agree in the general case, but in the case above, I'm thinking of a
programmer that doesn't even have a clear idea of what evaluation or
quoting is.  He just sees the obvious pattern and mindlessly copies
it.

It doesn't matter what is code and what is data.  What matters is that
DEFELEMENT does the trick, the name comes next, a parenthesized list
of options (possibly empty) and a string.  How does it work?  I
dunno.  What's evaluated?  Doesn't matter.

But I would be very much against this sort of macro for anything where
you'd be interested in putting in some arbitrary lisp code.

>>> Recently I've gone away from
>>> 
>>> (defpage foo :title "foo" :menu ...)
>>> 
>>> to just
>>> 
>>> (make-page 'foo :title "foo" :menu ...)
>>> 
>>> and wrote a couple of functions to work over first class descriptions
>>> and first class objects, instead of using macros to do the machinery.
>>> For me the functional abstraction approach is much simpler and
>>> makes debugging/tracing/... much easier.
>> 
>> *Much* easier?  
>> 
>>  (defmacro defpage (name &rest crap)
>>    `(make-page ',name ,@crap))
>> 
>> That's going to be indistinguishable from the pure functional approach
>> at debugging time.
>
> No, I don't see the defpage in the debugger at runtime. Additionally
> I won't be easily able to find the generating macro, redefine it
> and rerun the code. Which makes it bad for any non-trivial example.

Ok.  I would probably use the macro *only* for the trivial examples
(where I think it wins).

> I can redefine the function and rerun it with the same or different
> args in most cases. I can trace the function. Tracing the macro
> gives me nothing at runtime. Stepping the macro code gives
> me additional expansions all the time. You are just adding
> things where you simply don't need it and they only get
> in the way. If I see the stuff in the debugger and I want to change it,
> I find the definition, change it, compile it, done. Oops, now
> I have to find the macro, change the expansion, test the expansion,
> compile the places where the macros are being used, set back to some
> sensible place and redo the computation. What was a simple
> task just to change a function becomes a complex multistep
> action with lots of possibilities to make errors. 

Right.  These can be serious problems and I avoid using macros where
these could be an issue.  I think the DEFELEMENT example above is
trivial enough that these would not be a problem.

>> I do tend to favor functions and first-class objects (closures and
>> thunks) to perform the computation, but I think using macros at the
>> top-level for `definition-like' things can greatly improve the
>> readability of the code.  Macros that expand into the equivalent
>> function calls are about the best of both worlds.
>
> No, it is worse. Now you have to maintain two argument lists with different
> argument-checking capabilities and facilities. Plus
> you have an additional expansion phase, you lose the link from
> the code in the debugger to the source code and so on.

It *could* be worse if there were significant munging of the
arguments, but in the DEFPAGE thing above the macro does nothing but
quote its first argument.  Everything else is foisted off on the
function.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8C3DB.E574%joswig@lisp.de>
Am 29.06.2005 21:26 Uhr schrieb "Joe Marshall" unter <···@ccs.neu.edu> in
············@ccs.neu.edu:

> It *could* be worse if there were significant munging of the
> arguments, but in the DEFPAGE thing above the macro does nothing but
> quote its first argument.  Everything else is foisted off on the
> function.

So, let's hope it stays that simple. ;-) And if it stays that simple, *I*
would not need the macro.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3igbenFkhp8cU1@individual.net>
Joe Marshall wrote:

>>>>(DEFELEMENT A          ()  "anchor")
>>>>
>>>>instead of
>>>>
>>>>(define-element 'a '() '"anchor")
> 
> I find the single quote in front of the string to be very odd looking,
> and I don't like the quoted `A' because I prefer to think that I'm
> binding some identifier in some context rather than associating a
> symbol with something.  For the same reason I write
> 
>  (defun foo (x) (+ x 3))
> 
> rather than
> 
>  (setf (symbol-function 'foo) (lambda (x) (+ x 3)))

This should have been

  (setf (symbol-function 'foo)
        (function (lambda (x) (+ x 3))))

because (lambda ...) is a macro when it is not embedded inside a 
(function ...) form. (This makes your case stronger.)



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uirzxaso5.fsf@news.dtpq.com>
Rainer Joswig <······@lisp.de> writes:

> Am 29.06.2005 19:50 Uhr schrieb "Joe Marshall" unter <···@ccs.neu.edu> in
> ············@ccs.neu.edu:
> 
> > Rainer Joswig <······@lisp.de> writes:
> > 
> >> What does it buy you to write
> >> 
> >> (DEFELEMENT A          ()  "anchor")
> >> 
> >> instead of
> >> 
> >> (define-element 'a '() '"anchor")
> >> 
> >> and have DEFINE-ELEMENT being a function?
> > 
> > I think the first form is just a tad clearer than the second.
> 
> That's hard to believe for me. ;-)
> 
> >  If I
> > were to give a much junior programmer the job of adding a few more
> > elements to the file (say I told him `add the P and HR elements'), I'd
> > feel a bit more comfortable with the first form because it is just
> > *so* damn obvious what to do.
> 
> I have the experience that the additional possibility of not
> having to quote makes it for the newbie actually worse.
> For a newbie it is easy to understand that a function
> needs quoted (or self-evaluating) data.

My experience is in using those kinds of forms for the
input language of a large application; it was not actually
programmers who were writing or modifying the macro calls.
They did not know word one about Lisp, and would never be 
able to figure out where to put quotes.
Using macros worked out great for this.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <vf3wvidv.fsf@comcast.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

> My experience is in using those kinds of forms for the
> input language of a large application; it was not actually
> programmers who were writing or modifying the macro calls.
> They did not know word one about Lisp, and would never be 
> able to figure out where to put quotes.
> Using macros worked out great for this.

Exactly!


-- 
~jrm
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wr7ekur56.fsf@hundertwasser.ti.uni-mannheim.de>
······@news.dtpq.com (Christopher C. Stacy) writes:

> > I have the experience that the additional possibility of not
> > having to quote makes it for the newbie actually worse.
> > For a newbie it is easy to understand that a function
> > needs quoted (or self-evaluating) data.
> 
> My experience is in using those kinds of forms for the
> input language of a large application; it was not actually
> programmers who were writing or modifying the macro calls.
> They did not know word one about Lisp, and would never be 
> able to figure out where to put quotes.
> Using macros worked out great for this.

Did it ever happen that the non-programmers did something stupid?  And
would they then be able to make sense out of the error messages they
got?  Did you have any special error-handling code within your macros to
check the user's syntax? 

(I'm asking out of curiosity: Macros are sometimes suggested to be the
golden road to writing a DSL.  I wonder how much work it is to make
this approach really robust against user errors.  The (few) macros I
wrote so far felt very cool to write and use, but they all break down
as soon as you do something wrong with them.  With "break down" I mean
you get some error message that's incomprehensible if you don't know
how the macro works internally.)
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ihto0Flf54eU1@individual.net>
Matthias wrote:

> (I'm asking out of curiosity: Macros are sometimes suggested to be the
> golden road to writing a DSL.  I wonder how much work it is to make
> this approach really robust against user errors.  The (few) macros I
> wrote so far felt very cool to write and use, but they all break down
> as soon as you do something wrong with them.  With "break down" I mean
> you get some error message that's incomprehensible if you don't know
> how the macro works internally.)

Regular functions can also break down if you pass wrong parameters. Bugs 
that arise from wrong parameters can also go unnoticed for quite some 
time until a very low level of your program detects them, and then the 
error messages are also incomprehensible. In a defensive programming 
style, you add a number of checks that test whether the client uses your 
code in the anticipated way, and either adapt or throw appropriate 
exceptions. It's a good idea to do this with macros as well. In a sense, 
it's even more convenient because you don't need to care a lot about 
efficiency of the checks. They happen at macroexpansion time anyway, 
i.e. at compile time when the code is compiled.
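A minimal sketch of such expansion-time checking, reusing the DEFELEMENT example from earlier in the thread (the option list and REGISTER-ELEMENT are hypothetical):

```lisp
(defparameter *element-options*
  '(:deprecated :loose-dtd :end-forbidden :empty))

(defmacro defelement (name options &optional documentation)
  ;; These checks run at macroexpansion time, so a malformed call is
  ;; rejected during compilation with a domain-level message instead
  ;; of a low-level error from deep inside the expansion.
  (unless (symbolp name)
    (error "DEFELEMENT: element name must be a symbol, got ~S" name))
  (let ((bad (set-difference options *element-options*)))
    (when bad
      (error "DEFELEMENT ~A: unknown option~P: ~{~S~^, ~}"
             name (length bad) bad)))
  `(register-element ',name ',options ,documentation))
```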

It is claimed that the Scheme macro systems that are based on syntax 
objects (i.e., syntax-rules and syntax-case, IIUC) make traceability of 
macro expansion bugs simpler because originating line numbers can be 
logged in the syntax objects. However, I don't have enough experience to 
comment on this.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <vf3vu1qr.fsf@comcast.net>
Pascal Costanza <··@p-cos.net> writes:

> It is claimed that the Scheme macro systems that are based on syntax
> objects (i.e., syntax-rules and syntax-case, IIUC) make traceability
> of macro expansion bugs simpler because originating line numbers can
> be logged in the syntax objects.

I think that theory has yet to be turned into practice.  It's still
pretty tricky to write hairy macros with syntax-rules and syntax-case.

-- 
~jrm
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7jgd9kch.fsf@nhplace.com>
Rainer Joswig <······@lisp.de> writes:

> In article <··············@thalassa.informatimago.com>,
>  Pascal Bourguignon <···@informatimago.com> wrote:
> 
> What does it buy you to write
> 
> (DEFELEMENT A          ()  "anchor")
> 
> instead of
> 
> (define-element 'a '() '"anchor")

Grace and elegance?

Actually, for things like this I advocate having a DECLARE-ELEMENT
underlying DEFINE-ELEMENT that allows the user to do function calling
rather than EVAL if they have a runtime need.

But seeing a lot of DEFELEMENT forms tells you immediately visually
that there are no special games going on involving runtime evaluation,
and that the entire lot of definitions is statically analyzable.
If you see DECLARE-ELEMENT used, you immediately start searching for
missing quotes and trying to understand what's done at runtime.

(This is a bit like the issue of LET vs LET*.)
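A minimal sketch of the two-layer pattern described above (DEFELEMENT,
DECLARE-ELEMENT, and the *ELEMENTS* table here are illustrative names,
not any particular library):

```lisp
(defvar *elements* (make-hash-table))

;; The functional layer: callable at run time when there is a
;; genuine run-time need.
(defun declare-element (name attributes documentation)
  (setf (gethash name *elements*)
        (list :attributes attributes :documentation documentation))
  name)

;; The definitional layer: a thin macro that only adds quoting, so a
;; file full of DEFELEMENT forms is visibly static -- no run-time
;; evaluation games are going on.
(defmacro defelement (name attributes documentation)
  `(declare-element ',name ',attributes ',documentation))

;; (defelement a () "anchor")  is exactly
;; (declare-element 'a '() "anchor")
```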
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE8A5F3.E50D%joswig@lisp.de>
Am 29.06.2005 18:51 Uhr schrieb "Kent M Pitman" unter <······@nhplace.com>
in ·············@nhplace.com:

> Rainer Joswig <······@lisp.de> writes:
> 
>> In article <··············@thalassa.informatimago.com>,
>>  Pascal Bourguignon <···@informatimago.com> wrote:
>> 
>> What does it buy you to write
>> 
>> (DEFELEMENT A          ()  "anchor")
>> 
>> instead of
>> 
>> (define-element 'a '() '"anchor")
> 
> Grace and elegance?
> 
> Actually, for things like this I advocate having a DECLARE-ELEMENT
> underlying DEFINE-ELEMENT that allows the user to do function calling
> rather than EVAL if they have a runtime need.
> 
> But seeing a lot of DEFELEMENT forms tells you immediately visually
> that there are no special games going on involving runtime evaluation,

You could do load-time or run-time evaluation.
Why would you need compile-time evaluation? And if you need compile-time
evaluation, why use macros to achieve that? Why not explicitly
add EVAL-WHEN for compile time side effects?

> and that the entire lot of definitions is statically analyzable.
> If you see DECLARE-ELEMENT used, you immediately start searching for
> missing quotes and trying to understand what's done at runtime.
> 
> (This is a bit like the issue of LET vs LET*.)
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87hdfhja9i.fsf@thalassa.informatimago.com>
Rainer Joswig <······@lisp.de> writes:

> Am 29.06.2005 18:51 Uhr schrieb "Kent M Pitman" unter <······@nhplace.com>
> in ·············@nhplace.com:
>
>> Rainer Joswig <······@lisp.de> writes:
>> 
>>> In article <··············@thalassa.informatimago.com>,
>>>  Pascal Bourguignon <···@informatimago.com> wrote:
>>> 
>>> What does it buy you to write
>>> 
>>> (DEFELEMENT A          ()  "anchor")
>>> 
>>> instead of
>>> 
>>> (define-element 'a '() '"anchor")
>> 
>> Grace and elegance?
>> 
>> Actually, for things like this I advocate having a DECLARE-ELEMENT
>> underlying DEFINE-ELEMENT that allows the user to do function calling
>> rather than EVAL if they have a runtime need.
>> 
>> But seeing a lot of DEFELEMENT forms tells you immediately visually
>> that there are no special games going on involving runtime evaluation,
>
> You could do load-time or run-time evaluation.
> Why would you need compile-time evaluation? And if you need compile-time
> evaluation, why use macros to achieve that? Why not explicitly
> add EVAL-WHEN for compile time side effects?

Of course, this is for compile-time evaluation (technically
macroexpansion time, but specifically the expansion done when
compiling).

Macros are indicated here (vs. EVAL-WHEN) because most of the
generated code will be other macros or functions; i.e., we're not just
building some data structure or doing some computation at compile
time, we're generating code to be compiled.



In another example of this aspect, I use the Lisp reader to load data
formatted as a sequence of s-exprs: (class :attribute value...).  At
first, I used a plain Lisp file with normal Lisp expressions: (let
((instance ...)) ... (add-instance *db* instance)), but that binds too
early.  Now I have a higher-level file format, and I define macros and
functions before loading the data file, which is no longer procedural
but declarative.
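A sketch of what such a declarative loader might look like (the data
format, the PERSON class, and the ADD-INSTANCE callback are all
hypothetical stand-ins for the details elided above):

```lisp
;; Sketch of a declarative data file loader.  The file holds forms
;; like (person :name "Ada" :born 1815); the loader READs each form
;; and dispatches on its head, so the file describes instances
;; without committing to how or when they are stored.
(defun load-declarations (pathname add-instance)
  "Read (class . initargs) forms from PATHNAME and hand each
resulting instance to the ADD-INSTANCE function."
  (with-open-file (in pathname)
    (loop for form = (read in nil in)
          until (eq form in)       ; the stream itself is the EOF marker
          do (destructuring-bind (class &rest initargs) form
               (funcall add-instance
                        (apply #'make-instance class initargs))))))
```

Passing ADD-INSTANCE as a function keeps the binding late: the same
data file can populate a database, a test fixture, or anything else.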


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <URXxe.2988$p%3.17881@typhoon.sonic.net>
Rainer Joswig wrote:

> If you look at the source code in the Lisp Machine, it is written
> in several dialects and also often with heavy macrology.

BTW, I would *love* to look at the Lisp Machine source code.
Can I?  If so, where?  And IIRC, the LispM crew had NLAMBDA
forms to do macro-ish things with first-class procedures.
But these are one of the big stumbling blocks when trying to
translate LispM code to a modern lisp, and have been blamed
for the stagnation/death of that code far more often than
macros.

> For me macros are violating principles like late binding. They
> are creating static structures that are no longer evolvable. They
> make too much go on behind the scenes that is non-obvious.

Hmmm.  I have a toy-lisp with procedures which can do the
stuff that CL uses macros for, because of slightly bizarre
argument passing protocols.  These calls put call frames
in the heap, support late-binding and higher-order functions,
and are as "evolvable" as any other source code... I think
they're easier to understand than macros and easier to
debug, but then I implemented them, so....  my opinion is
suspect. But they also have downsides of getting in the way
of lots of optimizations and requiring eval & friends at
runtime.

> - it introduces all kinds of side-effects that are non-reversable or where
>   there is no dependency tracking. Example: you define a frame and several
>   methods will get generated. Later you change that frame definition
>   and all kinds of no longer needed methods will NOT be removed.

??  Really?  You don't need an alternate macrology to ensure
garbage collection of dead procedures; just a collector with
soft pointers.  Once the symbol is no longer referenced by
any scope, and the procedure is no longer referenced by
any symbol, the softpointer from the symbol table to the
symbol is set null, the symbol is collected, and then the
procedure is collected.

Alternatively, if the symbol is mutated to refer to something
else, the routine becomes floating garbage anyway and ought to
be reaped, softpointers or none.

So...  I don't deny that this happens in many Lisp environments,
but I would lay the blame on the choices of implementors of
garbage collectors rather than poor choices of the language's
macrology.


				Bear
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vr4hm8.fsf@nhplace.com>
Ray Dillinger <····@sonic.net> writes:

> Rainer Joswig wrote:
> 
> > If you look at the source code in the Lisp Machine, it is written
> > in several dialects and also often with heavy macrology.
> 
> BTW, I would *love* to look at the Lisp Machine source code.
> Can I?  If so where?  And IIRC, the LispM crewe had NLAMBDA
> forms to do macro-ish things with first-class procedures.

[This is a side-note just on this one sentence.  I'm not sure yet if I have
 any comment to make on the main substance of your remarks.]

NLAMBDA was just a FEXPR form of LAMBDA.  It was part of Interlisp,
never Maclisp nor Lisp Machine Lisp. e.g.,
 (LET ((ONE 'WON) (TWO 'TOO))
  ((NLAMBDA (X Y) (LIST X Y))
   ONE TWO))
 => (ONE TWO)

The Lisp Machine had &QUOTE, so you could do FEXPR effects on a per-arg
basis.  e.g., in Symbolics Zetalisp (not sure if it was in Symbolics 
Common Lisp), I think you could write:
  (LET ((ONE 'WON) (TWO 'TOO))
    ((LAMBDA (&QUOTE X &EVAL Y) (LIST X Y))
     ONE TWO))
  => (ONE TOO)
[My LispM is down so I can't test this, but surely someone whose machine
 will correct me if I've erred.]

See my remarks in my special forms paper for further discussion of the
non-viability of fexprs.

One reason I didn't really like &QUOTE is that it seemed to put into the
arglist something that belonged with the symbol.  As an anonymous function,
there was no way for &QUOTE to really do its job.  Consider:

   (apply #'(lambda (x y) (list x y)) (list 'a 'b))        => (A B)
   (apply #'(lambda (&quote x y) (list x y)) (list 'a 'b)) => (A B)

In other words, &QUOTE doesn't really itself suppress evaluation since it
has no power to.  It instead provides advice to the evaluator that is 
consulted when (f a b) is seen and #'f happens to contain an &QUOTE.
It would have been cleaner, IMO, to do
 (defun foo (&quote x y) ...)
macroexpands to
 (progn (setf (symbol-function 'foo) #'(lambda (x y) ...))
        (setf (get 'foo 'arg-evaluation-pattern) '(nil nil))
        'foo)
The flimsy rationale for not doing this seems to me to have been that it
meant copying two pieces of information rather than one if you wanted to
do
 (setf (symbol-function 'bar) (symbol-function 'foo))
but I'm not sure that "value" was worth the weirdness in semantics.

Politically, &QUOTE was in Zetalisp because the Lispm designers have a
core design principle that said that anything they needed to implement
the language should be available to users.  CL takes an opposite point
of view and says that you sometimes need a finite number of special
forms to bootstrap, but that after that you're better off with just
macros.  So far, I've seen little to refute the CL claim.
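For a concrete sense of the CL position, here is a sketch of how a
macro reproduces the effect of the &QUOTE example above (FOO is a
made-up name; the point is the technique, not any real API):

```lisp
;; The CL answer to &QUOTE: instead of marking an argument as
;; unevaluated in a function's arglist, define a macro that inserts
;; the quoting itself.  FOO receives X unevaluated and Y evaluated,
;; mirroring the (LAMBDA (&QUOTE X &EVAL Y) ...) example.
(defmacro foo (x y)
  `(list ',x ,y))

(let ((one 'won) (two 'too))
  (foo one two))
;; => (ONE TOO)
```

The trade-off is exactly the one discussed: the macro cannot be used
as an anonymous, first-class function the way an NLAMBDA could, but
the evaluation rule is visible in the expansion rather than hidden in
advice attached to the function object.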
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3Y3ye.3007$p%3.19459@typhoon.sonic.net>
Kent M Pitman wrote:


> Politically, &QUOTE was in Zetalisp because the Lispm designers have a
> core design principle that said that anything they needed to implement
> the language should be available to users.   CL takes an opposite point
> of view and says that you sometimes need a finite number of special
> forms to bootstrap, but that after that you're better off with just
> macros.  So far, I've seen little to refute the CL claim.


Hmmm.  I think the LispM guys had a compelling point and may have
been right.  One of the fundamental issues holding CL back, as
far as I'm concerned, is its lack of downward extensibility.

You have upward extensibility to the sky with macros and HOF's
and means of abstraction:  but downward extensibility, where you
do things with arbitrary bits and bytes on hardware I/O ports
or with interrupts or native threads or the binary interfaces of
other programs, has always been a bugaboo for Common Lisp; you
have to go outside the language to do it, which puts CL (and
Scheme too) in the unenviable position of a mere scripting
language. C is rare among programming languages in that it
takes downward extensibility as a serious goal and does it well,
and C and its descendants practically rule the earth these days.
I think this is not a coincidence.

This is a design point that is in common with lots of languages
that had no way to specify such things and no escape; if you wanted
to downward extend Pascal, for example, you had the same problem
with access to machine primitives.  Lisp has survived on the
strength of its upward extensibility, even though every Lisp
hacker has to use some other language for downward extensions
and downward extensions aren't portable between implementations.
That's a heavy burden for a language to carry in terms of
acceptance and general utility, IMO.

				Bear
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u4qbaqzsa.fsf@nhplace.com>
Ray Dillinger <····@sonic.net> writes:

> Kent M Pitman wrote:
> 
> 
> > Politically, &QUOTE was in Zetalisp because the Lispm designers have a
> > core design principle that said that anything they needed to implement
> > the language should be available to users.   CL takes an opposite point
> > of view and says that you sometimes need a finite number of special
> > forms to bootstrap, but that after that you're better off with just
> > macros.  So far, I've seen little to refute the CL claim.
> 
> Hmmm.  I think the LispM guys had a compelling point and may have
> been right.  One of the fundamental issues holding CL back, as
> far as I'm concerned, is its lack of downward extensibility.

Well, I don't know.  I would say I saw a substantial number of things
where people used this technology to firmly wed what they were doing
to the LispM rather than writing abstract portable code, so I'm just
not sure I agree with you.

In general, just about every time in my career that I've written
programs that used very low-level tools specific to that
implementation, those pieces of code were dead-ended when the hardware
or operating system was obsoleted.

Now, you might be thinking that a feature like &QUOTE was not tying
you to the operating system, so maybe you think we're not in
disagreement on this.  If so, I'd agree there's a blurry line there.
But those features, for example, were tied to the compiler, and most
other compilers didn't have them.  And so by tying to the compiler,
you tied to the platform indirectly.

In the design of Common Lisp, I asked Dave Moon (one of the architects
of the LispM's "new error system" (NES), and also Symbolics' lead
technical representative to X3J13 for a long time) what his opinion
was on including the type tree for errors into CL's design.  They had
a very heavily articulated type tree. I expected him to say "yes,
sure, put it all into the CL Condition System proposal, the more the
better", but he surprised me and said to go conservatively.  He said
that experience had shown that there were lots of things that were
overspecified in the doc, and that while customers were happy to use
them, they were just writing bad code or becoming overly dependent on
details that might change.  Almost certainly we erred by not including
a few things like directory-not-found (although it was for technical
reasons that it wasn't included, not for reasons of oversight), but 
largely I think he was right that a lot of things were too low level.

Now, there are good kinds of low-level, like the way that floats are
picked apart in CL.  That shows a fair bit of abstraction that mirrors
the design space and seems to allow people to write low-level code
fairly portably.

But there are bad kinds of low-level, and that's mostly stuff that
exposes a low-level piece of a mechanism where the mechanism itself
has not been bought into.  So if all of CL doesn't agree on a compiler
model, or on a certain model of optimizers, or of type propagation,
and then you provide accessors into these facilities, you're putting
the cart before the horse sometimes by providing hooks into these.

The MOP, for example, derives its coolness not from the fact that it
exposes low-levels, but that it exposes underlying agreement.  Likewise
the pretty printer.

And there are places like LOOP where people disagree, in what I'd bet
is a useless dispute: LOOP, while complicated, is not so complicated
that we couldn't force implementations to agree _enough_ that the
ability to define loop paths could be portably achieved, without
infringing the part of LOOP that should _not_ be exposed, namely what
the right and most efficient expansion is for any given implementation
in order to benchmark well.  The more you try to pin down that latter
part, the more you just require Common Lisp to be slow.  Some parts
are meant to be kept abstract.

> You have upward extensibility to the sky with macros and HOF's
> and means of abstraction:  but downward extensibility, where you
> do things with arbitrary bits and bytes on hardware I/O ports
> or with interrupts or native threads or the binary interfaces of
> other programs, has always been a bugaboo for Common Lisp; 

IMO, this is a place where we have a fundamental disagreement.  Common
Lisp is, IMO, fundamentally not about data layout.  For sake of making
this a lively discussion, I'll even go out on a limb and allege the
extreme position that all of the Lisp Language Family (even to include
Scheme) [and I rarely make generalizations about that large a class]
is not about implementing these details.

[If anyone who regularly corresponds with John McCarthy on this wants
to poll him on whether I'm "blaspheming" here, I'd be interested in knowing.]

Some languages are about laying out data, what I call "implementation
languages".  In the abstract, I claim these are fundamentally built on
micro-managing, and this obsession with data layout seems to me to be 
the siren's call that keeps you from ever digging your way out of thinking
about the machine issues.

Lisp is a different way of thinking, not about implementation, but about
abstract capability.  It's a way of thinking about big problems.  The 
president of any huge corporation does not get there by overspecifying the
small details of each cashier's cash drawer, etc.  In fact, I think it's
assumed that there will be minor inefficiencies in the name of not making
the cost of tracking/preventing those inefficiencies larger than the lost
time they actually lead to.  

Even mathematicians understand this when they talk about big-O
notation, which effectively teaches you that you shouldn't get lost in
the constant factors or you'll never get to the grand truths.

In Lisp, you don't say "I'm going to make an array that is indirect so
that I can adjust its size later" you say "give me an adjustable array".
If there is one error in recent design of CL, it's not the failure to 
provide low-level access to these details, it's the failure to add more
tools for saying "give me a graphical image" or "give me an XML document"
just as easily.  That is, the base should be moving up where you appear
to be saying it should move down.  
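The adjustable-array point is standard Common Lisp, worth seeing
spelled out: you ask for the capability, and the implementation owns
the representation.

```lisp
;; Standard CL: request the capability (growable storage), not the
;; representation (an indirect header word, a reallocation policy...).
(let ((v (make-array 0 :adjustable t :fill-pointer 0)))
  (vector-push-extend 'first v)
  (vector-push-extend 'second v)
  v)
;; => #(FIRST SECOND)
```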

> you
> have to go outside the language to do it,

or decide not to do it at all.  Are you so sure that this is what stands
between you and the creation of a commercially successful program?
Or, if you're not in commerce, any program that will stand the test of time?

I allege, and you'll just have to find a way to prove me conclusively
wrong, that this is not what is holding you back.  It may sound good
to the naive new employee or young child to say that "lack of
flexibility" is what's keeping them from getting their homework or job
requirements done.  ("If only you let me do things my way, it would be
better...")  Once in a while, smart people _can_ do things another
way.  And once in a really rare time, smart people can _only_ do
things another way.  But mostly, smart people can figure out how to
work within a structured system, and not-so-smart people can't work
except within a structured system.  

And I'm not talking huge amounts of structure here.  Lisp is one of
the least structured languages you can find.  It allows more
flexibility of work style than most languages.  But one of the ways it
does this is to preserve a certain degree of low-level abstraction
that keeps people from sabotaging themselves.

You might even be right about particular "downward extensibility",
whatever that means, but I will not easily cede the general field.  If
you speak specifically, maybe there are particular details to be had.
But I don't think the general concept of barriers erected in that direction
is wrong.  As such, back to the original point, I don't think the LispM
philosophy of providing all that they used is good.

It, for example, led to them exposing the bit patterns of the words, the
ability to pick apart pointers, the ability to make new pointers that were
in violation of GC protocols, etc.  In effect, it allowed people to make
the GC as unreliable in the same way that pointer casting would make Java
unreliable. 

> which puts CL (and
> Scheme too)

Ah, nice to see you're making your arguments nice and broad over 
much of the language family, too. :)

> in the unenviable position of a mere scripting
> language.

Quoting myself from Slashdot in a remark that the community has
seemed to like (since parts of it show up in people's sigs from 
time to time):

  Both Scheme and Common Lisp have been and continue to be used in
  real-world applications that might surprise you. These include (but
  are certainly not limited to) applications in airline scheduling,
  commercial database manipulation, computer operating systems,
  bioengineering, web services, document format translation, and, yes,
  even space exploration. Franz, Inc. has created quite a nice page of
  Lisp success stories[1] that I think expand on this much better than I
  could in the space
  
  [1] http://www.franz.com/success/customer_apps/scheduling/nasa.php3

Personally, I doubt that all of these applications were best characterized
as "mere scripts", but I guess that's a subjective judgment on my part.
If they were scripts, they were certainly among the more useful scripts
I've heard of in my career, I'll just question your use of "mere" here.
  
[Or maybe you meant "Mir scripts" ...? Lisp has certainly been used for 
 spacecraft, but I'm not sure it's been on Mir.]

> C is rare among programming languages in that it
> takes downward extensibility as a serious goal and does it well,
  
And why would Lisp want to challenge that?  [You should really read the
entirety of my slashdot interview, since it covers this point as well.]
I don't begrudge C doing this well any more than I begrudge working in a
team of experts where other people are experts in things I am not.  It
detracts from neither them nor me when we each have a skill and can 
collaborate to make something bigger than ourselves.

C is, IMO, a second-generation assembly language, higher level than 
traditional assembly languages, but lower-level than Java, which I think
is also an assembly language.  These languages provide useful power,
but they are not what I would choose to program in at the surface level.

> and C and its descendants practically rule the earth these days.
> I think this is not a coincidence.
  
I suspect C is used today because of the accident of history that it
happened to be the implementation language for Unix at the time it 
took off.

Lisp, by contrast, has been deliberately beaten by people intending to
inflict death upon it for many years, and has survived because it contains
things that will not be denied.

In the modern civil rights movement for Gays, you sometimes hear people
say "Choice? You think I would choose this kind of abuse?"  I think 
Lisp's lot is often the same.

We don't cling to Lisp out of quaint nostalgia.  We cling to it
because the features it offers aren't available in other languages,
and we need them.  We are mindful of the fact that others don't value
them, but it makes those features all the more precious to us.

> This is a design point that is in common with lots of languages
> that had no way to specify such things and no escape; if you wanted
> to downward extend Pascal, for example, you had the same problem
> with access to machine primitives.  Lisp has survived on the
> strength of its upward extensibility, even though every Lisp
> hacker has to use some other language for downward extensions
> and downward extensions aren't portable between implementations.
> That's a heavy burden for a language to carry in terms of
> acceptance and general utility, IMO.

I don't worry at all about the Siren's Call of gaining acceptance.
Acceptance doesn't make useful applications.  Useful applications are
made by programs doing their job.  I worry about impressing people
with what I build, not how I built it.  And I don't feel nearly as
hampered about building tools in Lisp as I do in other languages.

"It's a poor craftsman who blames his tools."  That doesn't keep you
from picking a good tool, but Lisp is already enough better than the
other tools that I'm happy to pick it.  And I have strong reason to
believe that it achieves some of that betterness at the expense of
some of what you're asking for.  You never get anything for free.
You're welcome to argue otherwise, of course.  I'm just one opinion.
I'm not even saying I won't change my mind.  I'm saying you haven't
presented any evidence that would even begin to change my mind.
You're welcome to try.  (But personally, I'd rather see you build
something serious and then come back and tell us a concrete reason
that Lisp is standing in the way of your success, since in my experience
there are usually much simpler ways around THOSE problems than to change
the underlying philosophy of the language.)
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <pnsye.3169$p%3.20248@typhoon.sonic.net>
Kent M Pitman wrote:


> IMO, this is a place where we have a fundamental disagreement.  Common
> Lisp is, IMO, fundamentally not about data layout.  For sake of making
> this a lively discussion, I'll even go out on a limb and allege the
> extreme position that all of the Lisp Language Family (even to include
> Scheme) [and I rarely make generalizations about that large a class]
> is not about implementing these details.

You're absolutely right that CL and Scheme are not about the
representation of data.  I wouldn't go so far as to say "Lisp"
isn't, though.  I don't think there's a fundamental conflict
here, only a habit.  How about that famous quote about liking
LISP because it lets you feel the bits between your toes?  That
came from someone who had experienced different dialects of
Lisp, and had different opinions about whether "lisp" allowed
dealing with the representation of data.

> Some languages are about laying out data, what I call "implementation
> languages".  In the abstract, I claim these are fundamentally built on
> micro-managing, and this obsession with data layout seems to me to be 
> the siren's call that keeps you from ever digging your way out of thinking
> about the machine issues.

Yeah, well, actually running on particular machines is what we
want to do with code anyway, isn't it?  Handling machine issues
is relevant to that goal, IMO. And "implementation languages"
as you call them have good reasons to exist.

> In Lisp, you don't say "I'm going to make an array that is indirect so
> that I can adjust its size later" you say "give me an adjustable array".
> If there is one error in recent design of CL, it's not the failure to 
> provide low-level access to these details, it's the failure to add more
> tools for saying "give me a graphical image" or "give me an XML document"
> just as easily.  That is, the base should be moving up where you appear
> to be saying it should move down.  

It's nice, I think, to have a language where you can take something
unexpected -- say, a binary-format file for a new audio codec -- and
write code to do the proper things with it.  As opposed to waiting
for the language implementors to support your particular audio codec
and sound hardware, so you can just say "give me an audio-mv file."

My objection is that the universe of things out there that
you need to deal with is nearly as varied as the universe
of programs you need to write.  What you call an "implementation
language" can be enriched by libraries for dealing with things
(hardware or OS calls) that just anybody wrote; you can even
write your own if you have unusual hardware or OS capabilities.
Our currently extant Lisps, on the other hand, must wait for
the language-system implementors to get around to each and
every such thing before it appears in the nice "abstract" forms
you want.  And their attention is finite, and they can't be
expected to help you support some hardware that you're one of
very few users of.

>>you
>>have to go outside the language to do it,

> or decide not to do it at all.  Are you so sure that this is what stands
> between you and the creation of a commercially successful program?

Flash:  programming systems without the ability to call OS
services or do binary interface with other systems have
broad areas in which they cannot possibly be used. If you
don't have the ability to put up a windowed interface, read
the keyboard and mouse, and use menus, icons, scroll bars,
and tool tips, you WILL NOT be able to create certain types
of commercially successful programs.

> I allege, and you'll just have to find a way to prove me conclusively
> wrong, that this is not what is holding you back.  It may sound good
> to the naive new employee or young child to say that "lack of
> flexibility" is what's keeping them from getting their homework or job
> requirements done.  ("If only you let me do things my way, it would be
> better...")  Once in a while, smart people _can_ do things another
> way.  And once in a really rare time, smart people can _only_ do
> things another way.  But mostly, smart people can figure out how to
> work within a structured system, and not-so-smart people can't work
> except within a structured system.  

I'm gonna call bullshit.  The class of useful programs you can
make without reference to an actual machine (access to screen,
mouse input, etc) is fairly small compared to the class of
useful programs you can make with reference to an actual machine.

> It, for example, led to them exposing the bit patterns of the words, the
> ability to pick apart pointers, the ability to make new pointers that were
> in violation of GC protocols, etc.  In effect, it allowed people to make
> the GC as unreliable in the same way that pointer casting would make Java
> unreliable. 

In a language that allows you to implement GC, you can also
subvert GC.  It's a tautology.  I don't regard it as an actual
problem though; more rope is IMO more useful, even though
undisciplined or incorrect use of it permits one to hang oneself.
"Protecting" the user from himself by denying access to useful
functionality is hardly treating him as an adult, after all.

> Personally, I doubt that all of these applications were best characterized
> as "mere scripts", but I guess that's a subjective judgment on my part.
> If they were scripts, they were certainly among the more useful scripts
> I've heard of in my career, I'll just question your use of "mere" here.

To the extent that these programs deal with real-world hardware,
they are "scripts" tying together bits that are necessarily done
in other languages.  To the extent that they deal with unique
hardware, they are just control layers on top of a complete
infrastructure of code written in "implementation" languages.

				Bear
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <CIGye.3221$p%3.20752@typhoon.sonic.net>
I think I just managed to put my finger on an unacknowledged
fact that underpins this whole conversation.

There are two conflicting goals here.  The first and IMO most
fundamental is being able to write software that runs on and
takes full advantage of particular hardware and particular
OS's.  The second and if I understand you correctly IYO most
fundamental is being able to write software that does not depend
on particular hardware and particular OS's. To the extent that
a given hardware or OS provides capabilities which are unique,
and you want to take advantage of them, these are incompatible
goals.

The dangerous, grotty, binary stuff has to be done,
regardless.  If you want to take advantage of particular
hardware or a particular OS's capabilities, then one way
or the other, you have to get down in the mud with the pigs
and extend the Lisp implementation using a language that can
subvert the GC or typesafety or pick apart the pointers and
access the internal tables and read and write the typetags,
etc.

I think that Lisp is a reasonable and natural language for
doing this in; after all, creating language extensions is
in large part what Lisps are about.

You've been advocating a "Chinese wall" approach, where access
to such capabilities is performed in an "implementation"
language, allowing the Lisp programming to remain completely
abstract and generic.

Either way, good design principles require isolating the
grotty stuff at a low level and building abstractions from
"safe" building blocks;  I advocate that the isolation happen
just like any other abstraction level in program design,
and you advocate forcing the isolation by requiring it to
be done in a different source language.

I might go so far as to put the "dangerous" primitives in
a different section of the language manual, and require
some kind of "wheel" privileges to check in code that uses
them, and use extraordinary measures in source control
systems to keep them isolated, track their use, and keep
them exclusively in the hands (and heads) of a relatively
few "system wizards."  I agree that they have to change
when the grotty binary stuff underneath them changes, and
ought to be marked in the manuals as being liable to change.
I agree even that *every* use of such primitives ought to
be a decision treated with seriousness and scrutiny by a
code review process.

But I don't buy that it is a good thing that there is
useful, necessary programming that can't be done in Lisp.

				Bear
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uslysk6qf.fsf@news.dtpq.com>
Ray Dillinger <····@sonic.net> writes:

> But I don't buy that it is a good thing that there is
> useful, necessary programming that can't be done in Lisp.

You mean "in ANSI Common Lisp without the necessary extensions".

Nothing stops any implementation from providing the things
that you need to do that kind of stuff.  In the past, some
have done so.  (Actually, I think there may be some that 
do it now, but I am not familiar with them.)
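As a concrete (if small) example of such an extension, assuming SBCL's
sb-alien FFI, which is not part of ANSI CL:

```lisp
;; SBCL extension, not ANSI CL: declare a C routine and call it
;; like an ordinary Lisp function.
(sb-alien:define-alien-routine "getpid" sb-alien:int)

;; (getpid) now returns the running image's process id as an integer.
```

Other implementations spell the same capability differently (Allegro,
LispWorks, and CMUCL each have their own FFI), which is rather the
point: the capability exists, just not portably.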

I already wrote this earlier in the thread; it was ignored.
From: Wade Humeniuk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <mBJye.118807$tt5.83045@edtnps90>
Ray Dillinger wrote:

> 
> I might go so far as to put the "dangerous" primitives in
> a different section of the language manual, and require
> some kind of "wheel" privileges to check in code that uses
> them, and use extraordinary measures in source control
> systems to keep them isolated, track their use, and keep
> them exclusively in the hands (and heads) of a relatively
> few "system wizards."  

What a terrible idea.  An autocratic meritocracy.  Though Lisp
was developed under the auspices of an authoritative organization
(the DoD) I think it fell out of favour because of the nature
of the language (and the people using it).  It AND they could not
be controlled by the measures you suggest.  I feel that Lisp is
an organic language, if the above was attempted all that would
happen is the mud would squeeze through your fingers.

I really do not think a Lisp needs to be able to take full
advantage of particular hardware and a particular OS.  It is
interesting that there are mapping problems between Lisps and
platforms.  It points out that there may be deficiencies in hardware
and OS design (and designers could take some hints from the way Lisps
have been designed).  As an example, the dynamic nature of Lisp will
be needed in future systems, as components may be added and removed
in a running system (like USB devices).


Wade
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <9WMye.3241$p%3.20865@typhoon.sonic.net>
Wade Humeniuk wrote:
> Ray Dillinger wrote:
> 
>>
>> I might go so far as to put the "dangerous" primitives in
>> a different section of the language manual, and require
>> some kind of "wheel" privileges to check in code that uses
>> them, and use extraordinary measures in source control
>> systems to keep them isolated, track their use, and keep
>> them exclusively in the hands (and heads) of a relatively
>> few "system wizards."  
> 
> 
> What a terrible idea.  An autocratic meritocracy.  

Hmmm.  Not seeing the problem here.  Meritocracies are very
nice when your goal is getting stuff done.

> Though Lisp
> was developed under the auspices of an authoritative organization
> (the DoD) I think it fell out of favour because of the nature
> of the language (and the people using it).  It AND they could not
> be controlled by the measures you suggest.

Well, the measures I suggest are related to source control
tools anyway.  I do advocate having a different section
in the language manual with big red stop signs saying "Hey,
if you use this wrong you can break stuff," and I'd regard
the source control tools as valuable programs and products
in their own right.  I also think it would be a worthwhile
public service to make sure that people know such tools are
available and how to get them.  But I'm not going to say boo
about somebody who decides not to use them.  Hey.  Your
project, your rules.

And just as anybody can be root over his own system, anybody
can be wheel in their own source control.


>  I feel that Lisp is
> an organic language, if the above was attempted all that would
> happen is the mud would squeeze through your fingers.

If you attempted it without getting "buy-in" from the developers
on the project, then yeah, you're probably right.  But that would
be a dumb thing to do.

> I really do not think a Lisp needs to be able to take full
> advantage of a particular hardware and OS. 

Yah, that's the fundamental disconnect.  I think you should
be able to write device drivers and operating systems in it.

				Bear
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42DF688D.455EAA6E@freenet.de>
Ray Dillinger schrieb:
> Wade Humeniuk wrote:

Hallo

> > I really do not think a Lisp needs to be able to take full
> > advantage of a particular hardware and OS.

> Yah, that's the fundamental disconnect.  I think you should
> be able to write device drivers and operating systems in it.

So do I.  Look back in history.  The point, to me, comes down to one
question: how do I tell the CPU what I want to do?  And that without
any other dependency; the direct way is as fast as possible.  Let me
try to describe it in a foreign language (it feels like an
interpreter talking to the chip).

Look at Lisp's `while': internally it is translated into an `if' (or
something else), the `if' is written in C, and when you compile it,
it is translated again into hardware-near bytes, and when you run it,
it is translated into CPU internals, and when something comes back to
your screen the whole thing is transferred backwards.  What an
unnecessary mess.  But there are more complex things you can do ...

My assembler (MASM) was brand new in 1995 and still works fine for me
today, while the hardware has changed mightily.  When I start an
interpreter, I first need an environment to run on, and so I see that
a lot of things lead to: not portable.  Making it portable is extra
expense, although every OS has a way to reach int 10h (save to file).
The direct way ends in a Lisp-based OS.

No matter what, every language layout forces you to type something in
to get a `mov ax' with a value from somewhere.  As long as a CPU uses
registers to make things run, that will not change.  You can find this
behavior since before the Z80 chips (my first register box).  No one
seems to take a closer look at this.

There is only one optimum for getting things done with a
register-based CPU.  So why spend time on all the rest in between?
For historical reasons most of it is borrowed from Pascal (nearly
disappeared today, but its mechanics are still in use).  There is one
CPU and it calculates all kinds of things in the known digital world.
Really, I wish I could say: there is one Lisp and it computes all
kinds of thinkable digital things.

There is something called microcode that keeps the CPU going.
Developers will not publish it.  But they need a way to communicate
with the world outside their chips, and that way has not changed a
lot (compatibility).  Any interpreter has the same kind of problem:
communicating with the world outside.  That world may change wildly,
but either way, simple bits are the fundamental parts.

(give me a beep --> error 763  <-- can't do that)

So you hold point A (your Lisp) and you know point Z (your chip), and
the rest is wide open and gives you hard problems you don't want to
handle.  But when you have finished steps B through Y you can start
again, because P has changed.  Is this a kind of intellectual game
for highfliers?  No, it's the daily grind of headache-inducing
training lessons.  (I'm tired of this and switched on my 386 board.)

Let me say it like this: please make a new Lisp machine with
CPU-friendly code (direct Lisp).  You will shed portability problems,
gain more speed, and have a self-made foundation for what is to come.
Use this for everything between the interpreter and the CPU, and then
you only have to update the layer between these blocks.

stefan

still dreaming from lightwaves
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u4qb8djqj.fsf@nhplace.com>
Wade Humeniuk <··················@telus.net> writes:

> Ray Dillinger wrote:
> 
> > I might go so far as to put the "dangerous" primitives in
> > a different section of the language manual, and require
> > some kind of "wheel" privileges to check in code that uses
> > them, and use extraordinary measures in source control
> > systems to keep them isolated, track their use, and keep
> > them exclusively in the hands (and heads) of a relatively
> > few "system wizards."
> 
> What a terrible idea.  An autocratic meritocracy.  Though Lisp
> was developed under the auspices of an authoritative organization
> (the DoD) I think it fell out of favour because of the nature
> of the language (and the people using it).  It AND they could not
> be controlled by the measures you suggest.  I feel that Lisp is
> an organic language, if the above was attempted all that would
> happen is the mud would squeeze through your fingers.

Let me just observe that under free market capitalism, the model
is basically always one of "competing autocracies".  Every company
has someone that says "this is how it will be" and people either
voluntarily subscribe to that autocratic view, or they splinter off
and form their own autocracy where things work in a way that's 
different in some way that makes it possible for them and others
to subscribe to it (sometimes only different in "who's in charge").

Now I'm sure someone will chime in by saying that free software is
neither an autocracy nor a company, but they'll be missing my point,
which is really not about the organizational scheme but about the
competition.  Even under free software models, there are controls in
place as to whose contribution is accepted.  Some of the most open
models in existence, such as Wiki, periodically see problems from
vandals, forcing the owners to reveal their hands as autocrats to
restore order and to admit that there really is order.  

Even from non-vandals, I've been round and round with Stallman in
the early days, long before the GPL, when I used to contribute 
libraries to Emacs and he used to want to rewrite them to be in a
certain style.  I kept saying that if freedom really prevailed, I
could present my libraries in whatever form I wanted.  But he kept
saying, not in so many words, that it was his Emacs to release
and he wanted a sense of coherent order to it.  In my opinion (and
it's just my opinion), he tolerates a free and open model just as
long as it gets him what he wants, and ultimately tries to exert
control when he doesn't like the way others are doing things.  (I'm
not even criticizing that, except for the disingenuous way it leads
people to say that it works by no control being exerted.  It doesn't.)

So you don't think I'm just picking on free software, let me say
that even the United States Constitution, which I like a lot, was
not evolved by democratic process.  I doubt it could have been.  My
personal belief is that if you don't start with a constitution that
is pretty close to correct, it will melt down.  It was fairly well
thrust upon people as a done deal, and although people were given
the ability to make arbitrary changes, those people mostly know that
it won't work unless they leave it mostly alone.  The first day they
try to make large changes (keyword: "a new Constitutional Convention")
is the day there will be no working US Constitution, IMO.  

My point, then, is this: The right way to make such changes is not to
tweak an extant, working system, but to make a new one that works
coherently and is controlled by someone with a new, coherent theory.
(That person is the autocrat in my story.  In the case of the US,
the autocrat controlling the initial architect is Jefferson or a
small band of him and his friends.  In the case of free software,
it's RMS and maybe a few others.  But birthing a new system requires
control even if that system itself is to have distributed control.)

In the real world, it's hard to birth a new system because you need
new land, new population, new dollars.  So it's not done much.  That's
a problem.  And it's an argument for robust space exploration because 
absent a very aggressive disease or a huge shortage of food or a large
war, the only way you're going to find unused land is to go into
space.

BUT, for cyberspace, unlike the real world, new spaces are easy to
cons.  If you don't like the way some autocrat has built something,
you can be your own autocrat and build something else.  So while I
agree at one level with Wade that an autocratic meritocracy (as he
describes it) is not the way to go, I disagree with what I think is
his intended conclusion, which is that that ought not be tried or that
it's something to talk Ray out of.  My answer to all of these disputes
is "sure, go make one. see if it works."

Sure, I do have some concern that if we splinter the Lisp community,
it will lose some of its clout in the world.  But other languages
at this point are already doing that.  My bigger concern is that 
if there's a need CL isn't serving and someone can figure out how to
fix it, they should try to find a way to preserve the good in Lisp
rather than simply abandoning it.  I think there's a value to having
people work together but mostly for optimizing chunk size.  e.g.,
if someone is disenchanted with CL, they should talk to the Arc people
or the Dylan people and see what they can work out before going to
make their own Lisp.  Eventually, it's good if someone works together.

I do think the Common Lisp community should continue to be about
working within the CL framework, since the principal value of that
particular community is the large collection of agreed-upon tools.
Each of those tools could have been designed otherwise, but their
value comes not from being Right, since it's probably provable there
is no Right.  Their value comes from understanding that there need
to be Arbitrary decisions made even in the face of no Uniquely Right
choice.  The CL designers were willing to do that, and the result 
serves those who don't want to endlessly question their rightness.

Now, CL might turn out to be mired at some point in the hill-climbing
problem.  (Then again, maybe all language designs are.)  But at least,
for now, there are people satisfied with it, so I personally like the
idea of pushing ahead with it, and I suggest that others "go somewhere
else".  But I don't really mean that in a negative way.  I don't fear
other hills popping up--I think a healthy competition is good.  But
I think that competition will work best if a person with an idea takes
control of their own design space and doesn't fritter away their
energy trying to incrementally modify a space already making other
people happy.

CL was never intended to be about forcing the Lisp community to hold
back.  It was intended to be a refuge where people could park when they
had work to do and didn't want Lisp to change every day so they could 
get the occasional commercial product done.  CL tries to enable a 
substantial amount of personal experimentation, and a substantial amount
of building up, but ultimately some innovation was expected to happen
in other dialectal forums.  Those forums were supposed to have been 
freed of compatibility concerns, at least for a while until they
got their own communities, by CL creating a space for people who worried
about compatibility.

I don't know that I'd use Ray's alternate Lisp but I would not dissuade
him from trying it and attracting his own base of users.

The MOO language, for example, has the capability he describes and is
quite cool.  It was designed by Pavel Curtis, who had Lisp background
(and worked on the CL Condition System committee, for example).  His
knowledge of Lisp influenced MOO's design in many ways and there is
much to learn from it that might take Lisp in interesting directions
in the future.  In fact, if people still funded Lisp language research,
integrating some of MOO's ideas back into Lisp would be one of my
preferred choices of projects to become involved in.  (And in that
regard, I would be following my own advice to "go elsewhere" to do
that design. I would NOT try to assert that it would be good for CL to
become that.  I think it would probably take a new dialect to do it.
And I certainly wouldn't want compatibility concerns holding me back.
Then again, it might be possible to IMPLEMENT that new dialect in
CL, I'm not sure, but that's a different issue and a personal choice.
It's a strange and convoluted world.)

> I really do not think a Lisp needs to be able to take full
> advantage of a particular hardware and OS.

Neither do I.  But I make my remarks not to say "you can't do that"
just to say "I personally don't think that's what CL is trying to do"
(not that I define the meaning of CL).  My remarks about spanning the
language are primarily observational, not definitional.  I don't think
that's where Lisps have traditionally gotten their power.

> It is interesting that there are mapping problems between Lisps and
> Platforms.  It points out that there may be deficiencies in hardware
> and OS design (and could take some hints from the way Lisps have
> been designed).  As an example, the Dynamic nature of Lisp will be
> needed in future systems, as components may be added and removed in
> a running system (like USB devices).

These are good points.
From: Wade Humeniuk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <z7Wye.100361$HI.71138@edtnps84>
Kent M Pitman wrote:
> 
> BUT, for cyberspace, unlike the real world, new spaces are easy to
> cons.  If you don't like the way some autocrat has built something,
> you can be your own autocrat and build something else.  So while I
> agree at one level with Wade that an autocratic meritocracy (as he
> describes it) is not the way to go, I disagree with what I think is
> his intended conclusion, which is that that ought not be tried or that
> it's something to talk Ray out of.  My answer to all of these disputes
> is "sure, go make one. see if it works."
> 
> Sure, I do have some concern that if we splinter the Lisp community,
> it will lose some of its clout in the world.  But other languages
> at this point are already doing that.  My bigger concern is that 
> if there's a need CL isn't serving and someone can figure out how to
> fix it, they should try to find a way to preserve the good in Lisp
> rather than simply abandoning it.  I think there's a value to having
> people work together but mostly for optimizing chunk size.  e.g.,
> if someone is disenchanted with CL, they should talk to the Arc people
> or the Dylan people and see what they can work out before going to
> make their own Lisp.  Eventually, it's good if someone works together.
> 

And this is a good point.  Being able to be a good cooperator and
follower is every bit as important as being a leader.  With the
example of the US constitution, no matter how "right" it was, or no
matter how good the authority behind it, if people did not want to
follow it, its authority would fall apart.  Convincing people to give
up their freedom of action to the state is not an easy thing.  The US
works *in-some-sense* by having good followers.  It has just been my
observation from the posts here that some people using Lisp are
not good followers.

As for Ray trying out anything he wants, it's his choice.  Sure,
go make one, see if it works. :)  But some people seem not to
want to hear this; they instead want CL to change (or even go
away, or for history to somehow magically change).  And in many cases
they want to lead change without following first.  For these people:
sure, go ahead, make your changes, see if it works.  There is
nothing wrong with dangerous mud.

Wade
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7iirzms3ya.fsf@lanthane.pps.jussieu.fr>
Wade Humeniuk <··················@telus.net>:

> And this is a good point.  Being able to be a good cooperator and
> follower is every bit as important being a leader.  With the example
> of the US constitution, no matter how "right" it was, or no matter
> how good the authority behind it, if people did not want to follow
> it, its authority would fall apart.

Hmm...  I'd just like to point out that it took 200 years, one civil
war and the invasion of a state that had decided to fork to give the
Constitution of the USA its current status.

                                        Juliusz
From: Wade Humeniuk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <PDkze.141020$on1.45513@clgrps13>
Juliusz Chroboczek wrote:
> Wade Humeniuk <··················@telus.net>:
> 
> 
>>And this is a good point.  Being able to be a good cooperator and
>>follower is every bit as important being a leader.  With the example
>>of the US constitution, no matter how "right" it was, or no matter
>>how good the authority behind it, if people did not want to follow
>>it, its authority would fall apart.
> 
> 
> Hmm...  I'd just like to point out that it took 200 years, one civil
> war and the invasion of a state that had decided to fork to give the
> Constitution of the USA its current status.
> 

So I guess my statement should be amended to:  Social and political
forces have contributed to making general American society into
good followers of its constitution.  (Does that mean that groups
not willing to conform were killed off in some kind of evolutionary
process?)

Wade
From: Ulrich Hobelmann
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3j6qsfFok0q8U2@individual.net>
Juliusz Chroboczek wrote:
> Wade Humeniuk <··················@telus.net>:
> 
> 
>>And this is a good point.  Being able to be a good cooperator and
>>follower is every bit as important being a leader.  With the example
>>of the US constitution, no matter how "right" it was, or no matter
>>how good the authority behind it, if people did not want to follow
>>it, its authority would fall apart.
> 
> 
> Hmm...  I'd just like to point out that it took 200 years, one civil
> war and the invasion of a state that had decided to fork to give the
> Constitution of the USA its current status.

And there are ghettos with *very high* crime rates.  You could say 
that entire parts of cities don't give a damn about the 
constitution, or laws in general.

-- 
By claiming a patent [...], I'm saying that you are not permitted 
to use your own knowledge to further your ends. By what right?
	Roderick T. Long
From: Don Geddis
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87ackys0sw.fsf@sidious.geddis.org>
Kent M Pitman <······@nhplace.com> wrote on Wed, 06 Jul 2005:
> So you don't think I'm just picking on free software, let me say
> that even the United States Constitution, which I like a lot, was
> not evolved by democratic process.  I doubt it could have been.  My
> personal belief is that if you don't start with a constitution that
> is pretty close to correct, it will melt down.  It was fairly well
> thrust upon people as a done deal, and although people were given
> the ability to make arbitrary changes, those people mostly know that
> it won't work unless they leave it mostly alone.  The first day they
> try to make large changes (keyword: "a new Constitutional Convention")
> is the day there will be no working US Constitution, IMO.  
[...]
> In the case of the US, the autocrat controlling the initial architect is
> Jefferson or a small band of him and his friends.

This doesn't impact your main point, but I just wanted to comment on this
side story of yours.  I'm pretty sure that Jefferson wrote the Declaration
of Independence, but I think he was the US ambassador to France during the
Constitutional Convention.  Jefferson was probably the most prominent US
citizen who was _not_ involved in the Convention.

The main "architects" of the US Constitution were probably James Madison and
Ben Franklin (along with tens of others, including George Washington).  You
say that the Constitution "was not evolved by a democratic process".  Perhaps
you mean after it got written, when the states were only offered a yes/no
vote on the whole thing, but weren't allowed to make any alterations or only
approve of a subset.

However, during the Convention itself, I'm not at all clear on what the
process was for writing the document.  Presumably it was some process that
could indeed be called "democratic".

Anyway, I don't know whether you meant to refer to the Declaration of
Independence instead (which was written basically alone by Jefferson), but if
you did indeed mean the Constitution, then some of the details seem different
than you suggest.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Truth hurts.  Maybe not as much as jumping on a bicycle with a seat missing,
but it hurts.  -- Drebin, Naked Gun 2 1/2
From: Robert Uhl
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3irzmm012.fsf@4dv.net>
Don Geddis <···@geddis.org> writes:
> 
> However, during the Convention itself, I'm not at all clear on what
> the process was for writing the document.  Presumably it was some
> process that could indeed be called "democratic".

Almost certainly not, since it was the Convention, not the people
(demos) which formed it.  Really, the success of our government (insofar
as it has held true to its Founders' ideals--which is not very far at
all) is due to the fact that it was developed by an oligarchy.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
Of course, if you're writing the code to control a cruise missile, you
may not actually need an explicit loop exit.  The loop will be
terminated automatically at the appropriate moment.
                         -Programming Perl, 3rd Ed.
From: Ulrich Hobelmann
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3j6rc3Foin9gU1@individual.net>
Robert Uhl wrote:
> Don Geddis <···@geddis.org> writes:
> 
>>However, during the Convention itself, I'm not at all clear on what
>>the process was for writing the document.  Presumably it was some
>>process that could indeed be called "democratic".
> 
> 
> Almost certainly not, since it was the Convention, not the people
> (demos) which formed it.

Well, what country was EVER democratic in the real sense of the 
word?  To me that would include that representatives would not 
implement laws that the public doesn't care about / understand, 
and that they respect it when the public rejects (or rather, tries 
to reject) a certain law.

In most countries I know, the representatives and the people are 
on opposing sides, mostly due to bribery (call it lobbyism) and 
neglect of property rights and the state spending peoples' money 
for just about anything they feel like.

> Really, the success of our government (insofar
> as it has held true to its Founders' ideals--which is not very far at
> all) is due to the fact that it was developed by an oligarchy.

Most good decisions and designs come from only a few minds.  That 
doesn't preclude their being approved democratically by a 
majority or even in total consensus.

The problems I mentioned above are because often the people can 
NOT decide or veto what the representatives do, or aren't even asked.

-- 
By claiming a patent [...], I'm saying that you are not permitted 
to use your own knowledge to further your ends. By what right?
	Roderick T. Long
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ud5purksd.fsf@nhplace.com>
Don Geddis <···@geddis.org> writes:

> Kent M Pitman <······@nhplace.com> wrote on Wed, 06 Jul 2005:
> > So you don't think I'm just picking on free software, let me say
> > that even the United States Constitution, which I like a lot, was
> > not evolved by democratic process.  I doubt it could have been.  My
> > personal belief is that if you don't start with a constitution that
> > is pretty close to correct, it will melt down.  It was fairly well
> > thrust upon people as a done deal, and although people were given
> > the ability to make arbitrary changes, those people mostly know that
> > it won't work unless they leave it mostly alone.  The first day they
> > try to make large changes (keyword: "a new Constitutional Convention")
> > is the day there will be no working US Constitution, IMO.  
> [...]
> > In the case of the US, the autocrat controlling the initial architect is
> > Jefferson or a small band of him and his friends.
> 
> This doesn't impact your main point, but I just wanted to comment on this
> side story of yours.  I'm pretty sure that Jefferson wrote the Declaration
> of Independence, but I think he was the US ambassador to France during the
> Constitutional Convention.  Jefferson was probably the most prominent US
> citizen who was _not_ involved in the Convention.

Ah.  Hmmm.  I didn't realize that part.  It isn't strictly relevant to
my point, but I do appreciate the correction.

> The main "architects" of the US Constitution were probably James Madison and
> Ben Franklin (along with tens of others, including George Washington).  You
> say that the Constitution "was not evolved by a democratic process".

Yes, I meant a process involving all the people.  I didn't mean to
refer to the process of drafting it "in committee".

This comes as little surprise, given the thought that Madison put into
the Federalist Papers (which I've been listening to on audio--quite a
set of documents.  (I'm somewhat of the opinion that it would make a
worthwhile replacement for some of the Shakespeare people read in high
school.  Not that I have anything against Shakespeare, but I had to
read several of his plays in high school, and probably just 1 or 2
would have been adequate if there had been something else meaty to do
instead.  The Federalist Papers are full of flowery wording, but also
would give a nice crossover between US History and Political Theory
and English... Too many US classrooms treat the various subjects as 
isolated and not involving the others.))

> Perhaps you mean after it got written, when the states were only
> offered a yes/no vote on the whole thing, but weren't allowed to
> make any alterations or only approve of a subset.

Yes, that's the point.  Not a vote of the people it affected.  Even
companies which I've described as autocracies often have boards of
directors or other management meetings that meet in the small.  But
they get a coherent idea and dictate it to the larger body, which either
takes it as an edict not to be questioned or leaves to form its own
governing circle.  

It is the individual workerbee questioning each decision that comes down
from on top that seems inefficient and not much a property of the free
market.  The market uses the right to quit and move to another project
as the recourse that protects the worker... at least to some extent.
(Obviously, this is sometimes infeasible and that creates another set of
problems.  Just as it's sometimes problematic for someone to clone a new
dialect of Lisp just because they don't like a certain feature, even though
their real recourse if they don't like CL [or any other language/dialect]
is, in general, to do just that.)

> However, during the Convention itself, I'm not at all clear on what the
> process was for writing the document.  Presumably it was some process that
> could indeed be called "democratic".

Yes, but "in the small".  Consensus decision making is tough going and breaks
down "in the large".

One reason we vote on representatives and don't resort to strict
populism in creating laws is that if we try to do detailed
decision-making in the large, we get bad results because it's too hard
to construct ballots that contain appropriate dependencies.  You
end up wanting square pegs and square holes or round pegs and round
holes, but someone sets up the ballot to vote on pegs and holes
separately and people end up voting for square pegs and round holes.
This problem is richly discussed in the history of perceptrons, and
democracy seems a useful tool but just as hampered as perceptrons in
its ability to recognize various important patterns that some might
call "good ideas".

> Anyway, I don't know whether you meant to refer to the Declaration of
> Independence instead 

No, but I probably confused in my mind the history of the two.  I did
mean to refer to the Constitution and just got the players wrong.

> (which was written basically alone by Jefferson), but if you did
> indeed mean the Constitution, then some of the details seem
> different than you suggest.

Thanks for setting the record straight.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ur7e9ux2n.fsf@nhplace.com>
Paul Foley <···@below.invalid> (http://public.xdi.org/=pf) writes:

> On Thu, 07 Jul 2005 15:34:23 -0700, Don Geddis wrote:
> 
> > Kent M Pitman <······@nhplace.com> wrote on Wed, 06 Jul 2005:
> 
> >> In the case of the US, the autocrat controlling the initial architect is
> >> Jefferson or a small band of him and his friends.
> 
> > This doesn't impact your main point, but I just wanted to comment on this
> > side story of yours.  I'm pretty sure that Jefferson wrote the Declaration
> > of Independence, but I think he was the US ambassador to France during the
> > Constitutional Convention.  Jefferson was probably the most prominent US
> > citizen who was _not_ involved in the Convention.
> 
> My understanding is that he was sent to France to get him out the way
> so that the Constitution could be passed -- Jefferson was opposed, and
> presumably had a good shot at preventing it, had he been present.

Wow, I never heard anything like that.  But, on the other hand, it
does sound quite plausible the way you present it...

When being introduced to ANSI's committee voting rules for in-person
meetings, we were warned about how the voting dynamics would change if
someone took a bathroom or cigarette break (which could alter the
balance of power momentarily).  [Of course, later in the process, with
fewer members after more attrition, one person could easily change the
quorum state.]
From: Juliusz Chroboczek
Subject: Re: Competing autocracies [was: ILC2005: McCarthy...]
Date: 
Message-ID: <7imzoys433.fsf_-_@lanthane.pps.jussieu.fr>
Kent M Pitman <······@nhplace.com>:

> Let me just observe that under free market capitalism, the model
> is basically always one of "competing autocracies".

Right.

> Even under free software models, there are controls in place as to
> whose contribution is accepted.

Yes.  The main difference being that Free Software projects compete
for both users' and developers' mindshare, while proprietary projects
compete mostly for customers.

> In my opinion (and it's just my opinion), [a well-known Free
> Software developer] tolerates a free and open model just as long as
> it gets him what he wants, and ultimately tries to exert control
> when he doesn't like the way others are doing things.

Very true in general (let's avoid speaking about particular developers).

The difference being of course that the Free Software world has
developed a mechanism for dealing productively with disagreements
among developers -- the fork.  There are quite a few examples where a
successful fork has enabled developers that were not in power to move
a project in a direction where the original leaders would not go --
GCC and XFree86 come to mind.  There are also cases where both groups
of developers were right, in some suitable sense, and both branches of
the project are alive (FSF Emacs vs. XEmacs, CMUCL vs. SBCL).

Compare that to proprietary software, and the many proprietary
projects that died or are dying because their owners would not take
them where the users are (OS/2, Genera).

                                        Juliusz
From: Brian Downing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <yufze.143646$xm3.135474@attbi_s21>
In article <····················@typhoon.sonic.net>,
Ray Dillinger  <····@sonic.net> wrote:
> The dangerous, grotty, binary stuff has to be done,
> regardless.  If you want to take advantage of particular
> hardware or a particular OS's capabilities, then one way
> or the other, you have to get down in the mud with the pigs
> and extend the Lisp implementation using a language that can
> subvert the GC or typesafety or pick apart the pointers and
> access the internal tables and read and write the typetags,
> etc.

What, you mean like this?

CL-USER> (let* ((foo (make-array 4 
                                 :initial-contents '(16 32 64 128)
                                 :element-type '(unsigned-byte 8)))
                (bar (make-array 4 
                                 :initial-contents '(#xde #xea #xbe #xef)
                                 :element-type '(unsigned-byte 8)))
                (vector-sap (sb-sys:vector-sap foo)))
           (print bar)
           (format t "~&~X" (sb-sys:sap-ref-32 vector-sap 0))
           (setf (sb-sys:sap-ref-32 vector-sap 0) 123456789)
           (print foo)
           (setf (sb-sys:sap-ref-32 vector-sap -4) 128) ; length
           (let ((*print-base* 16)) (print foo)))

#(222 234 190 239) 
10204080
#(7 91 205 21) 
#(7 5B CD 15 0 0 0 0 0 0 0 10 8 0 0 B 0 0 0 96 0 0 0 10 DE EA BE EF 0 0 0 0) 
                                        here's bar -----^

I think most self-hosting lisps have similar abilities.

-bcd
-- 
*** Brian Downing <bdowning at lavos dot net> 
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7ir7eas650.fsf@lanthane.pps.jussieu.fr>
Brian Downing <·············@lavos.net> writes:

> CL-USER> (let* ((foo (make-array 4 
>                                  :initial-contents '(16 32 64 128)
>                                  :element-type '(unsigned-byte 8)))
>                 (bar (make-array 4 
>                                  :initial-contents '(#xde #xea #xbe #xef)
>                                  :element-type '(unsigned-byte 8)))
>                 (vector-sap (sb-sys:vector-sap foo)))
>            [...]

You really need to wrap the lot within SYS:WITHOUT-GCING, as the GC
doesn't move SAPs for you.

* (setf /a (make-array 4 :element-type '(unsigned-byte 8))) 
#(0 0 0 0)
* (setf /b (sys:vector-sap /a))
#.(SYSTEM:INT-SAP #x58025098)
* (cons 1 2)
(1 . 2)
* (ext:gc :full t)
NIL
* (sys:vector-sap /a)
#.(SYSTEM:INT-SAP #x5800C8A8)
* /b
#.(SYSTEM:INT-SAP #x58025098)
*
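[For reference, a GC-safe variant of the earlier snippet might look like
this under SBCL -- a sketch only, untested, using the SBCL-internal
SB-SYS:WITHOUT-GCING to inhibit collection for the dynamic extent of the
raw accesses:]

;; Sketch: keep the GC from running while a SAP into FOO is live.
;; SB-SYS:VECTOR-SAP and SB-SYS:SAP-REF-32 are SBCL internals and may
;; differ across versions.
(let ((foo (make-array 4 :element-type '(unsigned-byte 8)
                         :initial-contents '(16 32 64 128))))
  (sb-sys:without-gcing
    (let ((sap (sb-sys:vector-sap foo)))
      (setf (sb-sys:sap-ref-32 sap 0) 123456789)))
  foo)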



                                        Juliusz
From: Brian Downing
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <Ywgze.143719$xm3.61843@attbi_s21>
In article <··············@lanthane.pps.jussieu.fr>,
Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
> You really need to wrap the lot within SYS:WITHOUT-GCING, as the GC
> doesn't move SAPs for you.

Well, yes.  It was just a quick hack to prove a point, and the
grandparent poster did ask for "dangerous."  :-)

-bcd
-- 
*** Brian Downing <bdowning at lavos dot net> 
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ubqye.3167$p%3.19798@typhoon.sonic.net>
Christopher C. Stacy wrote:
> Ray Dillinger <····@sonic.net> writes:
> 
> 
>>C is rare among programming languages in that it takes downward
>>extensibility as a serious goal and does it well, and C and its
>>descendants practically rule the earth these days.  
>>I think this is not a coincidence.
> 
> 
> Contrary to popular belief, the C language does not provide a
> way to access low-levels (such as memory locations or registers) 
> of the machine that executes the program.  The ISO standard is
> quite clear on the fact that C program semantics are defined
> in terms of abstract execution environment (eg. "in which issues
> of optimization are irrelevant", 5.1.2.3).

sigh.  I'm not talking about the C spec, I'm talking
about C.

In the real world, C programmers can write subsections
of their programs in machine code and that machine code
will usually run fine even if the C code is compiled
by a different compiler.  It is outside the scope of
the *official* standard, but the data layouts of
structs, etc, are so strongly constrained by tradition
and expectation that they are an *effective* standard.

Hell, the tradition is *so* strong that *binary*
interfaces to OS calls are usually specified these
days in terms of C structs, and the implementors of
code to call those functions from *other languages*
all know exactly what data layout those C structs
refer to and can implement calls to them from other
language systems.
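[As an illustration of that effective standard: Lisp foreign-function
interfaces lean on exactly those de-facto C layout conventions.  A
sketch using the CFFI library -- field types here are an assumption,
not taken from any particular OS's headers:]

;; Declare a C struct layout from Lisp; the whole point is that the
;; layout CFFI computes matches what a C compiler would produce.
(cffi:defcstruct timeval
  (tv-sec  :long)
  (tv-usec :long))

;; Allocate one on the foreign heap, write a slot, read it back.
(cffi:with-foreign-object (tv '(:struct timeval))
  (setf (cffi:foreign-slot-value tv '(:struct timeval) 'tv-sec) 42)
  (cffi:foreign-slot-value tv '(:struct timeval) 'tv-sec))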

I'm not going to quibble about what's in C's spec.
The spec is (very clearly in this case) not the
language.

				Bear
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3j1ikiFntleeU1@individual.net>
Ray Dillinger wrote:
> Christopher C. Stacy wrote:
> 
>> Ray Dillinger <····@sonic.net> writes:
>>
>>> C is rare among programming languages in that it takes downward
>>> extensibility as a serious goal and does it well, and C and its
>>> descendants practically rule the earth these days.  I think this is 
>>> not a coincidence.
>>
>> Contrary to popular belief, the C language does not provide a
>> way to access low-levels (such as memory locations or registers) of 
>> the machine that executes the program.  The ISO standard is
>> quite clear on the fact that C program semantics are defined
>> in terms of abstract execution environment (eg. "in which issues
>> of optimization are irrelevant", 5.1.2.3).
> 
> sigh.  I'm not talking about the C spec, I'm talking
> about C.

So you're effectively comparing C / some implementation against ANSI 
Common Lisp / a spec. You should compare specs against specs and 
implementations against implementations...


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <Jm%ye.3317$p%3.21124@typhoon.sonic.net>
Pascal Costanza wrote:
> Ray Dillinger wrote:

>> sigh.  I'm not talking about the C spec, I'm talking
>> about C.
> 
> 
> So you're effectively comparing C / some implementation against ANSI 
> Common Lisp / a spec. You should compare specs against specs and 
> implementations against implementations...

I'm comparing areas where C has achieved useful and exploitable
uniformity in implementations with areas where Lisp has achieved
useful and exploitable uniformity in implementations.

Some of those areas are dictated by the specs of the respective
languages, and some not; I don't really care which is which.

				Bear
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <871x6eyffn.fsf@plato.moon.paoloamoroso.it>
Ray Dillinger <····@sonic.net> writes:

> BTW, I would *love* to look at the Lisp Machine source code.
> Can I?  If so where?  And IIRC, the LispM crewe had NLAMBDA

Have a look at this site:

  http://sdf.lonestar.org/

The bottom of the page says:

  SDF uses DEC (hp) Alphas running NetBSD, TOPS-20 and Symbolics GENERA

If this means they have a LispM accessible online, and they offer
accounts on it, then it should be possible to see some source code,
provided it is also kept online.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7jgk9rg2.fsf@news.dtpq.com>
Pascal Costanza <··@p-cos.net> writes:
> In that regard, standards and standardization hurt.

What kind of bold iconoclastic geniuses are these, 
who would be conceiving and bringing to life their
great new ideas for programming languages, but who 
will now never do anything, merely because someone 
pointed out that Common Lisp already exists?
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i1pikFj8ditU1@individual.net>
Christopher C. Stacy wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>In that regard, standards and standardization hurt.
> 
> What kind of bold iconoclastic geniuses are these, 
> who would be conceiving and bringing to life their
> great new ideas for programming languages, but who 
> will now never do anything, merely because someone 
> pointed out that Common Lisp already exists?

That's not what I said.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u1x6sb95c.fsf@news.dtpq.com>
"Sashank Varma" <············@yahoo.com> writes:
> (1) Has progress in Lisp slowed dramatically since CLtL1?
> (2) Did CLtL1 *cause* this slowdown?

I think we need a "better" basis for a useful
discussion about this.  By which I mean:
Define "progress" and "slowdown".

And do people think that those are mutually exclusive?
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119593976.117783.258390@g43g2000cwa.googlegroups.com>
Christopher C. Stacy wrote:

> "Sashank Varma" <············@yahoo.com> writes:
> > (1) Has progress in Lisp slowed dramatically since CLtL1?
> > (2) Did CLtL1 *cause* this slowdown?
>
> I think we need  "better" basis for a useful
> discussion about this,  By which I mean:
> Define "progress" and "slowdown".
>
> And do people think that those are mutually exclusive?

I'm not sure what you're getting at. Progress can slow down in the
sense of decelerating, i.e., the second derivative is negative. There's
no doubt there's been progress over the last 21 years (e.g., the MOP).
Question (1) is about whether Lisp has declined from being a fecund
source of programming language innovations, as it was in the 1960s and
1970s.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uk6kk9s48.fsf@news.dtpq.com>
"Sashank Varma" <············@yahoo.com> writes:

> Christopher C. Stacy wrote:
> 
> > "Sashank Varma" <············@yahoo.com> writes:
> > > (1) Has progress in Lisp slowed dramatically since CLtL1?
> > > (2) Did CLtL1 *cause* this slowdown?
> >
> > I think we need  "better" basis for a useful
> > discussion about this,  By which I mean:
> > Define "progress" and "slowdown".
> >
> > And do people think that those are mutually exclusive?
> 

> I'm not sure what you're getting at. Progress can slow 
> down in the sense of decelerating, i.e., the second
> derivative is negative.

I'm not sure what you're getting at.
What is the definition of "progress"?
New features?
Is that desirable?

Is the complaint that there have been insufficient new features 
or new languages and that the fault lies in having developed 
a language which already suited the needs of its users?

Besides, I don't see anyone stopping anyone else from 
inventing their own versions of Lisp.

Also, one can play all kinds of guessing games about imaginary
alternate universe histories, so I'm not even sure what the 
point of the discussion is.

> Question (1) is about whether Lisp has declined from
> being a fecund source of programming language
> innovations, as it was in the 1960s and 1970s.

Seems to me that most of the world hasn't even 
caught up to the Lisp ideas from two decades ago.

Also seems to me that people have taken some ideas
from other languages and applied them to Lisp.
(For example, in the last few weeks I saw messages
here from people taking ideas from functional
programming and from distributed programming
languages and trying to make dialects of Lisp.)

I can't imagine what the point is of whether Lisp 
has "declined", anyway.   Perhaps nobody has had
any super great ideas lately.   That's somehow
the fault of people who like to use Common Lisp?

If someone has their own great ideas and wants to stop
working on Common Lisp and go invent their own language, 
they should feel free.  It's not like if they come into
close proximity with Lisp people that things are going
dangerously Arc.
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119599498.231951.125310@o13g2000cwo.googlegroups.com>
Christopher C. Stacy wrote:

> I'm not sure what you're getting at.
> What is the definition of "progress"?
> New features?
> Is that desirable?

Not in and of itself.

I don't mean "new" features in the sense of "standardizing things that
are not yet standardized in Common Lisp but that are standardized in
other languages," like threads. I actually don't care about this stuff.
I'm happy to use the extensions my particular implementation provides to
interface to mundane services outside of Lisp.

> Besides, I don't see anyone stopping anyone else from
> inventing their own versions of Lisp.

Agreed.

> Seems to me that most of the world haven't even
> caught up to the Lisp ideas from two decades ago.

Agreed. But has Common Lisp the language progressed at all since the
ANS appeared?

I guess this is the real pea under my mattress. Lisp was for the first
three decades of its life a fecund source of new ideas about
programming languages. During this time, the language splintered into
different dialects -- MacLisp, Interlisp, Scheme, Lisp Machine Lisp,
etc. -- but this did not slow the pace of innovation. In fact, it had
the opposite effect.

The problem with many sufficiently different dialects is that porting
code is difficult, so Common Lisp came into being. Soon thereafter,
innovation slowed. Sure, things like Connection Machine Lisp and the
MOP came into existence, but CLtL2 and the ANS seem to have been about
codifying ideas that already existed in one form or another by the
mid-1980s, but did not make it into CLtL1 for political or pragmatic
reasons.

So the question is: Why is Lisp producing fewer programming language
innovations now than it was 25 years ago? One "explanation" is
standardization -- this is what Henry Baker and John McCarthy seemed to
have been saying at the 2005 ILC. However, I think this is just a
coincidence of history. As you say, the ANS does not stop someone from
inventing a new Lisp variant.

> I can't imagine what the point is of whether Lisp
> has "declined", anyway.   Perhaps nobody has had
> any super great ideas lately.

This is my great fear. In the Dynamic Language Wizards panel discussion
of a year ago, someone lamented that the current ACM model computer
science curriculum devotes only 5 hours to the study of programming
language design. Guy Steele said maybe this wasn't a bad thing, that
maybe computer science had passed beyond the time when new programming
languages had to be invented on a weekly basis. Maybe we really are at
some kind of language design plateau.

Another possibility is that some other force is holding us back. As
Lispers, the invention of new programming language constructs is our
particular forte. So what's stopping us? Not standardization. What
then?

===

Here is another possible explanation for what I and some others feel: I
was talking with a friend recently while visiting Pittsburgh. She's a
Zen type. I told her that my visit was disorienting because normally I
feel nostalgic when I return to the place where I did my undergraduate
work, but this time was different. She told me that nostalgia is a
distortion, a way of hiding from one's current reality in the myth of
the past. The laments one hears about the slowdown in Lisp's progress
might be just this -- nostalgia.

What's weird is that they are rarely spoken by those who actually lived
through the halcyon days of MIT in the 1970s, Symbolics in the early
1980s, etc. More frequently, they are uttered by (relative) newbies
like myself -- I discovered the language in the spring of 1988 -- who
just missed the golden age of Lisp. It is a second-hand nostalgia we
feel.

Hmmm.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <863br81248.fsf@drjekyll.mkbuelow.net>
"Sashank Varma" <············@yahoo.com> writes:

>This is my great fear. In the Dynamic Language Wizards panel discussion
>of a year ago, someone lamented that the current ACM model computer
>science curriculum devotes on 5 hours to the study of programming
>language design. Guy Steele said maybe this wasn't a bad thing, that
>maybe computer science had passed beyond the time when new programming
>languages had to be invented on a weekly basis. Maybe we really are at
>some kind of language design plateau.

Plus, programming language research has moved on to languages more
"fruitful" in theoretical results, i.e., functional languages like ML
and Haskell, with strong type systems, where there are quite a few
open problems still. Perhaps Lisp simply has become too boring for
research with no really interesting problems left? Writing the nth
object system, compiler, or garbage collector or what have you isn't
going to cut it anymore. That's old, well understood stuff by now,
that is, it has moved to mainstream. And you apparently cannot
mathematically formalize a dynamic type system as well as a strict,
inferential one, and this alone would make it quite unattractive for
use in research papers.

mkb.
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m24qbnhk6b.fsf@gigamonkeys.com>
Matthias Buelow <···@incubus.de> writes:

> "Sashank Varma" <············@yahoo.com> writes:
>
>>This is my great fear. In the Dynamic Language Wizards panel
>>discussion of a year ago, someone lamented that the current ACM
>>model computer science curriculum devotes on 5 hours to the study of
>>programming language design. Guy Steele said maybe this wasn't a bad
>>thing, that maybe computer science had passed beyond the time when
>>new programming languages had to be invented on a weekly
>>basis. Maybe we really are at some kind of language design plateau.
>
> Plus, programming language research has moved on to languages more
> "fruitful" in theoretical results, i.e., functional languages like
> ML and Haskell, with strong type systems, where there are quite a
> few open problems still. Perhaps Lisp simply has become too boring
> for research with no really interesting problems left? Writing the
> nth object system, compiler, or garbage collector or what have you
> isn't going to cut it anymore. That's old, well understood stuff by
> now, that is, it has moved to mainstream. And you apparently cannot
> mathematically formalize a dynamic type system as well as a strict,
> inferential one, and this alone would make it quite unattractive for
> use in research papers.

Well, there are other avenues of research. Much of what Baker talked
about at the just-completed ILC were things that would fall in the
domain of programming language user interface. Which *is* an area
where Lisp has historically been both strong and a source of
innovation. Indeed, one of Baker's big points was that when the Lisp
community came to the fork in the road between Interlisp (with its
structure editor) and Maclisp (with programs ultimately stored as text
in files), they made a mistake by taking the Maclisp fork. I don't
know if one can get a PhD in computer science for coming up with a better
tool for programmers (as opposed to a better mathematical basis for
proving things about computation) but I do agree with Baker that there
are still improvements to be made in how programmers interact with
their programs.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <39991889.fj3PZjOk38@yahoo.com>
Peter Seibel wrote:

> Interlisp (with it's
> structure editor)

What's that structure editor? I'd like to see some screenshots, or play with
it.
From: Fred Gilham
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7wtoj78un.fsf@snapdragon.csl.sri.com>
alex goldman <·····@spamm.er> writes:

> Peter Seibel wrote:
> 
> > Interlisp (with it's
> > structure editor)
> 
> What's that structure editor? I'd like to see some screenshots, or
> play with it.

Look for the LFG system, which runs using the ENVOS Interlisp
implementation.  Here's the link to their page.  Look for the LFG
software on this page.

http://www2.parc.com/istl/groups/nltt/medley/

Once you get this running, get an Interlisp lisp listener and type
(SEDIT:SEDIT).  At this point you will become completely lost.  This
means you did everything right so far.  You will be unable to do
anything until you find some documentation.  I'm sure there's some out
there somewhere.

-- 
Fred Gilham                                        ······@csl.sri.com
The PTPL (People's Trotskyist Programming League) believes that
hackers are elitist and that all software should be created by the
masses flailing away at millions of keyboards.  I didn't have the
heart to tell them that this has already been tried and the result is
called Linux.
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005jun29-003@Yahoo.Com>
> From: alex goldman <·····@spamm.er>
> > Interlisp (with it's
> > structure editor)
> What's that structure editor?

I'll explain it to you. First a digression: My first exposure to Lisp
was when I saw Larry Masinter sitting in the Stanford A.I. lab dialed
out to a computer at BBN running InterLisp. He showed me how the system
worked, including the editor, the DWIM feature, etc. I had no way to
get an account at BBN, so I started using cruddy Stanford Lisp 1.6
until UCI-LISP came along which was basically BBN-Lisp (InterLisp)
editor grafted into Stanford Lisp 1.6.

The basic idea is that you have an interactive Lisp environment, with
some files of Lisp source loaded, mostly DEFUNs and SETQs. At any time
you can type in raw text as s-exprssions to define new functions or set
new values, or you can edit something you've already entered. You pull
up the DEFUN or the value cell of a variable, into the editor. It shows
you the print form, with anything too deep or too long replaced by # in
a structured way. For example, you might see:
(defun count-isolated-hyphens (str)
 (prog (ix0 cnt ix1) (setq ix0 0) (setq cnt 0) lp
   (setq ix1 #) (cond (ix1 # # #)) (return cnt)))
(Sorry, I never had access to BBN/InterLisp, and I don't have any
UCI-Lisp code around, so I copied a function from one of my CommonLisp
programs to illustrate the BBN-Lisp editor, I hope it's OK.)
So anyway, initially you're looking at the top level of the DEFUN
there, but you can move into any particular part you want to edit,
using a text command. For example, something like D 4 means go down
into the 4th sub-expression (numbered from 1 instead of 0, so 0 is a
special escape for something I forget). So let's say you did that, now
you'd see:
(prog (ix0 cnt ix1) (setq ix0 0) (setq cnt 0) lp
 (setq ix1 (search " - " str :start2 ix0))
 (cond (ix1 (incf cnt) (setq ix0 (+ ix1 3)) (go lp))) (return cnt))
Now you can see the whole thing because it fits within the limits that
were set up. Now suppose you want to move the third sub-expression to
before the second, i.e. swap them. So you say something like M 3 B 2
and it now shows:
(prog (setq ix0 0) (ix0 cnt ix1) (setq cnt 0) lp
 (setq ix1 (search " - " str :start2 ix0))
 (cond (ix1 (incf cnt) (setq ix0 (+ ix1 3)) (go lp))) (return cnt))
Now here's where the fun begins: You have commands to "move
parentheses", which really means spreading or unspreading some group of
sub-expressions (in the sense that APPLY is spread but EVAL is
nospread, understand)? So you give a command to insert or delete a
parens before or after some sub-expression, and it does the same thing
as if you were in EMACS and you deleted or inserted a parens character
then immediately ran to the other end of the sub-expression to delete
or insert the matching parens. Another thing you can do is mark a
particular sub-expression or range of sub-expressions, cut it out of
the actual program structure and put it elsewhere in a special
register, which you can then splice in elsewhere later after you've
moved your viewpoint around. You also have powerful commands to
recursively enter the editor on some sub-expression, etc.
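[The "M 3 B 2" move described above is easy to mimic in portable Common
Lisp; a toy sketch -- hypothetical function name, nothing like the real
editor's implementation:]

;; Non-destructively relocate the FROM-th sub-expression of FORM to
;; just before the TO-th (both 1-based, as in the editor commands above).
(defun move-subexpr (form from to)
  (let* ((item (nth (1- from) form))
         (rest (append (subseq form 0 (1- from))
                       (subseq form from))))
    (append (subseq rest 0 (1- to))
            (list item)
            (subseq rest (1- to)))))

;; (move-subexpr '(prog (ix0 cnt ix1) (setq ix0 0) lp) 3 2)
;; => (PROG (SETQ IX0 0) (IX0 CNT IX1) LP)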

Now that you have the basic idea, with all the actual command names
screwed up but what do you expect appx. 27 years after I last used
UCI-LISP, let's see if we can find a UCI-LISP manual online. It has a
really nice section fully documenting how to use the editor. ... Thanks
to Google, which is fully compatible with lynx, I found it online!! I
mis-remembered the abbreviation character, it's & instead of #, # is
used for printing circular lists I now seem to recall, but here's the
document you want to read, and backtrace up to Google search I did:
  5. PDP-10 Archive: 43,50322/edit.doc from decuslib10-04
     http://pdp-10.trailing-edge.com/decuslib10-04/01/43,50322/edit.doc.html
  4. DECUS 10-LIB-4 Contains 10-210 through 10-241, except 10-223.
     http://pdp-10.trailing-edge.com/decuslib10-04/index.html
  3. History of LISP         --         Software Collection Committee
     http://community.computerhistory.org/scc/projects/LISP/
  2. Google Search: uci-lisp manual
     http://www.google.com/search?hl=en&ie=ISO-8859-1&q=uci-lisp+manual&btnG=Google+Search
Item 5 is the UCI-LISP editor manual, item 4 is the entire contents of
the backup tape including source code and documentation for UCI-LISP,
item 3 is a project trying to trace the history of Lisp forward from
MIT Lisp 1.5. Enjoy!
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <2794648.LyTmuMMyIH@yahoo.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:

> Now here's where the fun begins: You have commands to "move
> parentheses", which really means spreading or unspreading some group of
> sub-expressions (in the sense that APPLY is spread but EVAL is
> nospread, understand)? So you give a command to insert or delete a
> parens before or after some sub-expression, and it does the same thing
> as if you were in EMACS and you deleted or insterted a parens character
> then immediately ran to the other end of the sub-expression to delete
> or insert the matching parens. Another thing you can do is mark a
> particular sub-expression or range of sub-expressions, cut it out of
> the actual program structure and put it elsewhere in a special
> register, which you can then splice in elsewhere later after you've
> moved your viewpoint around. You also have powerful commands to
> recursively enter the editor on some sub-expression, etc.

Doesn't Emacs/Slime already let you do these things?
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7i3br1gdr0.fsf@lanthane.pps.jussieu.fr>
>> What's that structure editor?

You can get a ``free for non-commercial usage'' copy of Medley (the
Interlisp environment for stock hardware) for Linux/x86 and
Solaris/Sparc from

  http://www2.parc.com/istl/groups/nltt/medley/

Make sure to check the license before playing with it.

After starting the environment as described in the README, left-click
on the LFG icon and choose ``kill LFG windows''.  Then right-click on
the background, and choose an entry in the ``EXEC'' submenu.

Remember that Interlisp is case-sensitive and tends to prefer
upper-case: CapsLock is your friend.  Type (EXIT) (or (IL:EXIT) from
Common Lisp) to quit the environment.

                                        Juliusz
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uekalymt2.fsf@nhplace.com>
Juliusz Chroboczek <···@pps.jussieu.fr> writes:

> >> What's that structure editor?
> 
> You can get a ``free for non-commercial usage'' copy of Medley (the
> Interlisp environment for stock hardware) for Linux/x86 and
> Solaris/Sparc from [...]

Or get a PDP10 emulator, enter Maclisp, and use the "kludgey binford editor".
Heh...
From: Juliusz Chroboczek
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <7ivf3xexq4.fsf@lanthane.pps.jussieu.fr>
Kent M Pitman <······@nhplace.com>:

> Or get a PDP10 emulator, enter Maclisp, and use the "kludgey binford
> editor".  Heh...

I'll do that as soon as you put a bootable image on the web.

(Please?)

                                        Juliusz
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vwbxe0.fsf@news.dtpq.com>
Juliusz Chroboczek <···@pps.jussieu.fr> writes:

> Kent M Pitman <······@nhplace.com>:
> 
> > Or get a PDP10 emulator, enter Maclisp, and use the "kludgey binford
> > editor".  Heh...
> 
> I'll do that as soon as you put a bootable image on the web.
> 
> (Please?)

Bootable image of what?
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86fyv0sqar.fsf@drjekyll.mkbuelow.net>
······@news.dtpq.com (Christopher C. Stacy) writes:

>> > Or get a PDP10 emulator, enter Maclisp, and use the "kludgey binford
>> > editor".  Heh...
>> 
>> I'll do that as soon as you put a bootable image on the web.
>> 
>> (Please?)
>
>Bootable image of what?

Hmm, what OS did Maclisp run on? TOPS-10/20? ITS? And what pdp-10
emulator is available/recommended, if any? I have used `sim' in the
past, to emulate a pdp-11 to run Unix v5..v7 on it for fun, which
worked pretty well, but it doesn't seem to be able to emulate the -10.
I'd be interested in seeing some early Lisp in its "native"
environment, in an emulation if possible...

mkb.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u1x6kbv4u.fsf@news.dtpq.com>
Matthias Buelow <···@incubus.de> writes:

> ······@news.dtpq.com (Christopher C. Stacy) writes:
> 
> >> > Or get a PDP10 emulator, enter Maclisp, and use the "kludgey binford
> >> > editor".  Heh...
> >> 
> >> I'll do that as soon as you put a bootable image on the web.
> >> 
> >> (Please?)
> >
> >Bootable image of what?
> 
> Hmm, what OS did Maclisp run on? TOPS-10/20? ITS? And what pdp-10
> emulator is available/recommended, if any? I have used `sim' in the
> past, to emulate a pdp-11 to run Unix v5..v7 on it for fun, which
> worked pretty well, but it doesn't seem to be able to emulate the -10.
> I'd be interested in seeing some early Lisp in its "native"
> environment, in an emulation if possible...

See these places: 
  http://www.its.os.org/
  http://www.cosmic.com/u/mirian/its/
  http://victor.se/bjorn/its/

(You could also run TOPS-20 on the same emulator.
MACLISP was ported to TOPS-20.
I don't know what pre-packaged TOPS-20 systems
have MACLISP on them, though.)
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <DNSdnYFYFJg6IVnfRVn-vQ@speakeasy.net>
Christopher C. Stacy <······@news.dtpq.com> wrote:
+---------------
| See these places: 
|   http://www.its.os.org/
|   http://www.cosmic.com/u/mirian/its/
|   http://victor.se/bjorn/its/
| 
| (You could also run TOPS-20 on the same emulator.
| MACLISP was ported to TOPS-20.
| I don't know what pre-packaged TOPS-20 systems
| have MACLISP on them, though.)
+---------------

Also look in the archives of "alt.sys.pdp10". There is a quite active
PDP-10 hobbyist(?) community, and several very good PDP-10 emulators
now, which will boot & run TOPS-10 7.04 [I think] and some version of
TOPS-20, and ITS as well [I think]. There are emulators for (at least)
the KA-10 [I think], KL-10B, and KS-10:

    http://www.columbia.edu/acis/history/pdp10.html
    [Pretty pictures and lots more linkies...]

    http://klh10.trailing-edge.com/
    * The KLH10 Distribution:  klh10-2.0a.tgz
    * The KLH10 Auxiliary Distribution: klh10-2.0a-aux.tgz
    * Starter KL TOPS-10 7.04 filesystem (TWONKY): twonky-a11120.tar
    * Starter ITS filesystem (PI-ITS): pi-its-a11110.tar 

    http://pdp-10.trailing-edge.com/
    PDP-10 software archive
    [Includes some TOPS-10 and TOPS-20 software distribution tapes
    and layered products under the DEC "36-bit hobbyist license".]

    http://www.sourceforge.net/projects/ts10/
    TS10 is now a multi-system emulator that includes three
    working emulators: PDP-10 (KS10 and KL10), PDP-11 (KDF11 and
    KDJ11), and VAX (MicroVAX II and VAXserver 3900).

    http://www.inwap.com/pdp10/
    http://www.inwap.com/pdp10/emulators/
    Contents:
    * TS10 from Timothy Stark (updated 1-May-2001)
    * SIMH [Bob Supnik]
    * E10
    * dzclient
      Daniel Seagraves added a DZ11 module for Tim Stark's TS10 emulator
      that allows multiple users to telnet into the virtual PDP-10.

    http://www.freshports.org/emulators/its/
    Bootable ITS filesystem for KLH-10 PDP-10 emulator


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <psu9kihe.fsf@comcast.net>
Peter Seibel <·····@gigamonkeys.com> writes:

> Indeed, one of Baker's big points was that when the Lisp
> community came to the fork in the road between Interlisp (with it's
> structure editor) and Maclisp (with programs ultimately stored as text
> in files), they made a mistake by taking the Maclisp fork. 

Of course Baker is wrong about this.

But he got the other points right.

-- 
~jrm
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2zmtderl8.fsf@gigamonkeys.com>
Joe Marshall <·············@comcast.net> writes:

> Peter Seibel <·····@gigamonkeys.com> writes:
>
>> Indeed, one of Baker's big points was that when the Lisp
>> community came to the fork in the road between Interlisp (with it's
>> structure editor) and Maclisp (with programs ultimately stored as text
>> in files), they made a mistake by taking the Maclisp fork. 
>
> Of course Baker is wrong about this.

Why?

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <wtohhyx3.fsf@comcast.net>
Peter Seibel <·····@gigamonkeys.com> writes:

> Joe Marshall <·············@comcast.net> writes:
>
>> Peter Seibel <·····@gigamonkeys.com> writes:
>>
>>> Indeed, one of Baker's big points was that when the Lisp
>>> community came to the fork in the road between Interlisp (with it's
>>> structure editor) and Maclisp (with programs ultimately stored as text
>>> in files), they made a mistake by taking the Maclisp fork. 
>>
>> Of course Baker is wrong about this.
>
> Why?

Well, I'm being a bit facetious, but structure editing is hard to use.
The problem is that you *do* want to play with fragments of code that
are not syntactically correct.  Obviously, you need to get them into a
correct form in order to run them, but when you are moving code
around, swapping arguments, factoring out subexpressions, etc. you
have a syntactic mess.  I claim that the intermediate mess is a good
thing because it is easier to make a mess and clean it up than to
remain clean from start to finish.

On the other hand, I think that it would have been a better idea to
store the structure in files rather than the text.  The text should
have been created on the fly for editing purposes.

-- 
~jrm
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2r7epdp5d.fsf@gigamonkeys.com>
Joe Marshall <·············@comcast.net> writes:

> Peter Seibel <·····@gigamonkeys.com> writes:
>
>> Joe Marshall <·············@comcast.net> writes:
>>
>>> Peter Seibel <·····@gigamonkeys.com> writes:
>>>
>>>> Indeed, one of Baker's big points was that when the Lisp
>>>> community came to the fork in the road between Interlisp (with it's
>>>> structure editor) and Maclisp (with programs ultimately stored as text
>>>> in files), they made a mistake by taking the Maclisp fork. 
>>>
>>> Of course Baker is wrong about this.
>>
>> Why?
>
> Well, I'm being a bit facetious, but structure editing is hard to
> use.  The problem is that you *do* want to play with fragments of
> code that are not syntactically correct.  Obviously, you need to get
> them into a correct form in order to run them, but when you are
> moving code around, swapping arguments, factoring out
> subexpressions, etc. you have a syntactic mess.  I claim that the
> intermediate mess is a good thing because it is easier to make a
> mess and clean it up than to remain clean from start to finish.

Yup. But to me the fact that the underlying system is ultimately
storing code as objects doesn't necessarily imply that there can't be
tools that allow you to edit it as text with all the messiness that
implies.

> On the other hand, I think that it would have been a better idea to
> store the structure in files rather than the text.  The text should
> have been created on the fly for editing purposes.

Yes. I'm 100% with you there.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Lars Brinkhoff
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <85d5pv9iy3.fsf@junk.nocrew.org>
>>>> Peter Seibel <·····@gigamonkeys.com> writes:
>>>>> Indeed, one of Baker's big points was that when the Lisp
>>>>> community came to the fork in the road between Interlisp (with
>>>>> it's structure editor) and Maclisp (with programs ultimately
>>>>> stored as text in files), they made a mistake by taking the
>>>>> Maclisp fork.
[...]
Peter Seibel <·····@gigamonkeys.com> writes:
> Joe Marshall <·············@comcast.net> writes:
>> On the other hand, I think that it would have been a better idea to
>> store the structure in files rather than the text.  The text should
>> have been created on the fly for editing purposes.
> Yes. I'm 100% with you there.

However, is Common Lisp source code, especially with reader macros,
compatible with structured source files?  I've thought about this (for
several minutes!), and it doesn't seem trivial.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <64vmvi2i.fsf@ccs.neu.edu>
Lars Brinkhoff <·········@nocrew.org> writes:

>>>>> Peter Seibel <·····@gigamonkeys.com> writes:
>>>>>> Indeed, one of Baker's big points was that when the Lisp
>>>>>> community came to the fork in the road between Interlisp (with
>>>>>> it's structure editor) and Maclisp (with programs ultimately
>>>>>> stored as text in files), they made a mistake by taking the
>>>>>> Maclisp fork.
> [...]
> Peter Seibel <·····@gigamonkeys.com> writes:
>> Joe Marshall <·············@comcast.net> writes:
>>> On the other hand, I think that it would have been a better idea to
>>> store the structure in files rather than the text.  The text should
>>> have been created on the fly for editing purposes.
>> Yes. I'm 100% with you there.
>
> However, is Common Lisp source code, especially with reader macros,
> compatible with structured source files?  I've thought about this (for
> several minutes!), and it doesn't seem trivial.

It ought to be.  Reader macros have to return objects (or nothing), so
going from text to structure should work if the text is syntactically
valid.  

Going the other way might be an issue.  When I write a reader macro
for special objects, I try to make sure that the print method for
those objects uses the reader macro.  This won't work if the reader
macro returns list structure, though.

-----

In thinking about this more, I'm less sure of what the `right thing'
is.  If we are going to store structured data in files, it has to have
some structured representation.  A binary format like a fasl file is
easy to load, but not very portable, and you need a running lisp
system to make sense of it.  

For a moment, let's pretend XML is useful.  We could store structured
data serialized to XML.  That would have the advantage of being both
portable and (somewhat) human readable.  You could, in a pinch, edit
the structure with VI or Notepad.  A validating parser could ensure
that the structure is correct before deserializing it.  External tools
could manipulate the structure without interpreting it; for example,
you could use XSLT programs to do simple syntactic code
transformations like renaming identifiers.  Enough fantasy.  XML is a
bloated mess and a more readable and concise alternative is at hand:
s-expressions. 

So Lisp text files already *are* structured.  Since the structure
markup uses ascii characters, it is easy to edit the structure as text
without needing a Lisp system to intermediate.  A validating parser
for s-expressions is easy to write, and some external tools (like
emacs) can manipulate the structure without interpreting it.
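A validating parser for s-expressions is indeed easy to write. Here is a minimal sketch, with Python standing in for Lisp, handling only symbols, integers, and balanced parens (no strings or reader macros):

```python
import re

# Minimal validating s-expression reader: tokenize, then recursively
# build nested lists, rejecting unbalanced or trailing input.

TOKEN = re.compile(r'[()]|[^\s()]+')

def read_sexp(text):
    tokens = TOKEN.findall(text)
    expr, rest = _read(tokens)
    if rest:
        raise SyntaxError('trailing tokens: %r' % rest)
    return expr

def _read(tokens):
    if not tokens:
        raise SyntaxError('unexpected end of input')
    tok, rest = tokens[0], tokens[1:]
    if tok == '(':
        items = []
        while rest and rest[0] != ')':
            item, rest = _read(rest)
            items.append(item)
        if not rest:
            raise SyntaxError('missing close paren')
        return items, rest[1:]   # drop the ')'
    if tok == ')':
        raise SyntaxError('unexpected close paren')
    return (int(tok) if tok.lstrip('-').isdigit() else tok), rest

print(read_sexp('(+ 1 (* 2 3))'))   # ['+', 1, ['*', 2, 3]]
```

Twenty-odd lines, and the result is the structure itself, which is the point: the textual markup and the structure are nearly the same thing.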

So what's wrong with the MacLisp approach?

~jrm
From: Lars Brinkhoff
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <85u0j58z48.fsf@junk.nocrew.org>
Joe Marshall <···@ccs.neu.edu> writes:
> Lars Brinkhoff <·········@nocrew.org> writes:
>> Peter Seibel <·····@gigamonkeys.com> writes:
>>> Joe Marshall <·············@comcast.net> writes:
>>>> On the other hand, I think that it would have been a better idea
>>>> to store the structure in files rather than the text.  The text
>>>> should have been created on the fly for editing purposes.
>>> Yes. I'm 100% with you there.
>> However, is Common Lisp source code, especially with reader macros,
>> compatible with structured source files?  I've thought about this
>> (for several minutes!), and it doesn't seem trivial.
> It ought to be.  Reader macros have to return objects (or nothing),
> so going from text to structure should work if the text is
> syntactically valid.

I don't think that's an ideal solution.  E.g.:

I probably expect #. to run at compile-file or load time, not at
file-saving time.

If #+ and #- are resolved when saving the source code, that means my
conditionalized code won't work if I move it to other implementations.

If I enter #xff8240 in my code, I don't want to edit it as 16745024
later.

> Going the other way might be an issue.  When I write a reader macro
> for special objects, I try to make sure that the print method for
> those objects uses the reader macro.  This won't work if the reader
> macro returns list structure, though.

Right.  I don't know what the #I infix reader macro returns, but I'm
guessing it's list structure, and most users would probably be
disappointed to see their infix expression turned into a prefix
expression when they try to edit it.

> So Lisp text files already *are* structured.

Yes, they certainly are quite good!  I'm not really arguing that
source files *should* be structured, merely reasoning about the
consequences if one were to attempt to make them so.  (And is there a
good enough reason?)

> some external tools (like emacs) can manipulate the structure
> without interpreting it.

I think manipulating text source code with an external tool is a far
cry from doing it inside Lisp with structured source code.

> So what's wrong with the MacLisp approach?

I didn't hear what Henry Baker said (are his slides available yet?),
but I guess this came up in the context of refactoring and otherwise
manipulating source code.  Presumably you may want to process your
source code as structured data.  Just reading a text file, processing
it, and printing it back out doesn't seem to be good enough.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ackxfkge.fsf@ccs.neu.edu>
Lars Brinkhoff <·········@nocrew.org> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>> Lars Brinkhoff <·········@nocrew.org> writes:
>>> Peter Seibel <·····@gigamonkeys.com> writes:
>>>> Joe Marshall <·············@comcast.net> writes:
>>>>> On the other hand, I think that it would have been a better idea
>>>>> to store the structure in files rather than the text.  The text
>>>>> should have been created on the fly for editing purposes.
>>>> Yes. I'm 100% with you there.
>>> However, is Common Lisp source code, especially with reader macros,
>>> compatible with structured source files?  I've thought about this
>>> (for several minutes!), and it doesn't seem trivial.
>> It ought to be.  Reader macros have to return objects (or nothing),
>> so going from text to structure should work if the text is
>> syntactically valid.
>
> I don't think that's an ideal solution.  E.g.:
>
> I probably expect #. to run at compile-file or load time, not at
> file-saving time.
>
> If #+ and #- are resolved when saving the source code, that means my
> conditionalized code won't work if I move it to other implementations.
>
> If I enter #xff8240 in my code, I don't want to edit it as 16745024
> later.

Agreed.  It wouldn't be ideal, but it *would* work.  The output will
be functionally equivalent to the input, but not textually equivalent.

Things like conditionalization could be replaced with special forms
similar to how #, was replaced by LOAD-TIME-VALUE.

You could also sprinkle hints through the internal code:

    #xff8240  would be internally represented as something like

    (progn 'hint:hexadecimal-literal 16745024)

which would of course be printed as #xff8240.
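That hint scheme can be sketched as follows, with Python standing in for Lisp; the tag name and function names here are invented for illustration, not part of any real reader:

```python
# Hypothetical sketch of the 'hint' idea: a literal carries a marker
# recording its source notation, and the printer uses the marker to
# reproduce #xff8240 instead of 16745024. Names are invented.

HEX_HINT = 'hint:hexadecimal-literal'

def read_hex_literal(text):
    """Reader for #x literals: return a (hint, value) structure."""
    assert text.startswith('#x')
    return (HEX_HINT, int(text[2:], 16))

def print_form(form):
    """Printer that restores hinted literals to their source notation."""
    if isinstance(form, tuple) and form[0] == HEX_HINT:
        return '#x%x' % form[1]
    return str(form)

form = read_hex_literal('#xff8240')
print(form[1])            # the internal value, 16745024
print(print_form(form))   # #xff8240
```

The compiler would ignore the hint and see only the number; only the printer consults it, so the round trip from text to structure and back preserves the programmer's notation.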

>> Going the other way might be an issue.  When I write a reader macro
>> for special objects, I try to make sure that the print method for
>> those objects uses the reader macro.  This won't work if the reader
>> macro returns list structure, though.
>
> Right.  I don't know what the #I infix reader macro returns, but I'm
> guessing it's list structure, and most users would probably be
> dissapointed to see their infix expression turned to a prefix
> expression when then try to edit it.

That's an interesting issue.  I for one, would *hate* to work on the
infix version of anyone's code and would prefer that it turned into
prefix notation.  There are always some that would prefer to have the
infix version.  So this is a case where you might *want* to change the
output for the specific user.

>> So Lisp text files already *are* structured.
>
> Yes, they certainly are quite good!  I'm not really arguing that
> source files *should* be structured, merely reasoning about the
> consequences if one were to attempt to make them so.  (And is there a
> good enough reason?)
>
>> some external tools (like emacs) can manipulate the structure
>> without interpreting it.
>
> I think manipulating text source code with an external tool is a far
> cry from doing it inside Lisp with structured source code.

It is.  Manipulating Lisp as text source with an external tool such as
Emacs is superior to anything I've used that tries to do it internally.


>> So what's wrong with the MacLisp approach?
>
> I didn't hear what Henry Baker said (are his slides available yet?),

Baker's slides are available at 

  http://www.international-lisp-conference.org/multimedia/baker-slides.pdf

> but I guess this came up in the context of refactoring and otherwise
> manipulating source code.  Presumably you may want to process your
> source code as structured data.  Just reading a text file, processing
> it, and printing it back out doesn't seem to be good enough.

No, it isn't, but it is close.  I've written programs that read Lisp
source but preserve the comments, indentation, and whitespace so that
the emitted file doesn't have those stripped out.

Baker said that MacLisp `made the wrong choice' and Interlisp `made
the right one' vis-a-vis text vs. structure, but I think he's wrong.
There are many advantages to storing code as ascii text, and the
result *is* mostly structured.
From: Marco Baringer
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2k6kh9h9k.fsf@soma.local>
Joe Marshall <·············@comcast.net> writes:

> Well, I'm being a bit facetious, but structure editing is hard to use.
> The problem is that you *do* want to play with fragments of code that
> are not syntactically correct.  Obviously, you need to get them into a
> correct form in order to run them, but when you are moving code
> around, swapping arguments, factoring out subexpressions, etc. you
> have a syntactic mess.  I claim that the intermediate mess is a good
> thing because it is easier to make a mess and clean it up than to
> remain clean from start to finish.

it is however easier to have an editor which keeps things
clean-ish along the way, so that the final cleanup of the mess is
minimal.

> On the other hand, I think that it would have been a better idea to
> store the structure in files rather than the text.  The text should
> have been created on the fly for editing purposes.

personally i like the middle ground emacs takes 'syntax/semantics
aware text editing'. deep down it's all just text (which really is the
best way to store and manage source code) however the editor knows
that the text has a particular meaning and provides commands which
take advantage of this. for example, in emacs, you can always insert
random characters willy nilly but you also have commands like
kill-sexp, transpose-sexp, uplist, backward-uplist, etc.

-- 
-Marco
Ring the bells that still can ring.
Forget the perfect offering.
There is a crack in everything.
That's how the light gets in.
	-Leonard Cohen
From: Paul Wallich
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9nf06$1b6$1@reader1.panix.com>
Joe Marshall wrote:
> Peter Seibel <·····@gigamonkeys.com> writes:
> 
> 
>>Joe Marshall <·············@comcast.net> writes:
>>
>>
>>>Peter Seibel <·····@gigamonkeys.com> writes:
>>>
>>>
>>>>Indeed, one of Baker's big points was that when the Lisp
>>>>community came to the fork in the road between Interlisp (with it's
>>>>structure editor) and Maclisp (with programs ultimately stored as text
>>>>in files), they made a mistake by taking the Maclisp fork. 
>>>
>>>Of course Baker is wrong about this.
>>
>>Why?
> 
> 
> Well, I'm being a bit facetious, but structure editing is hard to use.
> The problem is that you *do* want to play with fragments of code that
> are not syntactically correct.  Obviously, you need to get them into a
> correct form in order to run them, but when you are moving code
> around, swapping arguments, factoring out subexpressions, etc. you
> have a syntactic mess.  I claim that the intermediate mess is a good
> thing because it is easier to make a mess and clean it up than to
> remain clean from start to finish.

My (very hazy) recollections of writing Interlisp code in the mid-80s 
were that the structure editor did a fairly good job of letting you do 
things that were temporarily malformed. Part of it is the editing style 
people develop, part is a structure editor that looks pretty darn 
WYSIWYG on the surface. (working from structure also lets you do 
wonderful things with collapsing and expanding various chunks of what 
you're working on -- something that can be done with text, but only by 
emulating a structure editor)

> On the other hand, I think that it would have been a better idea to
> store the structure in files rather than the text.  The text should
> have been created on the fly for editing purposes.

What do you mean by "store the structure in files"? It always seemed to 
me that the problem wasn't so much what was stored in files (since the 
structure and the text were isomorphic enough for most purposes) as the 
act of storing it in files rather than some kind of persistent objects/ 
a database.

paul
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u0jkhdma.fsf@comcast.net>
Paul Wallich <··@panix.com> writes:

> My (very hazy) recollections of writing Interlisp code in the mid-80s
> were that the structure editor did a fairly good job of letting you do
> things that were temporarily malformed. Part of it is the editing
> style people develop, part is a structure editor that looks pretty
> darn wysiwig on the surface. (working from structure also lets you do
> wonderful things with collapsing and expanding various chunks of what
> you're working on -- something that can be done with text, but only by
> emulating a structure editor)

Well, my memory is pretty hazy, too, so I could be wrong.

>> On the other hand, I think that it would have been a better idea to
>> store the structure in files rather than the text.  The text should
>> have been created on the fly for editing purposes.
>
> What do you mean by "store the structure in files"? It always seemed
> to me that the problem wasn't so much what was stored in files (since
> the structure and the text were isomorphic enough for most purposes)
> as the act of storing it in files rather than some kind of persistent
> objects/ a database.

Right.

-- 
~jrm
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BF29AE.D14DD289@freenet.de>
Joe Marshall schrieb:

> Well, I'm being a bit facetious, but structure editing is hard to use.
> The problem is that you *do* want to play with fragments of code that
> are not syntactically correct.  Obviously, you need to get them into a
> correct form in order to run them, but when you are moving code
> around, swapping arguments, factoring out subexpressions, etc. you
> have a syntactic mess.  ...

Oh. Are there any docs, papers, or links on this theme? (playing with
fragments) It is interesting to me and I want to know more about it.

Thank you

stefan
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <psu8hdfn.fsf@comcast.net>
lin8080 <·······@freenet.de> writes:

> Joe Marshall schrieb:
>
>> Well, I'm being a bit facetious, but structure editing is hard to use.
>> The problem is that you *do* want to play with fragments of code that
>> are not syntactically correct.  Obviously, you need to get them into a
>> correct form in order to run them, but when you are moving code
>> around, swapping arguments, factoring out subexpressions, etc. you
>> have a syntactic mess.  ...
>
> Oh. Are ther any docus, papers, links to this theme? (play with
> fragments) It is interesting to me and I what to know more about.

Not that I know of.

I just noticed that one of the big frustrations I have with strong-arm
typing is that you must have a syntactically complete as well as
correct program.  Sometimes I *want* to run program fragments that I
*know* will break.  I think it is important for a system to allow you
to play with half-constructed partly broken code.


-- 
~jrm
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005jul26-009@Yahoo.Com>
> From: Joe Marshall <·············@comcast.net>
> I just noticed that one of the big frustrations I have with
> strong-arm typing is that you must have a syntactically complete as
> well as correct program.  Sometimes I *want* to run program fragments
> that I *know* will break.  I think it is important for a system to
> allow you to play with half-constructed partly broken code.

Hey, if this were contract Bridge, and you were my partner, I'd raise
your bid to slam!

With my style of developing one line of code at a time, bottom-up
tool-building, input-forward unit testing, I virtually never write
anything that would be considered a "complete program". Everything I
write is a fragment that calls previously-written tools (some tools
from the CMUCL or ELISP API, some tools I wrote myself earlier). And
now with BeanShell in Java I've been doing the same kind of software
development in Java too. To make a unit-testing Java IDE, I wrote ELISP
and Java functions whereby I type all my java code in GNU EMACS, select
the region I want, then click a keystroke to pass that region to a FIFO
that is being read by Java for BeanShell parsing and execution.

Line-at-a-time program-fragment development really does relieve me of
the burden of trying to fit my currently-under-test code into a larger
program framework, so I can spend 100% of my mental energy
concentrating on what I'm actually working on and 0% of my energy
wasted trying to fit one new line of code into a complete "program".

And in the course of line-at-a-time or fragment-at-a-time development,
it's trivial to unit-test the error messages, deliberately create code
that will break just to make sure the error message is working, in case
later I ever really do need that error message to tell me about some
subtle bug I put in by mistake. You'd never believe how very many times
I've found my error message is wrong, missing some parameter to match
the format specification, which I can immediately put in to correct the
problem during my unit-testing of the error message. It sure is nice to
get the testing of error messages out of the way immediately after
writing them, before they're ever needed to detect actual bugs in the
actual production code.
From: Andreas Eder
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3psu6n1t6.fsf@banff.eder.de>
lin8080 <·······@freenet.de> writes:

> Oh. Are ther any docus, papers, links to this theme? (play with
> fragments) It is interesting to me and I what to know more about.

Well, there is a book by Barstow, Shrobe and Sandewall (editors) with
the title 'Interactive Programming Environments' which contains a few
articles relevant to this discussion, e.g.
 'The Interlisp Programming Environment' by Teitelman and Masinter, 
 'Programming in an Interactive Environment: The Lisp Experience' by
 Sandewall, and many other interesting articles.

Then there is the archive at bitsavers.org where you can find a lot of
manuals about Interlisp in the /pdf/xerox/interlisp directory.

'Andreas
-- 
Wherever I lay my .emacs, there's my $HOME.
From: Edi Weitz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uu0jnlw62.fsf@agharta.de>
On 24 Jun 2005 12:27:19 +0200, Matthias Buelow <···@incubus.de> wrote:

> Plus, programming language research has moved on to languages more
> "fruitful" in theoretical results, i.e., functional languages like
> ML and Haskell, with strong type systems, where there are quite a
> few open problems still. Perhaps Lisp simply has become too boring
> for research with no really interesting problems left? Writing the
> nth object system, compiler, or garbage collector or what have you
> isn't going to cut it anymore. That's old, well understood stuff by
> now, that is, it has moved to mainstream. And you apparently cannot
> mathematically formalize a dynamic type system as well as a strict,
> inferential one, and this alone would make it quite unattractive for
> use in research papers.

Languages that are "fruitful" in this sense are good for researchers
who have to publish papers or want to write a thesis.  That doesn't
necessarily mean they're good for real world applications as well.

Cheers.
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119664571.726435.70300@g49g2000cwa.googlegroups.com>
Matthias Buelow wrote:

> Plus, programming language research has moved on to languages more
> "fruitful" in theoretical results, i.e., functional languages like ML
> and Haskell, with strong type systems, where there are quite a few
> open problems still.

You are revealing a severe mathematical bias. Sticking to functional
languages purely because it is easy to prove theorems about their
programs is like a drunk looking for his keys not where he lost them,
but by the lamp post because that's where the light is best. This
attitude has turned Artificial Intelligence into a sad little branch of
applied mathematics -- a wholesale retreat from its original goals.
Sad to see it is also corroding your part of the programming languages
world.

> Perhaps Lisp simply has become too boring for
> research with no really interesting problems left?

Lisp is only boring relative to the goal of getting theorems.

> And you apparently cannot
> mathematically formalize a dynamic type system as well as a strict,
> inferential one, and this alone would make it quite unattractive for
> use in research papers.

Oh, I get it. Lisp is only boring relative to the goal of getting
*easy* theorems.
From: William D Clinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119722089.582122.204210@g14g2000cwa.googlegroups.com>
Sashank Varma wrote:
> You are revealing a severe mathematical bias. Sticking to functional
> languages purely because it is easy to prove theorems about their
> programs is like a drunk looking for his keys not where he lost them,
> but by the lamp post because that's where the light is best.

Did you see J Strother Moore's presentation?

Did you notice that ACL2 is implemented in a purely
functional subset of Common Lisp, and can only prove
properties of programs written in that functional
subset?

Do you think your criticism applies to ACL2?

> Oh, I get it. Lisp is only boring relative to the goal of getting
> *easy* theorems.

Do you think ACL2 can prove only *easy* theorems?

Will
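
For readers who missed the talk, a standard introductory ACL2 example
(not taken from the presentation itself) gives the flavor: APP is
defined in the purely functional subset, and the theorem below is
proved automatically by induction.

```lisp
;; ACL2: define APP (list append) and prove its associativity.  This
;; is the classic tutorial example, shown here only to illustrate the
;; functional subset Clinger refers to.
(defun app (x y)
  (if (endp x)
      y
    (cons (car x) (app (cdr x) y))))

(defthm app-is-associative
  (equal (app (app x y) z)
         (app x (app y z))))
```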
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119768673.916272.6940@z14g2000cwz.googlegroups.com>
William D Clinger wrote:

> Sashank Varma wrote:
> > You are revealing a severe mathematical bias. Sticking to functional
> > languages purely because it is easy to prove theorems about their
> > programs is like a drunk looking for his keys not where he lost them,
> > but by the lamp post because that's where the light is best.
>
> Did you see J Strother Moore's presentation?

Nope. I had to leave early that night.

> Did you notice that ACL2 is implemented in a purely
> functional subset of Common Lisp, and can only prove
> properties of programs written in that functional
> subset?
>
> Do you think your criticism applies to ACL2?

I think you misunderstand what I said. A poster said Lisp is no longer
interesting to programming language researchers because the current
goal of said researchers is to prove theorems about programming
languages (e.g., static type systems) and programs written in those
languages. I said that it's pretty myopic if mathematical tractability
is a necessary condition for a programming language to be considered
worthy of study.

> > Oh, I get it. Lisp is only boring relative to the goal of getting
> > *easy* theorems.
>
> Do you think ACL2 can prove only *easy* theorems?

The poster also said that one reason that Lisp is difficult to prove
theorems about is its dynamic typing. My response is that if the only
acceptable method in the programming language community is mathematical
proof, then you'd think some brave soul would want to test his or her
chops against a difficult problem such as this.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <slz5hyfd.fsf@comcast.net>
"Sashank Varma" <············@yahoo.com> writes:

> I think you misunderstand what I said. A poster said Lisp is no longer
> interesting to programming language researchers because the current
> goal of said researchers is to prove theorems about programming
> languages (e.g., static type systems) and programs written in those
> languages. 

That is true for a number of researchers, but not all.

> I said that it's pretty myopic if mathematical tractability
> is a necessary condition for a programming language to be considered
> worthy of study.

Good grief!  It doesn't take much to render a language completely
intractable.  If the language is intractable, then it becomes
impossible to say simple things like `that expression returns an
integer (if it returns at all)' or `this expression is composed of
those two subexpressions'.

The reason researchers play with `toy' languages is because they
haven't figured out how to deal with `real' ones.  The problems are
*really hard*.  It isn't myopia, it's desperation.


>> > Oh, I get it. Lisp is only boring relative to the goal of getting
>> > *easy* theorems.
>>
>> Do you think ACL2 can prove only *easy* theorems?
>
> The poster also said that one reason that Lisp is difficult to prove
> theorems about is its dynamic typing. My response is that if the only
> acceptable method in the programming language community is mathematical
> proof, then you'd think some brave would want to test his or her chops
> against a difficult problem such as this.

Given the high level of mathematicians that have tried and failed at
this, you'd have to be *very* ballsy and *very* good.  The former is a
tad easier.

-- 
~jrm
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119821710.460355.57260@g49g2000cwa.googlegroups.com>
Joe Marshall wrote:
> "Sashank Varma" <············@yahoo.com> writes:
> > I said that it's pretty myopic if mathematical tractability
> > is a necessary condition for a programming language to be considered
> > worthy of study.
>
> Good grief!  It doesn't take much to render a language completely
> intractable.  If the language is intractable, then it becomes
> impossible to say simple things like `that expression returns an
> integer (if it returns at all)' or `this expression is composed of
> those two subexpressions'.

I didn't say "computationally tractable" in the sense of Turing. I said
"mathematical tractability," by which I mean amenable to formal
analysis -- the difference between Haskell and Applesoft Basic, say.

Once again, the original poster said that only languages that are
amenable to mathematical analysis are interesting to programming
language researchers. My point is that this is myopic given the range
of features that escape such analysis.

You can make the point that it is pragmatic to start with simple,
tractable cases and then work your way up to the complexity of the real
world -- like first assuming a frictionless world. However, the
original poster adopted the inverse stance. But if
     (1) mathematical analyzability implies
         interesting to programming language researchers
is true, it does not necessarily follow that
     (2) mathematical un-analyzability implies
         un-interesting to programming language researchers
is true. This is the inference/claim that I have been trying to rebut.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3br4iu8r.fsf@comcast.net>
"Sashank Varma" <············@yahoo.com> writes:

> it does not necessarily follow that
>      (2) mathematical un-analyzability implies
>          un-interesting to programming language researchers
> is true.

That is true.  It isn't that the intractable isn't interesting, it is
that mathematicians have no idea where to start.  They'd *love* to do
research on real-life languages, but it's just too hard.

~jrm
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86mzpck8al.fsf@drjekyll.mkbuelow.net>
Joe Marshall <·············@comcast.net> writes:

>That is true.  It isn't that the intractable isn't interesting, it is
>that mathematicians have no idea where to start.  They'd *love* to do
>research on real-life languages, but it's just too hard.

I don't want to start a flame war, since I'm more on the Lisp side
myself, but since when is (Common) Lisp more "real-life" than Haskell?
From what I see, Haskell seems to be quite popular these days and is
gaining popularity rapidly. This is evidenced by the fact that even
the Perl community, and you probably can't get any more practical and
"real-life" than them, are producing a compiler for Perl 6 written
entirely in Haskell ("pugs"). While the visibility of Haskell in the
general developer community is still relatively low, involvement with
popular, non-academic projects like Perl, as well as research efforts by
the-company-that-shall-not-be-named to port Haskell (and ML) to .NET
surely makes people more aware of originally purely academic
(functional) programming languages, and their application to real-life
problems. Now I don't see similar things going on with Lisp. If it
weren't for Graham's popular essays, which are regularly slashdotted,
the visibility of Common Lisp in the general developer community would
be near _zero_.

mkb.
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119847005.513506.107130@g14g2000cwa.googlegroups.com>
Matthias Buelow wrote:

> I don't want to start a flame war, since I'm more on the Lisp side
> myself, but since when is (Common) Lisp more "real-life" than Haskell?
> From what I see, Haskell seems to be quite popular these days and is
> gaining popularity rapidly. This is evidenced by the fact, that even
> the Perl community, and you probably can't get any more practical and
> "real-life" than them, are producing a compiler for Perl 6 written
> entirely in Haskell ("pugs").

I remember hearing something about this...

> While the visibility of Haskell in the
> general developer community is still relatively low, involvement with
> popular, non-academic projects like Perl, aswell as research effort by
> the-company-that-shall-not-be-named to port Haskell (and ML) to .NET
> surely makes people more aware of originally purely academic
> (functional) programming languages, and their application to real-life
> problems.

I was under the impression that a good chunk of the MS CLR team was
hired from the academic functional programming community. It's hard to
call these ports "practical" until MS actually uses them for something.
(Please, no MS Bob jokes here.) Until then, they seem more like
vanity/research projects, not evidence of Haskell's widespread
adoption.

> Now I don't see similar things going on with Lisp. If it
> weren't for Graham's popular essays, which are regularly slashdotted,
> the visibility of Common Lisp in the general developer community would
> be near _zero_.

Well Graham got his reputation in the general community by
co-developing what eventually became Yahoo! stores largely in Common
Lisp. That's pretty practical. Orbitz also uses Common Lisp heavily.
That's also pretty practical.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86vf40kcvd.fsf@drjekyll.mkbuelow.net>
"Sashank Varma" <············@yahoo.com> writes:

>is true, it does not necessarily follow that
>     (2) mathematical un-analyzability implies
>         un-interesting to programming language researchers
>is true. This is the inference/claim that I have been trying to re-but.

Well. That is of course true, as you formulate it. There's only the
question where (Common) Lisp comes into play.

From my -limited, since I'm not a programming language researcher-
outsider viewpoint, I currently see two main forks for PL research:
1) the "mathematical", analytical way, this is, from what I
  understand, exclusively strongly typed, polymorphic languages like
  Haskell or ML (even more so, often pure functional languages, which
  would even exclude ML), and
2) the "practical" way, this appears to be almost exclusively Java and
  .NET languages (i.e., C#), plus topics about integration amongst these
  languages.
Correct me if you have different experience but that's how the
situation currently presents itself to my (limited) point of view. And
from this situation, my conjecture that Lisp currently appears to be
"uninteresting" for language research, holds.

mkb.
From: Sashank Varma
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119827388.002550.91390@g43g2000cwa.googlegroups.com>
Matthias Buelow wrote:

> "Sashank Varma" <············@yahoo.com> writes:
>
> >is true, it does not necessarily follow that
> >     (2) mathematical un-analyzability implies
> >         un-interesting to programming language researchers
> >is true. This is the inference/claim that I have been trying to re-but.
>
> Well. That is of course true, as you formulate it. There's only the
> question where (Common) Lisp comes into play.
>
> From my -limited, since I'm not a programming language researcher-
> outsider viewpoint, I currently see two main forks for PL research:
> 1) the "mathematical", analytical way, this is, from what I
>   understand, exclusively strongly typed, polymorphic languages like
>   Haskell or ML (even more so, often pure functional languages, which
>   would even exclude ML), and
> 2) the "practical" way, this appears to be almost exclusively Java and
>   .NET languages (i.e., C#), plus topics about integration amongst these
>   languages.
> Correct me if you have different experience but that's how the
> situation currently presents itself to my (limited) point of view.

This is my sense as well.

> And
> from this situation, my conjecture that Lisp currently appears to be
> "uninteresting" for language research, holds.

I guess I don't follow. If Lisp is of little interest to researchers in
camp (1), why can't it be of interest to researchers in camp (2)? It is
a practical solution for a number of interesting problems, and for this
reason seems to me a language worthy of analysis, modification, and
extension. Pascal Costanza's talk at ILC2005 was exactly of this type.

FWIW, your original comment also reads this way, i.e., bars Common Lisp
from study by more practical programming language researchers, at least
to my eyes.

Matthias Buelow wrote:
> Plus, programming language research has moved on to languages more
> "fruitful" in theoretical results, i.e., functional languages like ML
> and Haskell, with strong type systems, where there are quite a few
> open problems still. Perhaps Lisp simply has become too boring for
> research with no really interesting problems left?

Hence my string of responses.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86r7eok8x3.fsf@drjekyll.mkbuelow.net>
"Sashank Varma" <············@yahoo.com> writes:

>I guess I don't follow. If Lisp is of little interest to researchers in
>camp (1), why can't it be of interest to researchers in camp (2)? It is
>a practical solution for a number of interesting problems, and for this
>reason seems to me a language worthy of analysis, modification, and
>extension. Pascal Costanza's talk at ILC2005 was exactly of this type.
>FWIW, your original comment also reads this way, i.e., bars Common Lisp
>from study by more practical programming language researchers, at least
>to my eyes.

My wild guess is that "practical" researchers likely prefer to focus
on languages that have a _slightly_ higher visibility in the industry
than Lisp...

mkb.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <y88whejd.fsf@comcast.net>
Matthias Buelow <···@incubus.de> writes:

> From my -limited, since I'm not a programming language researcher-
> outsider viewpoint, I currently see two main forks for PL research:
> 1) the "mathematical", analytical way, this is, from what I
>   understand, exclusively strongly typed, polymorphic languages like
>   Haskell or ML (even more so, often pure functional languages, which
>   would even exclude ML), and
> 2) the "practical" way, this appears to be almost exclusively Java and
>   .NET languages (i.e., C#), plus topics about integration amongst these
>   languages.
> Correct me if you have different experience but that's how the
> situation currently presents itself to my (limited) point of view. And
> from this situation, my conjecture that Lisp currently appears to be
> "uninteresting" for language research, holds.

The Programming Languages and Theory group at Northeastern uses Scheme
(which is a damn sight closer to lisp than Haskell or Java).

There are a few here that think that strong typing is overrated and
that Java is an abomination (although less so than C++).

-- 
~jrm
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005aug11-003@Yahoo.Com>
> From: Joe Marshall <·············@comcast.net>
> There are a few here that think that strong typing is overrated and
> that Java is an abomination (although less so than C++).

Not to mention that C and C++ don't even provide strong typing,
although C++ at least has the decency to advertise more of its
violations explicitly.

By the way, we can think of strong typing in two ways, compile-time
typing, and run-time typing. Lisp is the winner for run-time typing,
with Java second best. I'm not sure what the winner for compile-time
typing is, maybe Pascal or Ada? Inescapable compile-time typing is a
hindrance to all software R&D, whereas optional compile-time typing as
in Lisp is a liberation when strong run-time typing is always there to
signal mistakes during unit testing.

Oh wait, this is comp.lang.lisp, so everyone except the beginners
already knows that, right?
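
The optional compile-time / mandatory run-time distinction can be
sketched in Common Lisp; the function names are invented for
illustration:

```lisp
;; Optional compile-time typing: a declaration the compiler MAY use.
(defun scale (x)
  (declare (type fixnum x))   ; optional hint, not enforced everywhere
  (* x 10))

;; Mandatory run-time typing: CHECK-TYPE signals a TYPE-ERROR at once.
(defun safe-scale (x)
  (check-type x integer)
  (* x 10))

;; (safe-scale "oops") signals a TYPE-ERROR during unit testing --
;; the run-time safety net described above.
```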

From: Bruce Stephens
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87psskqxp8.fsf@cenderis.demon.co.uk>
·······@Yahoo.Com (Robert Maas, see http://tinyurl.com/uh3t) writes:

[...]

> I'm not sure what the winner for compile-time typing is, maybe
> Pascal or Ada?

Surely one of the non-lisp functional languages, Haskell or ML
(depending on whether you want to be pure or impure)?

[...]
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86acleo394.fsf@drjekyll.mkbuelow.net>
"Sashank Varma" <············@yahoo.com> writes:

>You are revealing a severe mathematical bias. Sticking to functional
>languages purely because it is easy to prove theorems about their
>programs is like a drunk looking for his keys not where he lost them,
>but by the lamp post because that's where the light is best. This
>attitude has turned Artifical Intelligence into a sad little branch of
>applied mathematics -- a wholescale retreat from its original goals.
>Sad to see it is also corroding your part of the programming languages
>world.

I'm talking about what I observe. I might be wrong, of course. After
all, this whole stuff isn't just done for its own sake; people are
trying to make a career on some topics, and also to pay some bills
from their research work, and they naturally choose areas where
there's a chance of reaping results in the near future and publishing
a few papers about it. I don't think the "mathematization", or so, is
a bad thing. Maybe AI was a bit too hippie-style vague and nebulous 20
years ago, and then the bubble burst when it couldn't deliver what had
been promised? That's what I suspect; I can't verify it, not having
been there at that time, so correct me if I'm wrong. BTW, AI has
little to do with Lisp. I was talking about Lisp, but you have changed
tracks. Today, apparently, most AI programming is done in Java.

mkb.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-22400F.17531225062005@news-europe.giganews.com>
In article <··············@drjekyll.mkbuelow.net>,
 Matthias Buelow <···@incubus.de> wrote:

> "Sashank Varma" <············@yahoo.com> writes:
> 
> >You are revealing a severe mathematical bias. Sticking to functional
> >languages purely because it is easy to prove theorems about their
> >programs is like a drunk looking for his keys not where he lost them,
> >but by the lamp post because that's where the light is best. This
> >attitude has turned Artifical Intelligence into a sad little branch of
> >applied mathematics -- a wholescale retreat from its original goals.
> >Sad to see it is also corroding your part of the programming languages
> >world.
> 
> I'm talking about what I observe.. I might be wrong, of course. After
> all, this whole stuff isn't just done for its own sake, but people are
> trying to make a career on some topics, and also be able to pay some
> bills from their research work, and they naturally chose areas where
> there's the chance of reaping results in the near future and
> publishing a few papers about it. I don't think the "mathematization",
> or so, is a bad thing. Maybe AI was a bit too hippie-style vague and
> nebulous 20 years ago, and then bubble burst, when it couldn't deliver
> that which has been promised? That's what I suspect, I can't verify
> it, not having been there in that time, so correct me if I'm wrong.
> BTW., AI has little to do with Lisp. I was talking about Lisp, but you
> have changed tracks. Today, apparently, most AI programming is done in
> Java, as it seems.

Do you have any numbers?
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <8664w2nz9k.fsf@drjekyll.mkbuelow.net>
Rainer Joswig <······@lisp.de> writes:

>> BTW., AI has little to do with Lisp. I was talking about Lisp, but you
>> have changed tracks. Today, apparently, most AI programming is done in
>> Java, as it seems.
>
>Do you have any numbers?

For what, for the Java hypothesis? At my university, the AI department
made the switch a couple of years ago (before 2000) and there doesn't
seem to be any Lisp left (they had used, and offered courses on, Lisp
before that). And at least in multiagent simulations, Java and C++
(either with simulation class libraries or with development shells,
like SeSAm) seem to be predominant in the field in general, as I was
told. This is probably also due to the fact that all students know
Java (since it's taught in required courses and used by most
projects), whereas you'd be hard pressed to find anyone who knows
Lisp.

mkb.
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3861705.3Drd72nrax@yahoo.com>
Matthias Buelow wrote:

> Rainer Joswig <······@lisp.de> writes:
> 
>>> BTW., AI has little to do with Lisp. I was talking about Lisp, but you
>>> have changed tracks. Today, apparently, most AI programming is done in
>>> Java, as it seems.
>>
>>Do you have any numbers?
> 
> For what, for the Java hypothesis? At my university, the AI department
> has made the switch a couple years ago (before 2000) and there doesn't
> seem to be any Lisp left (they had used, and offered courses, on Lisp
> before that). And at least in multiagent simulations, Java and C++
> (either with simulation class libraries or with development shells,
> like SeSAm) seem to be predominant in the field in general, as I was
> told. This is probably also due to the fact that all students know
> Java (since it's taught in required courses and used by most
> projects), whereas you'd be hard pressed to find anyone who knows
> Lisp.
> 
> mkb.

But do they use Java for *research* ? I know some AI researchers who do, but
I don't think Java dominates.
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <861x6qny2l.fsf@drjekyll.mkbuelow.net>
alex goldman <·····@spamm.er> writes:

>But do they use Java for *research* ? I know some AI researchers who do, but
>I don't think Java dominates.

Well, for research they use the shells they have developed. These were
originally written in Lisp (MCL afaik) but have been rewritten (or are
in the process of being rewritten) in Java in recent years. As
concrete examples, there are SeSAm, a multiagent simulation
development environment (http://www.simsesam.de/), which has replaced
the original Lisp.SeSAm, and D3Web
(http://d3.informatik.uni-wuerzburg.de/), a knowledge system shell-kit,
which is replacing the old D3 (although the latter is still in use).

mkb.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <BEE3A5E6.DDFA%joswig@lisp.de>
Am 25.06.2005 19:04 Uhr schrieb "Matthias Buelow" unter <···@incubus.de> in
··············@drjekyll.mkbuelow.net:

> Rainer Joswig <······@lisp.de> writes:
> 
>>> BTW., AI has little to do with Lisp. I was talking about Lisp, but you
>>> have changed tracks. Today, apparently, most AI programming is done in
>>> Java, as it seems.
>> 
>> Do you have any numbers?
> 
> For what, for the Java hypothesis? At my university, the AI department
> has made the switch a couple years ago (before 2000) and there doesn't
> seem to be any Lisp left (they had used, and offered courses, on Lisp
> before that). And at least in multiagent simulations, Java and C++
> (either with simulation class libraries or with development shells,
> like SeSAm) seem to be predominant in the field in general, as I was
> told. This is probably also due to the fact that all students know
> Java (since it's taught in required courses and used by most
> projects), whereas you'd be hard pressed to find anyone who knows
> Lisp.

I think plenty of Perl, C, and C++ is used in 'AI programming'. Plus Java in some
areas. Lots of natural language stuff tends to be written in Perl.
I just don't believe that MOST AI programming is done in Java. Some.
And lots in some areas. But in some areas there are lots of other
languages/tools. Brute force speech stuff is often in C.
Vision stuff in C/C++. Many planners are still written
in Lisp. Some KR stuff (see below) is still in Lisp.

Plus I think that plenty of that is not worth being mentioned as AI-related
anyway. If you look here for example ( http://www.daml.org/tools/ ) lots
of the tools are just that, 'tools' (like browsers, editors, parsers, ...).
Only little core stuff with large amounts of glue around it.
Some of the more hardcore AI systems (the inference engines) are written in
Java. Some in Lisp. Probably the better ones like FaCT and Racer. Or the
larger ones like Cyc (which is written in SubL (a Common Lisp dialect) which
has versions written in C and Common Lisp).

I also believe that lots of students know Java and use Java in their
'research'. I also believe that lots of the research should not
carry the label 'AI'. I recently saw some paper some students wrote about
some scheduling application. Most of the stuff was about UI level
things. The scheduling algorithms behind that were mostly a joke. I cannot
believe that it will create anything useful in that application domain
or advance the state of the art of scheduling. I'm pretty sure
the students learned more about J2EE than about scheduling algorithms.

So, yes, there you are right. Most of the things people now call
'AI programming' are done in Java. ;-)
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wekapl7vy.fsf@hundertwasser.ti.uni-mannheim.de>
Rainer Joswig <······@lisp.de> writes:

> I think plenty PERL, C, C++ is used in 'AI programming'. Plus Java in some
> areas. Lots of natural language stuff tends to be written in Perl.
> I just don't believe that MOST AI programming is done in Java. Some.
> And lots in some areas. But in some areas there are lots of other
> languages/tools. Brute force speech stuff is often in C.
> Vision stuff in C/C++. 

In research, the dominant language for Computer Vision is Matlab, with
C++ coming second (esp. for time-critical applications).  The reasons
are: Good support for linear algebra, signal processing & filtering,
good visualization, good research code is available, interactivity
(i.e. interpreted).  Note that this is mostly about the libraries, the
language itself is unimpressive.

BTW, comparing Norvig's two AI books (Paradigms of AI Programming,
1991, and AI: A Modern Approach (with Russell, 2002)) one notes
that in the second book the implementation language is not important
any more.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-677ABF.17282126062005@news-europe.giganews.com>
In article <···············@hundertwasser.ti.uni-mannheim.de>,
 Matthias <··@spam.please> wrote:

> Rainer Joswig <······@lisp.de> writes:
> 
> > I think plenty PERL, C, C++ is used in 'AI programming'. Plus Java in some
> > areas. Lots of natural language stuff tends to be written in Perl.
> > I just don't believe that MOST AI programming is done in Java. Some.
> > And lots in some areas. But in some areas there are lots of other
> > languages/tools. Brute force speech stuff is often in C.
> > Vision stuff in C/C++. 
> 
> In research, the dominant language for Computer Vision is Matlab, with
> C++ coming second (esp. for time-critical applications).  The reasons
> are: Good support for linear algebra, signal processing & filtering,
> good visualization, good research code is available, interactivity
> (i.e. interpreted).  Note that this is mostly about the libraries, the
> language itself is unimpressive.
> 
> BTW, comparing Norvig's two AI books (Paradigms of AI Programming,
> 1991, and AI: A Modern Approach (with Russell, 2002)) one notes
> that in the second book the implementation language is not important
> any more.

Sure, the first is about "AI programming" in Lisp and the other
one is about AI in general.
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wfyv6pd4j.fsf@hundertwasser.ti.uni-mannheim.de>
"Sashank Varma" <············@yahoo.com> writes:

> You are revealing a severe mathematical bias. Sticking to functional
> languages purely because it is easy to prove theorems about their
> programs is like a drunk looking for his keys not where he lost them,
> but by the lamp post because that's where the light is best. This
> attitude has turned Artifical Intelligence into a sad little branch of
> applied mathematics -- a wholescale retreat from its original goals.

Actually, since AI relies more on mathematics than on mostly
unjustified heuristics, things are improving a lot in terms of results.

OCR is solved, speech recognition is almost solved, machine
translation is far from solved but getting useful.  Even Internet
search, the big thing, relies not on heuristic search algorithms but
on factorization of large sparse matrices.  
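
That last point can be made concrete with a toy PageRank-style power
iteration, sketched here in Python as neutral pseudocode.  The link
graph, damping value, and function name are all invented for
illustration; real search engines of course run this on sparse matrices
with billions of entries:

```python
# Toy PageRank-style power iteration: rank pages by the dominant
# eigenvector of a damped link matrix, using plain lists (no libraries).

def pagerank(links, damping=0.85, iters=50):
    """links[i] lists the pages that page i links to."""
    n = len(links)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for i, outs in enumerate(links):
            if outs:                      # distribute rank over out-links
                share = damping * rank[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:                         # dangling page: spread evenly
                for j in range(n):
                    new[j] += damping * rank[i] / n
        rank = new
    return rank

# Pages 0 and 1 link to each other; page 2 links into the pair.
ranks = pagerank([[1], [0], [0, 1]])
```

The iteration is just repeated sparse matrix-vector multiplication;
nothing about it is heuristic search in the classical AI sense.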
From: Wade Humeniuk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <iAVue.98728$on1.14474@clgrps13>
Sashank Varma wrote:
> 
> This is my great fear. In the Dynamic Language Wizards panel discussion
> of a year ago, someone lamented that the current ACM model computer
> science curriculum devotes only 5 hours to the study of programming
> language design. Guy Steele said maybe this wasn't a bad thing, that
> maybe computer science had passed beyond the time when new programming
> languages had to be invented on a weekly basis. Maybe we really are at
> some kind of language design plateau.
> 

Yes, we are at a plateau.  If you want a new label for where computer
languages are at, it's "The Age of Libraries".  It's what the industry
and programmers fret about.  Where are the libraries?  I cannot do
anything without a "standard library"!  And to be frank, NO amount
of programming language innovation will solve the "library" problem.

Language design is dead, dead, dead because it's been taken over by
pseudo-programming-language vocabularies (i.e., natural language).
It's the "standardization" by imprecise human language and thought
that has slowed down any innovation.  It's not the CL syntax or the
individual built-in functions of the ANSI CL spec that have slowed
things down; it's the implicit higher-level organization in the
spec (like the condition system, the breakdown of the various phases
of the compilation process, the concept of the sequence).  These are
powerful and attractive ideas that cement one's thinking.

> Another possibility is that some other force is holding us back. As
> Lispers, the invention of new programming language constructs is our
> particular forte. So what's stopping us? Not standardization. What
> then?
> 

It's reality that is holding people back; like I said above, no amount
of new programming language constructs will help out the concerns of
the computer industry.

If you want one thing holding back innovation, it's the anthropomorphizing
of the computer.

Wade
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BC7EF5.89C0FEDD@freenet.de>
Sashank Varma schrieb:

> Another possibility is that some other force is holding us back. As
> Lispers, the invention of new programming language constructs is our
> particular forte. So what's stopping us? Not standardization. What
> then?

The correct answer is: Lisp is a secret weapon.

Remember, once Apple had a very fast machine (I guess it was the G5), and
soon something unexpected happened (no export).  Or, in May 2005
mathematicians celebrated a new record: RSA-200 was factored (a 200-digit
number), but the secret was that a US firm had done this long before.  So,
as they say, this age is called the "Information Age", and what is the
best way to collect infos?  (Ask Google (aha, they are doing something
new).)

Again. Lisp is a secret weapon. Better keep an eye on that, right?

stefan

not enough? this thread has too many postings, eh?
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <dhrve.2067$p%3.14602@typhoon.sonic.net>
Sashank Varma wrote:
> Christopher C. Stacy wrote:
> 
>>Besides, I don't see anyone stopping anyone else from
>>inventing their own versions of Lisp.
> 
> 
> Agreed.

Agreed, again (I am in fact one such person), but making
an alternate Lisp on the scale of CL is difficult, and
having it become popular doubly so.

Paul Graham has the right idea with Arc; it's pointless
to release it before it's complete enough to compete with
CL, and that means the implementation could easily take
a decade.

So there's a chilling effect, IMO.  I have a fairly
cool experimental Lisp with functions that are both first-class and
first-order, for the first time supporting truly separate translation:
you don't have to have macro definitions available on the same machine
when you're preparing code that contains calls to macro-like functions
for execution, and you can effectively pass arguments after dynamic
linking.  It could be fertile ground for compiler research, since I do
a lot of things the "expensive" way and ought to be able to find ways
to optimize the common cases without reducing the semantics.  But it's
slower than CL, and its function-call semantics are not close enough to
CL's to use existing libraries; and without a bunch of wrappers and
libraries duplicating much of the functionality of CL, it's moderately
pointless to attempt to release it.

>>Seems to me that most of the world haven't even
>>caught up to the Lisp ideas from two decades ago.

They're a lot closer now than they were then.

> The problem with many sufficiently different dialects is that porting
> code is difficult, so Common Lisp came into being. Soon thereafter,
> innovation slowed. Sure, things like Connection Machine Lisp and the
> MOP came into existence, but CLtL2 and the ANS seem to have been about
> codifying ideas that already existed in one form or another by the mid
> 1980s, but did not make it into CLtL1 for political or pragmatic
> reasons.

I would tend to agree with this analysis.  Lisp language
advances have been basically stalled since the middle 1980s.

There are still a lot of profound ideas in programming
languages and you could still glom them into Lisp; we
should be able to handle arrays as well as APL, strings
as well as Perl, numbers as well as Fortran, Unification
as well as Prolog, and hardware modeling (with "wire"
variables, etc) as well as Verilog.

We should have good ways of exploiting parallelism and
distributed programs and good ways of expressing ambiguity.
We should have easier access to lazy semantics than we
have.

We should have ways of exploiting new ideas in CPU design;
consider reconfigurable ASICs with an FPGA-like design,
but with the gates controlled by the contents of specialized
RAM memory in the device instead of "burned in," and you
have what may be fertile ground for a real-time reconfigurable
DSP coprocessor; compilers would need to be written to exploit
such devices, reconfiguring the device multiple times during
a program execution, and Lisp compilers ought to be in the
forefront.

All of these things have not been happening.  In particular,
the Common Lisp standard has made the language less "hackable"
by providing no way to get at raw binary I/O on hardware
ports, and by hiding the hardware addresses and binary
form of your tagged data from your programs.  More portability?
Yes.  Good abstraction?  Yes, and that's helpful for building
programs up.  But good de-abstraction and access to the binary
foundations?  No, and that's chilling to "downward" extension
like access to hardware.

And there's as much to be done in downward extension as there
is in upward extension, really; the people working with the
cool problems of today and new CPU architectures need hardware
access and the ability to write grotty machine-code to interface
with high-level data representations.  When they find that
Lisp won't give it to 'em, and has adopted a design that is so
underspecified as to make it hard to build on, they move on to
other languages.  They can look at a C struct and write machine
code to manipulate its bits, and that machine code will be portable
between virtually all C implementations that run on that hardware,
because C has downward extensibility; the same is not true of
a Lisp struct, because Lisp has only upward extensibility.  You
can't even guess how a particular implementation will lay out
that struct in memory, so downward extensibility is cut off.  So
they tend to work in C more than Lisp when they need downward
extensions.


>>I can't imagine what the point is of whether Lisp
>>has "declined", anyway.   Perhaps nobody has had
>>any super great ideas lately.
> 
> 
> This is my great fear. In the Dynamic Language Wizards panel discussion
> of a year ago, someone lamented that the current ACM model computer
> science curriculum devotes only 5 hours to the study of programming
> language design. Guy Steele said maybe this wasn't a bad thing, that
> maybe computer science had passed beyond the time when new programming
> languages had to be invented on a weekly basis. Maybe we really are at
> some kind of language design plateau.

To put it in mildest possible terms, I do not believe so.
We may have reached a plateau in our exploitation of main
CPU's, but most machines these days have a GPU or DSP lying
around, with profoundly different strengths and abilities,
and we haven't begun to develop idioms and techniques for
taking proper advantage of those.  Why not?  Most CPU's
today are multiprocessor cores, and have we developed idioms
and semantics to keep all the heads warm in a useful way? No.

And the hardware designers aren't standing still; while
language designers are still thinking in terms of single
CPU's and linear flow of control, those guys (at least the
ones who aren't gutless) are building fundamentally new
devices for which those language ideas are already obsolete.
We ought to be in the forefront of finding idioms and
techniques for using those devices.

> Another possibility is that some other force is holding us back. As
> Lispers, the invention of new programming language constructs is our
> particular forte. So what's stopping us? Not standardization. What
> then?

Lack of downward extensibility?  Semantic models based on linear
flow of control?  Bulk?


				Bear
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005aug13-002@Yahoo.Com>
> From: Ray Dillinger <····@sonic.net>
> the Common Lisp standard has made the language less "hackable"
> in terms of providing no way to get at raw binary I/O on
> hardware ports and hiding the hardware addresses and binary
> form of your tagged data from the programs.  More portability?
> Yes.  Good abstraction? Yes, and that's helpful for building
> programs up.  But good de-abstraction and access to the binary
> foundations?  No, and that's chilling to "downward" extension
> like access to hardware.

Suppose we revived the PSL (Portable Standard Lisp) concept of
SysLisp? Basically it's LAP (Lisp Assembly Program) with a layer of
macros on top of it and a way to compile lisp code into those macros,
plus peephole optimizations at both the LAP and lap-macro level. So
then in your Lisp code you can write at any of the three levels (lisp,
lap-macros, or raw LAP). In that way, if there's any way in machine
language to interface some device, then you can just write LAP to
generate the exact machine language you need, then you can put wrappers
around that LAP to make it useful from applications. Once the basic
mechanism is available, then we have the flexibility to hack low-level
devices however we want and to provide libraries that provide
"high-level" abstractions of those new low-level device control tools.
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <a6vLe.9009$p%3.36741@typhoon.sonic.net>
Robert Maas, see http://tinyurl.com/uh3t wrote:

> Suppose we revived the PSL (Portable Standard Lisp) concept of
> SysLisp? Basically it's LAP (Lisp Assembly Program) with a layer of
> macros on top of it and a way to compile lisp code into those macros,
> plus peephole optimizations at both the LAP and lap-macro level. So
> then in your Lisp code you can write at any of the three levels (lisp,
> lap-macros, or raw LAP). In that way, if there's any way in machine
> language to interface some device, then you can just write LAP to
> generate the exact machine language you need, then you can put wrappers
> around that LAP to make it useful from applications. Once the basic
> mechanism is available, then we have the flexibility to hack low-level
> devices however we want and to provide libraries that provide
> "high-level" abstractions of those new low-level device control tools.

I would absolutely adore that.

				Bear
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: SysLisp (was: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, ...)
Date: 
Message-ID: <REM-2005aug19-010@Yahoo.Com>
> From: Ray Dillinger <····@sonic.net>
> > Suppose we revived the PSL (Portable Standard Lisp) concept of
> > SysLisp? ...
> I would absolutely adore that.

Wow, that's the first positive response I've ever gotten in the several
times I've expressed that feature-wish. Thanks!
From: Ray Dillinger
Subject: Re: SysLisp
Date: 
Message-ID: <uGPOe.10341$p%3.39770@typhoon.sonic.net>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>>From: Ray Dillinger <····@sonic.net>
>>
>>>Suppose we revived the PSL (Portable Standard Lisp) concept of
>>>SysLisp? ...
>>
>>I would absolutely adore that.
> 
> 
> Wow, that's the first positive response I've ever gotten in the several
> times I've expressed that feature-wish. Thanks!


Well, that's precisely the ability that C has with inline
assembler; the point is that somebody can write a library
like (say) curses *FROM SCRATCH* as opposed to making an
interface to a library already written for some other
language and thereafter dealing with/depending on the
runtime of that other language.  True that it won't be
portable, but portability will be a matter of *one*
chunk of code in *one* place, and you won't be tracking
updates and changes in the curses library of other
systems; just hardware, which is much more stable.

It allows you to define routines that work natively on
your own language's abstractions and data types rather
than defining all of those abstractions and datatypes
using the primitives of another runtime, and that's very
important.  It means that you have more freedom to pick
your own abstractions, as opposed to a strong reason to
use the same abstractions as the implementation
language, which may not be appropriate to the one you're
implementing.

Alternatively, it allows you a rock-solid method of building
interfaces to libraries written in other languages, too,
if that's what you want to do.  I've yet to see an FFI for
Lisp that gracefully handles calls to code written in (say)
Forth, Smalltalk, or Prolog.  But if I could insert raw
assembler, and I had a binary-map of the data structures
used in both languages, I could build one without needing
help from the Lisp implementor and without running everything
through a C substrate.

			Bear
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005jun29-002@Yahoo.Com>
> From: "Sashank Varma" <············@yahoo.com>
> The laments one hears about the slowdown in Lisp's progress might be
> just this -- nostalgia. What's weird is that they are rarely spoken
> by those who actually lived through the halcyon days of MIT in the
> 1970s, ...

I think I qualify slightly.  I took a serious liking to Lisp in 1973,
and snuck onto campus terminals, pretending to be a student, just to
connect to SU-AI and run Lisp 1.6 or UCI-Lisp.  A few years later I
discovered MacLisp, which was a great improvement.  A few years after
that I was forced to downgrade to SL (Standard Lisp), then got a slight
upgrade to PSL (Portable Standard Lisp).  I didn't like CL (Common Lisp)
when it was first described, because it was grossly large compared to
PSL and would take forever to implement.  But I eventually learned to
really appreciate some of its new features:
- Keyword parameters
- Multiple return values instead of hacked lists returned
- Rich library of common utilities, mostly nicely designed
- FORMAT, UNWIND-PROTECT, SETF and places, ...

Nostalgia? No, CL is much better than the predecessors (I never had
access to LispMachine Lisp, which I suspect was great too). But MacLisp
had one feature I really think CL should get: Software File Arrays (do
I remember the name correctly after 25 years?). They allowed you to
define streams which provided basic I/O functions any way you wanted,
and then the standard I/O functions could deal with those the same way
they deal with system-defined I/O streams. PSL had another feature I
wish CL had: SysLisp, the ability to compile s-expressions not for
running in a full Lisp environment, but for running in a bare machine
environment, thereby allowing us to write s-expression code for the
kernel of an operating system or a compiler etc. The SysLisp compiled
to high-level LAP macros, which could then be peephole-optimized at
that abstract level, then the result got expanded to LAP code, which
then could be peephole optimized at the machine-language level, then
the result got translated to assembly-language, which could then be
edited into an assembly-language application which could then be
assembled and loaded and executed as a native program.  If CL had SysLisp,
maybe CL instead of C would be the preferred language for system
utilities and operating systems for new CPUs.  Peephole optimization is
a lot easier when you have your list of instructions or your list of
high-level macros as a linked list instead of as text strings, and you
have the power of Lisp instead of only C to manipulate those lists.  And
Java has supplied some modern utilities, such as XML parsers, threads,
GUI, applets, where CL has fallen behind. I'd like to see CL adopt a
lot of those Java ideas. And finally there are some ideas that seem
never to get implemented fully, such as interval arithmetic. Also the
way the Java compiler recognizes dependencies between files and
automatically compiles all the necessary files in a single pass, is a
big win most of the time. CL should figure out how to do that too.
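
The point above about instruction lists versus text strings is easy to
demonstrate.  Here is a minimal peephole pass over a symbolic
instruction list, sketched in Python as neutral pseudocode; the opcode
names and the two rewrite rules are invented for the example, not taken
from PSL:

```python
# A minimal peephole pass: rewrite a list of symbolic instructions,
# which is trivial when code is a list of tuples rather than text.

def peephole(code):
    """One pass: drop push/pop pairs and jumps to the next instruction."""
    out = []
    i = 0
    while i < len(code):
        ins = code[i]
        nxt = code[i + 1] if i + 1 < len(code) else None
        # ("push", r) immediately followed by ("pop", r) cancels out.
        if nxt and ins[0] == "push" and nxt[0] == "pop" and ins[1] == nxt[1]:
            i += 2
            continue
        # A jump to a label on the very next line is a no-op.
        if nxt and ins[0] == "jmp" and nxt == ("label", ins[1]):
            i += 1
            continue
        out.append(ins)
        i += 1
    return out

optimized = peephole([("push", "r1"), ("pop", "r1"),
                      ("jmp", "L1"), ("label", "L1"),
                      ("mov", "r2", "r3")])
```

In Lisp the same pass would be a few lines of pattern matching over the
LAP list; with text strings you would have to write a parser first.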

By the way, Java with BeanShell has now essentially caught up with
Lisp's read-eval-print loop, thereby supporting (equally in either
language) single-line-of-code unit-testing as well as
single-function/method unit-testing as a natural thing that happens
during code development, instead of the royal pain of writing formal test
rigs outside the normal code-writing stream.  The most important use for
single-line unit-testing is in testing error messages. You can directly
call the error-message signaller from the REP or BeanShell immediately
after you write that line of code, then you can contrive bogus data and
test the two lines of code that say:
  (if (horribleconditionthatdependsonseveralparametersbeinginconsistent)
      (signalerror "Longwinded detailed informative error message"))
to see if it really does detect the bogus condition and issue the error
message. Then fix the data to be good data and verify the test doesn't
apply so the error message doesn't occur.  Try to unit-test such in C
someday: royal pain!!  And how many times, if you don't routinely
unit-test every single line of code, do you write a test and an error
message that look good to you, but the horrible condition never occurs
during unit testing of the entire function, because it's too much
trouble to set up test rigs for that?  Then it turns out the error
message has an error, typically a mismatch between the FORMAT directives
within the error string and the parameters that supply values to those
directives, and the first time the horrible condition happens, weeks
later in a production run, the error message blows up like this:
  Error in format: No more arguments.
    Not enough items in list ~S to satisfy table ~S.
  while processing indirect format string:
    ~&~@<Error in function ~S:  ~3i~:_~?~:>
  Restarts:
    0: [ABORT] Return to Top-Level.
  Debug  (type H for help)
  ("DEF-FORMAT-INTERPRETER #\\?" #<FORMAT::FORMAT-ERROR {90173AD}>)
Then you spend an hour tracking down where the error message is located,
figuring out what that second parameter was supposed to be so you can
put it in, and figuring out a way to re-create the original circumstances
so you can re-run the program and see the actual error message you should
have seen, before you finally have the information you need to figure out
what the real bug was that caused it to try to print the error message in
the first place.
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uu0jhfjb3.fsf@nhplace.com>
·······@Yahoo.Com (Robert Maas, see http://tinyurl.com/uh3t) writes:

> MacLisp
> had one feature I really think CL should get: Software File Arrays (do
> I remember the name correctly after 25 years?). They allowed you to
> define streams which provided basic I/O functions any way you wanted,
> and then the standard I/O functions could deal with those the same way
> they deal with system-defined I/O streams.

Software File Arrays (usually just SFAs) were designed by Howard
Cannon, later the author of Flavors.  They are the largely overlooked
"missing link" in the evolution of certain data abstraction ideas.  I
remember observing that using SFAs and user-defined messages, you
could make I/O streams that didn't do I/O--that is, they were useful
just as a storage abstraction.  They amounted to an object system
minus the syntax abstractions--you had to reference your memory
manually.  It's easy to understand why he did Flavors next.  

In context, these were a wonderful step up, but in CL they would be a
step back, I think.

I interpret your remark here as a request for user-defined streams,
and I suggest that Gray Streams (named for David Gray, their inventor,
and more or less contemporary with ANSI CL) and later Simple Streams
(from Franz [they've been through a couple revs of this, maybe it has
a different name now]) are essentially the proper answer to this in
context.

Although the one thing that SFAs offered that is still arguably useful
would be an opening up of the opaque storage model, so that one could
create objects like structs with indexed-slot-style access.  This could
again be done in an individual implementation by adding a new metaclass
similar to struct, or perhaps the same metaclass with just some extended
access control.

[I'm not going to respond to your other suggestions here.  I just wanted to
 add information about SFA's to put them in context for readers of c.l.l.
 who might not have experienced them.]
From: Barry Margolin
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <barmar-BCE0D8.23374224062005@comcast.dca.giganews.com>
In article <·············@news.dtpq.com>,
 ······@news.dtpq.com (Christopher C. Stacy) wrote:

> Besides, I don't see anyone stopping anyone else from 
> inventing their own versions of Lisp.

Nothing's stopping them, but it's effectively discouraged.  If someone 
tries to promote their own version, many Lisp devotees will point out 
all the important CL features that it's missing, or explain how one 
could accomplish the same results using CL, etc.  I think it would be 
very difficult for a new Lisp dialect to gain a foothold these days.

-- 
Barry Margolin, ······@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <rKmdnSoxiPQMSyHfRVn-2Q@dls.net>
Barry Margolin wrote:

> Nothing's stopping them, but it's effectively discouraged.  If someone 
> tries to promote their own version, many Lisp devotees will point out 
> all the important CL features that it's missing, or explain how one 
> could accomplish the same results using CL, etc.  I think it would be 
> very difficult for a new Lisp dialect to gain a foothold these days.

I don't know if the Lisp devotees per se are the impediment here.
The new lisp would have to have some decisive advantage in some
way in order to compensate for its lower maturity, smaller
community, and arguably lower chance of survival.

So what's one of those advantages?  How about immutable types
with hash consing?  Systems like ACL2 would benefit from this.
I'd like to see how that could be integrated into CL, perhaps
by subclassing CONS with immutable cells.
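
The hash-consing idea can be sketched in a few lines (Python here as
neutral pseudocode, with tuples standing in for immutable cons cells;
the interning table and function names are invented for the example):

```python
# Hash consing via an interning table: every structurally equal pair is
# built exactly once, so equality tests collapse to identity checks
# (the analogue of comparing with EQ instead of EQUAL).

_table = {}

def hcons(car, cdr):
    """Return the unique immutable cell for (car . cdr)."""
    # Interned sub-cells are unique, so identity is a sound key for them;
    # atoms are keyed by value (assumed hashable in this sketch).
    key = (id(car) if isinstance(car, tuple) else car,
           id(cdr) if isinstance(cdr, tuple) else cdr)
    cell = _table.get(key)
    if cell is None:
        cell = (car, cdr)
        _table[key] = cell
    return cell

def hlist(*items):
    """Build a hash-consed list, sharing all structurally equal tails."""
    cell = None
    for x in reversed(items):
        cell = hcons(x, cell)
    return cell

a = hlist(1, 2, 3)
b = hlist(1, 2, 3)   # the very same cells as a, not a copy
```

For a theorem prover that memoizes over terms, that identity guarantee
is exactly what keeps the memo tables cheap.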

	Paul
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119689103.5c2159b81029c3b0c72ad3be52b99905@teranews>
On Fri, 24 Jun 2005 22:59:12 -0500, <·····@dls.net> wrote:
> Barry Margolin wrote:
>
>> Nothing's stopping them, but it's effectively discouraged.  If someone 
>> tries to promote their own version, many Lisp devotees will point out 
>> all the important CL features that it's missing, or explain how one 
>> could accomplish the same results using CL, etc.  I think it would be 
>> very difficult for a new Lisp dialect to gain a foothold these days.
>
> I don't know if the Lisp devotees per se are the impediment here.

Look into c.l.l. history for a recent request about LUSH.  Went down
just as Barry describes.  At some point, people did come around to the
question asked.


-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ZZ6dnSyS57rApiDfRVn-2Q@dls.net>
GP lisper wrote:

>>I don't know if the Lisp devotees per se are the impediment here.
> 
> 
> Look into c.l.l. history for a recent request about LUSH.  Went down
> just as Barry describes.

And yet, are they the impediment?  Seems you're engaging
in post hoc reasoning here.

	Paul
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119725141.4f5ae44c01434a754017c0c79c75a55e@teranews>
On Sat, 25 Jun 2005 06:09:17 -0500, <·····@dls.net> wrote:
>
> GP lisper wrote:
>
>>>I don't know if the Lisp devotees per se are the impediment here.
>> 
>> Look into c.l.l. history for a recent request about LUSH.  Went down
>> just as Barry describes.
>
> And yet, are they the impediment?  Seems you're engaging
> in post hoc reasoning here.

LUSH, like any free project, could use help.  How many will assist
them when their reception is as above?


-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <9ZWdnX_jnYueJiDfRVn-tQ@dls.net>
GP lisper wrote:

>>And yet, are they the impediment?  Seems you're engaging
>>in post hoc reasoning here.
> 
> 
> LUSH, like any free project, could use help.  How many will assist
> them when their reception is as above?

You seem to be making an assumption about cause and effect
that you haven't justified.  Is there just a correlation between
the reaction and the participation, or does the reaction actually
cause lower participation?  You can't assume the former implies
the latter without additional evidence.

	Paul
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uhdflfnoi.fsf@news.dtpq.com>
GP lisper <········@CloudDancer.com> writes:

> On Fri, 24 Jun 2005 22:59:12 -0500, <·····@dls.net> wrote:
> > Barry Margolin wrote:
> >
> >> Nothing's stopping them, but it's effectively discouraged.  If someone 
> >> tries to promote their own version, many Lisp devotees will point out 
> >> all the important CL features that it's missing, or explain how one 
> >> could accomplish the same results using CL, etc.  I think it would be 
> >> very difficult for a new Lisp dialect to gain a foothold these days.
> >
> > I don't know if the Lisp devotees per se are the impediment here.
> 
> Look into c.l.l. history for a recent request about LUSH.  Went down
> just as Barry describes.  At some point, people did come around to the
> question asked.

I recall that conversation rather differently.
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87ekaps623.fsf@plato.moon.paoloamoroso.it>
GP lisper <········@CloudDancer.com> writes:

> Look into c.l.l. history for a recent request about LUSH.  Went down
> just as Barry describes.  At some point, people did come around to the
> question asked.

There's another interesting case of a new Lisp dialect.  Some time
ago, a 16-year-old programmer won a computing contest with a
submission that was a new Lisp dialect:

  Student Wins Award For New Lisp
  http://lemonodor.com/archives/001037.html

Due to the contest's rules, he was not at liberty to discuss the
details of his project in depth before a ceremony scheduled for next
fall.  But the few details he did discuss generated interest and
positive feedback here.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <UtmdnT7EEJW_zyLfRVn-hw@speakeasy.net>
Paolo Amoroso  <·······@mclink.it> wrote:
+---------------
| There's another interesting case of a new Lisp dialect.  Some time
| ago, a 16-year-old programmer won a computing contest with a
| submission that was a new Lisp dialect:
|   Student Wins Award For New Lisp
|   http://lemonodor.com/archives/001037.html
| Due to the contest's rules, he was not at liberty to discuss the
| details of his project in depth before a ceremony scheduled for next
| fall.  But the few details he did discuss generated interest and
| positive feedback here.
+---------------

Substantially more detail (though still not complete) is available here:

    http://lemonodor.com/archives/001038.html

He says he'll GPL it as soon as it's settled down a bit.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uis03oy7m.fsf@news.dtpq.com>
"Paul F. Dietz" <·····@dls.net> writes:

> How about immutable types with hash consing?  Systems like ACL2
> would benefit from this.  I'd like to see how that could be
> integrated into CL, perhaps by subclassing CONS with immutable cells.

Sure, okay -- so what's the problem that is 
impeding anybody from trying this out?

(I think we agree that it isn't the existence of Common Lisp.)
From: Tayssir John Gabbour
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119674543.736642.111200@z14g2000cwz.googlegroups.com>
Christopher C. Stacy wrote:
> "Paul F. Dietz" <·····@dls.net> writes:
> > How about immutable types with hash consing?  Systems like ACL2
> > would benefit from this.  I'd like to see how that could be
> > integrated into CL, perhaps by subclassing CONS with immutable cells.
>
> Sure, okay -- so what's the problem that is
> impeding anybody from trying this out?
>
> (I think we agree that it isn't the existance of Common Lisp.)

Did McCarthy give a nuanced reason?

I would guess it's something like: Lisp's standardization closed off a
lot of research money because it was "completed" in some way. Pure
guess though.

Tayssir
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <umzpfoygz.fsf@news.dtpq.com>
Barry Margolin <······@alum.mit.edu> writes:

> In article <·············@news.dtpq.com>,
>  ······@news.dtpq.com (Christopher C. Stacy) wrote:
> 
> > Besides, I don't see anyone stopping anyone else from 
> > inventing their own versions of Lisp.
> 
> Nothing's stopping them, but it's effectively discouraged.  If someone 
> tries to promote their own version, many Lisp devotees will point out 
> all the important CL features that it's missing, or explain how one 
> could accomplish the same results using CL, etc.  I think it would be 
> very difficult for a new Lisp dialect to gain a foothold these days.

That's because the Common Lisp community is composed
mainly of people who are trying to solve problems,
and Common Lisp is a good enough base to allow them
to do that.  These people are not very interested in
designing new languages - they already have a language
that is good enough for what they want to do.
Lisp did not evolve in a vacuum - it evolved to suit a
particular (if very general) set of users.   

So even though Lisp obviously can be the base for
something else, nobody has proposed anything else 
of real interest to the user community.

Who is it that would be interested in a new language?

I think this whole thing about Lisp not evolving due
to Common Lisp is largely malarkey.  For one thing,
evolutionary progress did not stop with Common Lisp.
It had already pretty much stopped years before that.  
Common Lisp was about gathering the resulting disarray
of stuff together into something that would suit some
particular technical and political needs.

Nothing prevents people from coming up with ideas
and proposing them.  I just don't see anyone coming
up with very many good new ideas for Lisp; certainly
no well-developed ones.

I don't feel much need for a better language.  
I mean, I can imagine a few things, but I don't personally
have time to work on figuring out whether they are very
good ideas.  I have work to do, and unfortunately it's not
fiddling around with new languages.

The people who are fiddling around with new languages are
mostly not using Lisp, but they haven't shown me anything
terribly interesting, either.   I don't think that the pace
of evolution has anything to do with Common Lisp.

It never had anything to do with Lisp.
It had to do with solving problems, 
and Lisp was merely the vehicle. 
Apparently the complaint is: "Problem solved".
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <krCdnaMFZbNAjyHfRVn-hQ@dls.net>
Sashank Varma wrote:

> This brings up an interesting question: Is the binding constraint of
> the standard, which was critical during the 1980s and 1990s, gonna
> choke the community now that it is again showing signs of growth?

I don't believe it will.  Common Lisp has plenty of room
for extension without doing violence to the current standard,
and the flaws I see in the current standard can be tolerated
or surmounted, IMO.  The bigger question in my mind is whether
the current organization of the lisp community will allow
consensus to be reached on these extensions.

	Paul
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BBCBC2.670717C0@freenet.de>
Sashank Varma schrieb:


> The more general question -- Did standardization produce
> stultification? -- is quite provocative though. 
[...]


> This brings up an interesting question: Is the binding constraint of
> the standard, which was critical during the 1980s and 1990s, gonna
> choke the community now that it is again showing signs of growth?

In my opinion, standards are only good for sellers, not for
developers. It is very difficult to do something new using only
standards. So in my own experiments I throw away almost every kind of
standard. (Sure, some standards remain, like a basic skeleton to
build on.)

The other viewpoint is that non-standard tools are hard to manage.
They need a lot of description of what they are and how to use them,
and most of them are specialized to one case. (Try to teach that, or
search for it on the web, brrr; see the Fortran survival story.)

> Another possibility is that the community
> will split in a healthy way, with business users adhering closely to
> the standard in the interests of portability, and with academics again
> experimenting with new features and birthing new dialects.

Instead of making a new dialect, let a module (interface) come in.
When creating a new dialect, one is forced to recreate all kinds of
existing "standards" too. This is not always a good solution.

It shows the necessity of listening to the users' wishes (which takes
some time) and enabling them to quickly hack up the tool (utility)
they actually need; a balancing act between using one universal tool
or tonnes of toolboxes (sometimes this is called scripting).

...
stefan
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <XENue.13901$XB2.3624265@twister.nyc.rr.com>
Peter Seibel wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
> 
>>Kenny Tilton wrote:
>>
>>>[apologies if this has materialised in similar form or does so soon
>>>unbeknownst to me, but from where I sit it appears Google ate a
>>>similar report posted yesterday via google groups.]
>>>Dr. McCarthy joined with Henry Baker, his predecessor at the
>>>microphone, in bemoaning the standardization of Common Lisp as
>>>stultifying if not mortifying, in that it ended innovation.
>>
>>As much as I like Common Lisp, I think he has a point here.
> 
> 
> As did Baker, or rather a dozen or so good points--

As one who actually writes application Lisp code, no, Baker did not have 
a good point at all. For one thing, he manifested utter ignorance of 
refactoring enhancements in other IDEs. For another, oh gimme a fucking 
break, I really need a refactoring tool when I decide to change a class 
name. (Hint: refactoring is all I do, this is not an issue.)

The incredibly sad thing is that Baker's premise was "why is Lisp so 
unpopular?" As was his predecessor's. When I said, get a grip, Lisp is 
taking off, audience members cried out, WTF are you talking about?

These dinosaurs are cute, but they do not follow cll and they have no 
clue about its recent popularity upsurge.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Espen Vestre
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <kwpsuctc0i.fsf@merced.netfonds.no>
Kenny Tilton <·······@nyc.rr.com> writes:

> These dinosaurs are cute, but they do not follow cll and they have no
> clue about its recent upopularity upsurge.

I realised that this upsurge is real when I went to the lisp meeting
in Amsterdam this spring and compared it to the ELUGM-99, also in
Amsterdam. ELUGM-99 was a three-day event organized by Franz, i.e., you
would think, a much larger event than this year's one-day meeting.
But if I remember right, it had only half the number of attendees
(which we still thought was o.k.). 

(I just discovered that the pictures from ELUGM '99 are still
 on the Franz Web server:
http://www.franz.com/services/conferences_seminars/ELUGM_Pics/)
-- 
  (espen)
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uCUue.26715$IX4.12021@twister.nyc.rr.com>
Espen Vestre wrote:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>These dinosaurs are cute, but they do not follow cll and they have no
>>clue about its recent upopularity upsurge.
> 
> 
> I realised that this upsurge is real when I went to the lisp meeting
> in Amsterdam this spring and compared it to the ELUGM-99, also in
> Amsterdam. ELUGM-99 was a three-day event organized by Franz, ie., you
> would think, a much larger event than this years' one-day meeting.
> But if I remember right, it had only half the number of attendees
> (which we still thought was o.k.). 

yeah, one of the great reasons for going to a conference is being 
surrounded by dozens of people actively using Lisp and the consequent 
morale boost.

The ALU should send an email to all the attendees: "If you do not know 
CL is winning, Kenny says read c.l.l. every day and cheer up."


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <8764w2i55w.fsf@plato.moon.paoloamoroso.it>
Kenny Tilton <·······@nyc.rr.com> writes:

> The ALU should send an email to all the attendees: "If you do not know
> CL is winning, Kenny says read c.l.l. every day and cheer up."

Another useful reading resource:

  Planet Lisp
  http://planet.lisp.org


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87br5w87f7.fsf@plato.moon.paoloamoroso.it>
Peter Seibel <·····@gigamonkeys.com> writes:

> Pascal Costanza <··@p-cos.net> writes:
[...]
>> As much as I like Common Lisp, I think he has a point here.
>
> As did Baker, or rather a dozen or so good points--can't wait until
> his full slidedeck is available.

Any examples?


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Carl Shapiro
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ouyr7es3yah.fsf@panix3.panix.com>
Paolo Amoroso <·······@mclink.it> writes:

> Peter Seibel <·····@gigamonkeys.com> writes:
> 
> > Pascal Costanza <··@p-cos.net> writes:
> [...]
> >> As much as I like Common Lisp, I think he has a point here.
> >
> > As did Baker, or rather a dozen or so good points--can't wait until
> > his full slidedeck is available.
> 
> Any examples?

http://www.international-lisp-conference.org/multimedia/baker-slides.pdf

More material will become available as soon as I collect the slides
and receive permission to distribute it.  Same goes for the audio we
recorded during all of the conference tracks.
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2zmtfg48v.fsf@gigamonkeys.com>
Paolo Amoroso <·······@mclink.it> writes:

> Peter Seibel <·····@gigamonkeys.com> writes:
>
>> Pascal Costanza <··@p-cos.net> writes:
> [...]
>>> As much as I like Common Lisp, I think he has a point here.
>>
>> As did Baker, or rather a dozen or so good points--can't wait until
>> his full slidedeck is available.
>
> Any examples?

So despite Kenny's scoffing I haven't yet seen an environment that
makes it as easy (and accurate) as it ought to be to change the names
of things. Apparently the refactoring browser for Smalltalk was pretty
good but I've never actually seen it in action. And some of the newer
Java tools such as Eclipse can sort of do it but too slowly and not
accurately enough in my brief (and recent) experience.

I also have been thinking along similar lines as Baker about the need
to integrate tests with the code they test, and I'm not opposed to the
idea of the language environment using type checking to help me find
bugs in my code as long as it's like a butler rather than a
dominatrix--who was it (Baker?) who drew the distinction between
"strong" and "strong-armed" type checking?

I also like the idea of being able to inline and uninline functions
easily.

That said, I'm not sure I understood what Baker's problem with dynamic
variables was. They're definitely on my list of "important Lisp
features".

Okay, so that wasn't really a dozen. But there are a bunch of things on
his other slides that are at least food for thought.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1738517.UFfRr06LEb@yahoo.com>
Peter Seibel wrote:

> That said, I'm not sure I understood what Baker's problem with dynamic
> variables was. They're definitely on my list of "important Lisp
> features"

As Paolo would put it, "you may learn Scheme" :-)
From: Jens Axel Søgaard
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42bc405d$0$215$edfadb0f@dread12.news.tele.dk>
Peter Seibel wrote:

> So despite Kenny's scoffing I haven't yet seen an environment that
> makes it as easy (and accurate) as it ought to be to change the names
> of things. 

Do you know the syntax-checker tool in DrScheme?

When you click the "syntax-check" button the tool internally
expands the source code and figures out which variables are
bound where.

Now moving the mouse over a binding instance of a variable will
overlay arrows to the bound instances. Right clicking in the
binding instance reveals a rename menu item that will rename
the binding instance and all bound instances.

The third screenshot of

<http://www.plt-scheme.org/software/drscheme/tour/tour-Z-H-12.html>

shows the overlayed arrows.

For better screen shots see:

<http://www.ccs.neu.edu/scheme/pubs/jfp01-fcffksf.ps>

-- 
Jens Axel Søgaard
From: Peter Seibel
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m2mzpffyuz.fsf@gigamonkeys.com>
Jens Axel Søgaard <······@soegaard.net> writes:

> Peter Seibel wrote:
>
>> So despite Kenny's scoffing I haven't yet seen an environment that
>> makes it as easy (and accurate) as it ought to be to change the names
>> of things. 
>
> Do you know the syntax-checker tool in DrScheme?
>
> When you click the "syntax-check" button the tool internally
> expands the source code and figures out with variables is
> bound where.
>
> Now moving the mouse over a binding instance of variable will
> overlay arrows to the bound instances. Right clicking in the
> binding instance reveals a rename menu item that will rename
> the binding instance and all bound instances.

I have to admit that I've installed PLT Scheme several times and poked
at it a bit but never really got deeply into it. I'll move it up a bit
on my list of things to check out. Thanks.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Jens Axel Søgaard
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42bc4612$0$305$edfadb0f@dread12.news.tele.dk>
Jens Axel Søgaard wrote:
> When you click the "syntax-check" button the tool internally
> expands the source code and figures out with variables is
> bound where.
> 
> Now moving the mouse over a binding instance of variable will
> overlay arrows to the bound instances. Right clicking in the
> binding instance reveals a rename menu item that will rename
> the binding instance and all bound instances.

I forgot:

Moving the mouse over a bound instance will make the IDE
overlay an arrow to the binding instance. Right clicking
will give you an option to "jump to binding instance".
This is quite handy when the binding instance isn't nearby,
say at the top of the source or in another module.

-- 
Jens Axel Søgaard
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ZwNue.13900$XB2.3623717@twister.nyc.rr.com>
Pascal Costanza wrote:

> Kenny Tilton wrote:
> 
>> [apologies if this has materialised in similar form or does so soon 
>> unbeknownst to me, but from where I sit it appears Google ate a 
>> similar report posted yesterday via google groups.]
>>
>> Dr. McCarthy joined with Henry Baker, his predecessor at the 
>> microphone, in bemoaning the standardization of Common Lisp as 
>> stultifying if not mortifying, in that it ended innovation.
> 
> 
> As much as I like Common Lisp, I think he has a point here.

Please get back to us when you have some application functionality you 
cannot express in Common Lisp. As much as you think you like CL....

....PWUAAAAHAHAHAHAHAHAHAHAHAHH!!!!!!!

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i1p2dFjdnjsU1@individual.net>
Kenny Tilton wrote:
> 
> Pascal Costanza wrote:
> 
>> Kenny Tilton wrote:
>>
>>> [apologies if this has materialised in similar form or does so soon 
>>> unbeknownst to me, but from where I sit it appears Google ate a 
>>> similar report posted yesterday via google groups.]
>>>
>>> Dr. McCarthy joined with Henry Baker, his predecessor at the 
>>> microphone, in bemoaning the standardization of Common Lisp as 
>>> stultifying if not mortifying, in that it ended innovation.
>>
>> As much as I like Common Lisp, I think he has a point here.
> 
> Please get back to us when you have some application functionality you 
> cannot express in Common Lisp.

...only after you have made sure that you're not implicitly using a 
Turing equivalence argument here. ;-P



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <wxUue.26714$IX4.23506@twister.nyc.rr.com>
Pascal Costanza wrote:

> Kenny Tilton wrote:
> 
>>
>> Pascal Costanza wrote:
>>
>>> Kenny Tilton wrote:
>>>
>>>> [apologies if this has materialised in similar form or does so soon 
>>>> unbeknownst to me, but from where I sit it appears Google ate a 
>>>> similar report posted yesterday via google groups.]
>>>>
>>>> Dr. McCarthy joined with Henry Baker, his predecessor at the 
>>>> microphone, in bemoaning the standardization of Common Lisp as 
>>>> stultifying if not mortifying, in that it ended innovation.
>>>
>>>
>>> As much as I like Common Lisp, I think he has a point here.
>>
>>
>> Please get back to us when you have some application functionality you 
>> cannot express in Common Lisp.
> 
> 
> ...only after you have made sure that you're not implicitly using a 
> Turing equivalence argument here. ;-P

Nope. Fire away. But I /am/ armed with DEFMACRO, so get those shields up. :)

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i2uk0Fjp8nkU1@individual.net>
Kenny Tilton wrote:
> 
> Pascal Costanza wrote:
> 
>> Kenny Tilton wrote:
>>>
>>> Please get back to us when you have some application functionality 
>>> you cannot express in Common Lisp.
>>
>> ...only after you have made sure that you're not implicitly using a 
>> Turing equivalence argument here. ;-P
> 
> Nope. Fire away. But I /am/ armed with DEFMACRO, so get those shields 
> up. :)

Oh, so you _are_ using a Turing equivalence argument. ;)

To put it like this: I am interested in the kinds of macros that no one 
has written yet.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <POrve.2068$p%3.14799@typhoon.sonic.net>
Pascal Costanza wrote:

> To put it like this: I am interested in the kinds of macros that noone 
> has written yet.

Really?

Okay...  indulge me for a minute here then.  Let's say you
have a lisp with lazy semantics (promises get passed as arguments
instead of computed values).

But, in this particular lisp, promises are transparent objects.
You can use accessors to extract the form to be evaluated or
the environment it's to be evaluated in, substitute different
forms, or even substitute a different environment, before forcing
the promise.  Alternatively, you can just *not* force the promise,
and instead use an explicit call to eval, with an environment as
a second argument.

Now, I claim that you can model any macrology that any lisp
has ever had, in terms of first-class lambda functions, in
this Lisp dialect. They can be stored in structures, returned
from functions, used with map and apply, and so on.

Do you find this interesting?

(And before you ask, yes, the performance hit is awful....)
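
A rough sketch of the idea in ordinary Common Lisp -- with a toy
evaluator and alist environments standing in for the real thing, since
portable CL's EVAL takes no environment argument, and all names here
are invented:

```lisp
;; A "transparent promise": the form and environment are plain,
;; inspectable slots, so callers may rewrite either before forcing.
(defstruct promise form env)

(defun toy-eval (form env)
  "Minimal evaluator over alist environments (variables and calls only)."
  (cond ((symbolp form) (cdr (assoc form env)))
        ((atom form) form)
        (t (apply (symbol-function (first form))
                  (mapcar (lambda (f) (toy-eval f env)) (rest form))))))

(defun force-promise (p)
  (toy-eval (promise-form p) (promise-env p)))

;; A caller can substitute a different environment before forcing:
;; (let ((p (make-promise :form '(+ x 1) :env '((x . 1)))))
;;   (setf (promise-env p) '((x . 41)))
;;   (force-promise p))                     => 42
```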

				Bear
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i8kcoFkf74uU1@individual.net>
Ray Dillinger wrote:
> Pascal Costanza wrote:
> 
>> To put it like this: I am interested in the kinds of macros that noone 
>> has written yet.
> 
> Really?
> 
> Okay...  indulge me for a minute here then.  Let's say you
> have a lisp with lazy semantics (promises get passed as arguments
> instead of computed values).
> 
> But, in this particular lisp, promises are transparent objects.
> You can use accessors to extract the form to be evaluated or
> the environment it's to be evaluated in, substitute different
> forms, or even substitute a different environment, before forcing
> the promise.  Alternatively, you can just *not* force the promise,
> and instead use an explicit call to eval, with an environment as
> a second function.
> 
> Now, I claim that you can model any macrology that any lisp
> has ever had, in terms of first-class lambda functions, in
> this Lisp dialect. They can be stored in structures, returned
> from functions, used with map and apply, and etc.
> 
> Do you find this interesting?

Yes, I do, but that is not what I am talking about. I said "kinds of 
macros", not "kinds of macro systems".

I know that you are working on such a beast. Have you already taken a 
look at 3-Lisp? You definitely should...


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <h8Uve.2192$p%3.15028@typhoon.sonic.net>
Pascal Costanza wrote:
> Ray Dillinger wrote:
> 
>> Pascal Costanza wrote:
>>
>>> To put it like this: I am interested in the kinds of macros that 
>>> noone has written yet.
 >>
>> <clip>
>> Now, I claim that you can model any macrology that any lisp
>> has ever had, in terms of first-class lambda functions, in
>> this Lisp dialect. They can be stored in structures, returned
>> from functions, used with map and apply, and etc.
>>
>> Do you find this interesting?
> 
> 
> Yes, I do, but that is not what I am talking about. I said "kinds of 
> macros", not "kinds of macro systems".

Hmmm. Okay, can you give an example?  As far as I can
tell, Lispy macros give pretty much free rein to rewrite
S-expressions into whatever other S-expressions you like.
It's hard to imagine an interesting macro that is of a
kind no one has written yet.

> I know that you are working on such a beast. Have you already taken a 
> look at 3-Lisp? You definitely should...

That is, actually, very interesting.  It looks a lot like
how what I've been working on is, um, *starting* to look.
I wouldn't have seen the similarity when I began, but I
feel like I've been sort of propelled along this road of
design inevitability ... and 3-lisp looks a lot like it's
in the neighborhood of where I'm at now.

My toy lisp has gone through a bunch of stages, where things
got more complicated, and then simpler.  Different classes
of functions were introduced, and then several kinds of
limited syntax transforming "wrappers" around functions,
then a standard calling convention so code didn't have to
know what function class it was calling, and then lightning
struck and I realized that there was a general implementation
that covered all these entities and made them just functions;
they had wound up having the same representation and
implementation under the hood, so I could simplify the
language by just having a more general lambda form.

It's interesting how it's evolved over time.  Being elbows-
deep in the actual implementation guts keeps making me think
about things, and using the results (I'm hacking a roguelike
game as a sort of language testbed) makes me think about
what's still wrong or clumsy to express, and so I implement
something for that corner case, then I realize that something
I've implemented to cover a "corner case" actually covers a
more general class of cases, and use it to replace the
original code giving a change in semantics....

It keeps surprising me.  It's very cool, in some ways, to
be finding that more general constructs that are easier to
use are usually also conceptually simpler, and while
sometimes harder to implement, usually eliminate the need
for implementing a bunch of other things.  To see it getting
more powerful as it gets simpler makes me feel strongly
that I am on a "right track."

But it's not looking much like Scheme or CL anymore.

				Bear
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ib0cmFk7mv0U1@individual.net>
Ray Dillinger wrote:
> Pascal Costanza wrote:
> 
>> Ray Dillinger wrote:
>>
>>> Pascal Costanza wrote:
>>>
>>>> To put it like this: I am interested in the kinds of macros that 
>>>> noone has written yet.
> 
>>> <clip>
>>> Now, I claim that you can model any macrology that any lisp
>>> has ever had, in terms of first-class lambda functions, in
>>> this Lisp dialect. They can be stored in structures, returned
>>> from functions, used with map and apply, and etc.
>>>
>>> Do you find this interesting?
>>
>> Yes, I do, but that is not what I am talking about. I said "kinds of 
>> macros", not "kinds of macro systems".
> 
> Hmmm. Okay, can you give an example?  As far as I can
> tell, Lispy macros give pretty much free rein to rewrite
> S-expressions into whatever other S-expressions you like.
> It's hard to imagine an interesting macro that is of a
> kind no one has written yet.

Take a look at my work on dynamically scoped functions, or my ILC 2005 
talk about special classes with dynamically rebindable slots. I haven't 
found related work that does something similar to the former, and the 
latter reportedly exists for Lisp Machines but has been claimed to be 
unimplementable on mainstream CPUs - I have been able to implement this 
in portable Common Lisp (+ MOP). These are admittedly very small steps, 
but that's the kind of language research I happen to be interested in 
and what I think of when I talk about "macros that haven't been 
implemented yet". Note that this was a half-joking attempt to respond to 
Kenny who seems to think that macros already provide the answer to 
everything. This may be true in a certain sense, but doesn't tell us 
much about _how_ to achieve _specific_ functionality.
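
To give a flavor of the former, the core mechanism of rebinding a
function for a dynamic extent can be approximated in a few lines. This
is only a sketch with a made-up name, not the actual published
implementation, and it ignores threads:

```lisp
;; Rebind the global function NAME for the dynamic extent of BODY,
;; restoring the previous definition on any exit (unwind-protect).
(defmacro dynamic-flet ((name lambda-list &body fbody) &body body)
  (let ((old (gensym "OLD-")))
    `(let ((,old (symbol-function ',name)))
       (unwind-protect
            (progn (setf (symbol-function ',name)
                         (lambda ,lambda-list ,@fbody))
                   ,@body)
         (setf (symbol-function ',name) ,old)))))

;; (defun greet () "hello")
;; (dynamic-flet (greet () "hi")
;;   (greet))          ; => "hi"; callers outside the extent see "hello"
```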

I believe there are more things that can (and need to) be done, also in 
other directions.

A further comment: A macro is a local source-to-source transformation, 
which means that global issues cannot be taken into account, which can 
be a limitation. (One attempt to break out of these limitations was 
"Macros that reach out and touch somewhere" - see 
http://www2.parc.com/csl/groups/sda/publications/papers/Kiczales-Macros91/for-web.pdf 
)

>> I know that you are working on such a beast. Have you already taken a 
>> look at 3-Lisp? You definitely should...
> 
> That is, actually, very interesting.  It looks a lot like
> how what I've been working on is, um, *starting* to look.
> I wouldn't have seen the similarity when I began, but I
> feel like I've been sort of propelled along this road of
> design inevitability ... and 3-lisp looks a lot like it's
> in the neighborhood of where I'm at now.

I hope you'll keep us informed about the progress you make...


Cheers,
Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <fyv3k31y.fsf@ccs.neu.edu>
Pascal Costanza <··@p-cos.net> writes:

> Take a look at my work on dynamically scoped functions, or my ILC 2005
> talk about special classes with dynamically rebindable slots. I
> haven't found related work that does something similar like the
> former, and the latter reportedly exists for Lisp Machines but has
> been claimed to be unimplementable on mainstream CPUs - I have been
> able to implement this is portable Common Lisp (+ MOP). 

The Lisp Machine used `external value cell pointers' which were a kind
of bindable invisible pointer.  The bound slot would contain one of
these objects and when the slot was referenced, the microcode would
trap and dereference the actual value cell.

The pointer was invisible for the purposes of SETF and simple reading,
but was opaque (a normal value) for the purposes of winding and
unwinding the special PDL (the per-thread special variable stack).

It wasn't that this was thought to be unimplementable on stock
hardware, but rather that supporting invisible pointers in general
would kill performance because every read would need a conditional
branch on the pointer type.
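
Done in software rather than microcode, the cost he describes looks
something like this (a sketch with invented names, not LispM code):

```lisp
;; An EVCP modelled in software: a stored value is either a plain
;; datum or an EVCP pointing at a shared value cell.
(defstruct evcp value-cell)

(defun deref (stored)
  "Follow an invisible pointer if one is present.  The EVCP-P type
test is the conditional branch that would burden *every* read on
stock hardware."
  (if (evcp-p stored)
      (car (evcp-value-cell stored))
      stored))

;; (let ((cell (list 42)))
;;   (deref (make-evcp :value-cell cell)))   => 42
;; (deref 7)                                 => 7
```

Rebinding a slot then amounts to swapping which value cell the EVCP
points at, exactly once per bind/unbind, while every read pays the
type test.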
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ib86bFkqom4U1@individual.net>
Joe Marshall wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
> 
>>Take a look at my work on dynamically scoped functions, or my ILC 2005
>>talk about special classes with dynamically rebindable slots. I
>>haven't found related work that does something similar like the
>>former, and the latter reportedly exists for Lisp Machines but has
>>been claimed to be unimplementable on mainstream CPUs - I have been
>>able to implement this is portable Common Lisp (+ MOP). 
> 
> 
> The Lisp Machine used `external value cell pointers' which were a kind
> of bindable invisible pointer.  The bound slot would contain one of
> these objects and when the slot was referenced, the microcode would
> trap and dereference the actual value cell.
> 
> The pointer was invisible for the purposes of SETF and simple reading,
> but was opaque (a normal value) for the purposes of winding and
> unwinding the special PDL (the per-thread special variable stack).

Thanks for the comment. Do you happen to know of any citable 
publication where this is described? I would really like to refer 
to some description of this in my papers.

> It wasn't that this was thought to be unimplementable on stock
> hardware, but rather that supporting invisible pointers in general
> would kill performance because every read would need a conditional
> branch on the pointer type.

Hm, I checked again and it was indeed Tim Bradshaw who claimed several 
times that LETF is not implementable. See for example 
http://groups-beta.google.com/group/comp.lang.lisp/msg/d0328b4e183cc55a?hl=en 
- he made stronger statements in other postings.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Cameron MacKinnon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <fOGdnbtHHv746l3fRVn-jQ@rogers.com>
Pascal Costanza wrote:
> Joe Marshall wrote:
> 
>> It wasn't that this was thought to be unimplementable on stock 
>> hardware, but rather that supporting invisible pointers in general 
>> would kill performance because every read would need a conditional 
>> branch on the pointer type.
> 
> 
> Hm, I checked again and it was indeed Tim Bradshaw who claimed
> several times that LETF is not implementable. See for example 
> http://groups-beta.google.com/group/comp.lang.lisp/msg/d0328b4e183cc55a?hl=en
>  - he made stronger statements in other postings.

If you can segregate all the invisible pointers into one continuous area
of memory, one strategy is to leave that area unmapped (and put the
actual forwarding addresses elsewhere...)

Normal pointers work as before, but dereferencing a forwarding pointer
causes a virtual memory fault, and you handle the fixup in the fault
handler. Doing that, you get rid of the conditional branch with "every
read" but the cost of handling forwarding pointers is likely 100-200
cycles per. This works if the majority of pointers are 'normal' and you
can afford the performance hit for relatively infrequent forwarding
pointer case.
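
[In Python-flavored terms, the scheme looks roughly like this. A real implementation would mprotect() the forwarding area and do the fixup in a SIGSEGV handler; here an exception stands in for the page fault, and the names (Normal, Forward, read) are invented for illustration.]

```python
# Rough analogue of the fault-handler scheme: normal pointers take the
# fast path with no pointer-type test; forwarding pointers 'fault' and
# are fixed up out of line.

class Normal:
    """A pointer into ordinary, mapped memory."""
    def __init__(self, value):
        self.value = value
    def deref(self):
        return self.value            # fast path: no pointer-type test

class Forward:
    """A forwarding pointer living in the unmapped area."""
    def __init__(self, target):
        self.target = target         # the real address is kept elsewhere
    def deref(self):
        raise MemoryError            # touching it 'faults'

def read(ptr, forwarding_table):
    try:
        return ptr.deref()           # normal reads pay no branch
    except MemoryError:              # the 'fault handler' does the fixup,
        return forwarding_table[ptr.target]  # at ~100-200 cycles a fault

table = {"cell-7": 99}
assert read(Normal(5), table) == 5           # ordinary read
assert read(Forward("cell-7"), table) == 99  # faulting read, fixed up
```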

Could this work for you?

-- 
Cameron MacKinnon
Toronto, Canada
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ic3fqFk7s1cU1@individual.net>
Cameron MacKinnon wrote:
> Pascal Costanza wrote:
> 
>> Joe Marshall wrote:
>>
>>> It wasn't that this was thought to be unimplementable on stock 
>>> hardware, but rather that supporting invisible pointers in general 
>>> would kill performance because every read would need a conditional 
>>> branch on the pointer type.
>>
>> Hm, I checked again and it was indeed Tim Bradshaw who claimed
>> several times that LETF is not implementable. See for example 
>> http://groups-beta.google.com/group/comp.lang.lisp/msg/d0328b4e183cc55a?hl=en 
>>
>>  - he made stronger statements in other postings.
> 
> If you can segregate all the invisible pointers into one continuous area
> of memory, one strategy is to leave that area unmapped (and put the
> actual forwarding addresses elsewhere...)
> 
> Normal pointers work as before, but dereferencing a forwarding pointer
> causes a virtual memory fault, and you handle the fixup in the fault
> handler. Doing that, you get rid of the conditional branch with "every
> read" but the cost of handling forwarding pointers is likely 100-200
> cycles per. This works if the majority of pointers are 'normal' and you
> can afford the performance hit for relatively infrequent forwarding
> pointer case.
> 
> Could this work for you?

Maybe, I don't know. This sounds very low-level. I already have a 
portable implementation.

Thanks anyway.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <y88ufrn8.fsf@comcast.net>
Cameron MacKinnon <··········@clearspot.net> writes:

> Pascal Costanza wrote:
>> Joe Marshall wrote:
>>
>>> It wasn't that this was thought to be unimplementable on stock
>>> hardware, but rather that supporting invisible pointers in general
>>> would kill performance because every read would need a conditional
>>> branch on the pointer type.
>> Hm, I checked again and it was indeed Tim Bradshaw who claimed
>> several times that LETF is not implementable. See for example
>> http://groups-beta.google.com/group/comp.lang.lisp/msg/d0328b4e183cc55a?hl=en
>>  - he made stronger statements in other postings.
>
> If you can segregate all the invisible pointers into one continuous area
> of memory, one strategy is to leave that area unmapped (and put the
> actual forwarding addresses elsewhere...)
>
> Normal pointers work as before, but dereferencing a forwarding pointer
> causes a virtual memory fault, and you handle the fixup in the fault
> handler. Doing that, you get rid of the conditional branch with "every
> read" but the cost of handling forwarding pointers is likely 100-200
> cycles per. This works if the majority of pointers are 'normal' and you
> can afford the performance hit for relatively infrequent forwarding
> pointer case.
>
> Could this work for you?

The external value cell pointers were `snapped' when they were read,
not when they were dereferenced.  You could delay snapping them, but
then you would complicate EQ.

-- 
~jrm
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3br2h69w.fsf@comcast.net>
Pascal Costanza <··@p-cos.net> writes:

> Joe Marshall wrote:
>> Pascal Costanza <··@p-cos.net> writes:
>>
>>>Take a look at my work on dynamically scoped functions, or my ILC 2005
>>>talk about special classes with dynamically rebindable slots. I
>>>haven't found related work that does something similar like the
>>>former, and the latter reportedly exists for Lisp Machines but has
>>>been claimed to be unimplementable on mainstream CPUs - I have been
>>> able to implement this is portable Common Lisp (+ MOP).
>> The Lisp Machine used `external value cell pointers' which were a
>> kind
>> of bindable invisible pointer.  The bound slot would contain one of
>> these objects and when the slot was referenced, the microcode would
>> trap and dereference the actual value cell.
>> The pointer was invisible for the purposes of SETF and simple
>> reading,
>> but was opaque (a normal value) for the purposes of winding and
>> unwinding the special PDL (the per-thread special variable stack).
>
> Thanks for the comment. Do you happen to know about any citable
> publication where this is described? I would like really like to refer
> to some description of this in my papers.

There is a brief mention of them here:

  http://www.dridus.com/~nyef/lispm/ssdn2/sect12.html


-- 
~jrm
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <871x6hf01s.fsf@qrnik.zagroda>
Pascal Costanza <··@p-cos.net> writes:

> Hm, I checked again and it was indeed Tim Bradshaw who claimed
> several times that LETF is not implementable.

It's unimplementable for arbitrary places in the presence of threads
(a given container would need to appear to have different elements in
different threads), but it makes sense when the place is "special".

I never liked that CL uses the same "let" construct for two completely
different operations: creating a new variable, and rebinding an
existing special variable.


Summary: CL places are only readable and settable; local rebinding is
a third operation which may be supported by particular kinds of places.


My language Kogut has various kinds of variables, the most important
being:
- constants (you can read them)
- plain mutable variables (you can read them and set them)
- dynamically scoped variables (you can read them, set them,
  and rebind them locally)

Local rebinding creates a new binding of a variable for the current
thread, which is then read and set independently from other bindings -
like in CL (except that it's inherited by created threads).

Any kind of variable may be used in any scope: global, local, or
object field. So you can have dynamically scoped object fields, for 
example; their semantics is obvious.
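
[In Python terms, a dynamically scoped variable of this kind can be sketched roughly as below. The API is invented, not Kogut's, and Kogut's inheritance of bindings by newly created threads is omitted for brevity.]

```python
import threading
from contextlib import contextmanager

class DynVar:
    """Sketch of a dynamically scoped variable: a per-thread stack of
    bindings.  get/set act on the innermost binding; rebind pushes a
    fresh binding for the current thread and pops it on scope exit."""
    def __init__(self, value):
        self._default = value
        self._tls = threading.local()

    def _stack(self):
        if not hasattr(self._tls, "stack"):
            self._tls.stack = [self._default]
        return self._tls.stack

    def get(self):
        return self._stack()[-1]

    def set(self, value):
        self._stack()[-1] = value      # affects the innermost binding only

    @contextmanager
    def rebind(self, value):
        stack = self._stack()
        stack.append(value)            # new binding for this thread
        try:
            yield
        finally:
            stack.pop()                # restored when the scope exits

x = DynVar(1)
with x.rebind(2):
    x.set(3)             # sets only the innermost binding
    assert x.get() == 3
assert x.get() == 1      # the outer binding was never touched
```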

Elements of mutable collections behave like mutable variables, because
collections generally only provide reading and setting elements (and
their implementations are suited only for that).

But variables are first-class objects, so you could also define
collections with dynamically scoped elements - by making a collection
*of* dynamically scoped variables, and redirecting element access
to variable contents. These would be separate kinds of collections;
you can't locally rebind a plain mutable variable nor an element of a
regular collection. I don't see a utility of such collections though.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uirzson4l.fsf@nhplace.com>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> Pascal Costanza <··@p-cos.net> writes:
> 
> > Hm, I checked again and it was indeed Tim Bradshaw who claimed
> > several times that LETF is not implementable.
> 
> It's unimplementable for arbitrary places in the presence of threads
> (a given container would need to appear to have different elements in
> different threads), but it makes sense when the place is "special".

But the place this always comes up is in CLIM and the reason is that you
need LETF to assure that the "pen state" (usually an operating system
resource) and other such non-special-variable things are kept in proper
state per-process.

> I never liked that CL uses the same "let" construct for two completely
> different operations: creating a new variable, and rebinding an
> existing special variable.

I'm a little mixed on this.  There were strong compatibility reasons for
doing it the way we did it because LOTS of legacy code uses the old way
of doing this.  But mostly I agree that having a separate construct would
be useful.

FWIW, ISLISP [see islisp.info] uses DYNAMIC-LET.  And it's not hard to 
implement this in CL.

 (defmacro dynamic-let (bindings &body forms)
   `(let ,bindings
      (declare (special ;just in case not globally special
                 ,@(mapcar #'car bindings))) 
      ,@forms))

Of course, you can't force others to use it.  But you can adopt your own
personal programming style of using it to make special bindings more
visible to the human reader of your code:  e.g.,

  (DYNAMIC-LET ((X 3) (Y 4))
    (EVAL '(+ X Y)))
  => 7

> My language Kogut has various kinds of variables, the most important
> being:
> - constants (you can read them)

Of curiosity, mostly for my own personal mental statistics-taking,
can you bind them?  In ISLISP, you can bind them, in CL you cannot.
e.g., ISLISP lets you do

 (defconstant pie 3)
 (let ((pie 4)) (* 2 pie)) => 8 

I personally think this is a choice that language designers are forced
to make in a mostly arbitrary way, since there are arguments on both
sides of this one.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <upsu03j78.fsf@news.dtpq.com>
Kent M Pitman <······@nhplace.com> writes:

> Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:
> 
> > Pascal Costanza <··@p-cos.net> writes:
> > 
> > > Hm, I checked again and it was indeed Tim Bradshaw who claimed
> > > several times that LETF is not implementable.
> > 
> > It's unimplementable for arbitrary places in the presence of threads
> > (a given container would need to appear to have different elements in
> > different threads), but it makes sense when the place is "special".
> 
> But the place this always comes up is in CLIM and the reason is that you
> need LETF to assure that the "pen state" (usually an operating system
> resource) and other such non-special-variable things are kept in proper
> state per-process.
> 
> > I never liked that CL uses the same "let" construct for two completely
> > different operations: creating a new variable, and rebinding an
> > existing special variable.

*LET*
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uhdfc3ioo.fsf@news.dtpq.com>
······@news.dtpq.com (Christopher C. Stacy) writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> > Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:
> > 
> > > Pascal Costanza <··@p-cos.net> writes:
> > > 
> > > > Hm, I checked again and it was indeed Tim Bradshaw who claimed
> > > > several times that LETF is not implementable.
> > > 
> > > It's unimplementable for arbitrary places in the presence of threads
> > > (a given container would need to appear to have different elements in
> > > different threads), but it makes sense when the place is "special".
> > 
> > But the place this always comes up is in CLIM and the reason is that you
> > need LETF to assure that the "pen state" (usually an operating system
> > resource) and other such non-special-variable things are kept in proper
> > state per-process.
> > 
> > > I never liked that CL uses the same "let" construct for two completely
> > > different operations: creating a new variable, and rebinding an
> > > existing special variable.
> 
> *LET*

For those with newsreaders who change the literal text
into something else for display purposes, that joke was:

{asterisk}LET{asterisk}
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uk6k8rdui.fsf@nhplace.com>
······@news.dtpq.com (Christopher C. Stacy) writes:

> ······@news.dtpq.com (Christopher C. Stacy) writes:
 
[...]

> > *LET*
> 
> For those with newsreaders who change the literal text
> into something else for display purposes, that joke was:
> 
> {asterisk}LET{asterisk}

For those with such newsreaders, more generally:

 Disable the autobold feature, or expect that the many suggestions
 you get here about how to fix your code that uses special variables
 will never seem to work.
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87acl4lrd3.fsf@qrnik.zagroda>
Kent M Pitman <······@nhplace.com> writes:

>> My language Kogut has various kinds of variables, the most important
>> being:
>> - constants (you can read them)
>
> Of curiosity, mostly for my own personal mental statistics-taking,
> can you bind them?  In ISLISP, you can bind them, in CL you cannot.
> e.g., ISLISP lets you do
>
>  (defconstant pie 3)
>  (let ((pie 4)) (* 2 pie)) => 8 

A local variable completely shadows any other meaning of the name.
So yes, because this is a completely independent variable. You can
shadow a constant with a mutable variable, or vice versa.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3iodmhFmfnt9U1@individual.net>
Marcin 'Qrczak' Kowalczyk wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>Hm, I checked again and it was indeed Tim Bradshaw who claimed
>>several times that LETF is not implementable.
> 
> It's unimplementable for arbitrary places in the presence of threads
> (a given container would need to appear to have different elements in
> different threads), but it makes sense when the place is "special".

Sorry to disappoint you, but I have implemented it in portable Common 
Lisp + MOP.

> I never liked that CL uses the same "let" construct for two completely
> different operations: creating a new variable, and rebinding an
> existing special variable.

Correct, it is suboptimal that LET is used for both. ISLISP has a better 
design in this regard: there, special variables must be bound with 
dynamic-let instead of let.

> Summary: CL places are only readable and settable; local rebinding is
> a third operation which may be supported by particular kinds of places.
> 
> My language Kogut has various kinds of variables, the most important
> being:
> - constants (you can read them)
> - plain mutable variables (you can read them and set them)
> - dynamically scoped variables (you can read them, set them,
>   and rebind them locally)

I think it could be useful to have dynamically scoped variables that you 
can read and rebind, but not set. This may allow some optimizations. 
However, I haven't thought this through yet.

[...]

> But variables are first-class objects, so you could also define
> collections with dynamically scoped elements - by making a collection
> *of* dynamically scoped variables, and redirecting element access
> to variable contents. These would be separate kinds of collections;
> you can't locally rebind a plain mutable variable nor an element of a
> regular collection. I don't see a utility of such collections though.

In Common Lisp, global variables are also first class - they are just 
symbols. PROGV allows you to programmatically rebind them. This is what 
I use in my DLETF implementation, and although ANSI Common Lisp doesn't 
specify PROGV in detail, all Common Lisp implementations I have checked 
do what is necessary to turn (gensymmed) symbols into dynamically 
rebindable places.
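
[The effect can be sketched in Python as follows. A thread-local binding stack stands in for the gensymmed symbol that PROGV rebinds, and all names here (SpecialSlot, dletf, Person) are invented for illustration; this is not the actual DLETF implementation.]

```python
import threading
from contextlib import contextmanager

class SpecialSlot:
    """Backs one 'special' slot of one instance, standing in for a
    gensymmed symbol that PROGV would rebind per thread."""
    def __init__(self, value):
        self._base = value               # globally visible value
        self._tls = threading.local()    # per-thread rebinding stack

    def get(self):
        stack = getattr(self._tls, "stack", [])
        return stack[-1] if stack else self._base

@contextmanager
def dletf(slot, value):
    if not hasattr(slot._tls, "stack"):
        slot._tls.stack = []
    slot._tls.stack.append(value)        # rebind in this thread only
    try:
        yield
    finally:
        slot._tls.stack.pop()            # unwound on exit

class Person:
    def __init__(self, name):
        self.name_slot = SpecialSlot(name)   # the ':special t' slot

p = Person("Dr. Jekyll")
seen = {}
def peek():
    seen["other"] = p.name_slot.get()    # runs in another thread

with dletf(p.name_slot, "Mr. Hyde"):
    t = threading.Thread(target=peek); t.start(); t.join()
    assert p.name_slot.get() == "Mr. Hyde"   # rebound here...
assert seen["other"] == "Dr. Jekyll"         # ...but not over there
assert p.name_slot.get() == "Dr. Jekyll"     # and restored on exit
```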

For further details see my ILC05 paper.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <LrWdnShBHJ43nFrfRVn-gA@dls.net>
Pascal Costanza wrote:
> Marcin 'Qrczak' Kowalczyk wrote:

>> It's unimplementable for arbitrary places in the presence of threads
>> (a given container would need to appear to have different elements in
>> different threads), but it makes sense when the place is "special".
> 
> 
> Sorry to disappoint you, but I have implemented it in portable Common 
> Lisp + MOP.

And it was a nice trick!

	Paul
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87hdfcre2k.fsf@qrnik.zagroda>
Pascal Costanza <··@p-cos.net> writes:

>> It's unimplementable for arbitrary places in the presence of threads
>> (a given container would need to appear to have different elements in
>> different threads), but it makes sense when the place is "special".
>
> Sorry to disappoint you, but I have implemented it in portable Common
> Lisp + MOP.

I guess the semantics is different than I would expect, because I'm
pretty sure that the semantics I consider natural is unimplementable
in CL with threads (without threads you could temporarily replace the
value). Where can I read about this?

> I think it could be useful to have dynamically scoped variables
> that you can read and rebind, but not set. This may allow some
> optimizations. However, I haven't thought this through yet.

Indeed, they would allow the compiler to cache the value when the
variable is read several times in the same scope.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <X_2dnVn8Y8wZvFrfRVn-jw@dls.net>
Marcin 'Qrczak' Kowalczyk wrote:

> I guess the semantics is different than I would expect, because I'm
> pretty sure that the semantics I consider natural is unimplementable
> in CL with threads (without threads you could temporarily replace the
> value). Where can I read about this?

The trick is to associate the slot with a symbol (a different
symbol for each instance of the object), then use
PROGV to dynamically bind to the symbol.

	Paul
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <8764vslr97.fsf@qrnik.zagroda>
"Paul F. Dietz" <·····@dls.net> writes:

> Marcin 'Qrczak' Kowalczyk wrote:
>
>> I guess the semantics is different than I would expect, because I'm
>> pretty sure that the semantics I consider natural is unimplementable
>> in CL with threads (without threads you could temporarily replace the
>> value). Where can I read about this?
>
> The trick is to associate the slot with a symbol (a different symbol
> for each instance of the object), then use PROGV to dynamically bind
> to the symbol.

This doesn't work for an arbitrary place, only for slots you define.
What about local rebinding of (car c)?

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <VdudnearYvikuVrfRVn-pQ@dls.net>
Marcin 'Qrczak' Kowalczyk wrote:

> This doesn't work for an arbitrary place, only for slots you define.
> What about local rebinding of (car c)?

I think we have a misunderstanding here about what was being claimed.

	Paul
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3iouv5FmhfraU1@individual.net>
Marcin 'Qrczak' Kowalczyk wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>>It's unimplementable for arbitrary places in the presence of threads
>>>(a given container would need to appear to have different elements in
>>>different threads), but it makes sense when the place is "special".
>>
>>Sorry to disappoint you, but I have implemented it in portable Common
>>Lisp + MOP.
> 
> I guess the semantics is different than I would expect, because I'm
> pretty sure that the semantics I consider natural is unimplementable
> in CL with threads (without threads you could temporarily replace the
> value). Where can I read about this?

The paper isn't online (yet). I can send it to you if you would like to 
have it.

Further down this thread:

Paul F. Dietz wrote:
 > Marcin 'Qrczak' Kowalczyk wrote:
 >
 >> This doesn't work for an arbitrary place, only for slots you define.
 >> What about local rebinding of (car c)?
 >
 > I think we have a misunderstanding here about what was being claimed.

Apparently that's the case. In my implementation, the place must indeed 
be prepared in some way. For example, you have to declare a slot special 
in order to be able to use dletf, like this:

(defclass person ()
   ((name :initarg :name
          :accessor person-name
          :special t))
   (:metaclass special-class))

The :special declaration does the trick for you. Then you can do stuff 
like this:

(defvar *p* (make-instance 'person :name "Dr. Jekyll"))

(dletf (((person-name *p*) "Mr. Hyde"))
   ...)

...and this has the "natural" expected effect, that is, the slot is only 
rebound in the current thread, not in others.

I don't think it's a serious restriction to require the special 
declaration for such slots. On the contrary, it resonates with the 
requirement to distinguish between lexical and special variables elsewhere.

In principle, it's possible to make this work with CAR and other places 
as well, by way of shadowing the respective symbols and reimplementing 
them. Admittedly, this is a bit hackish (but the CLOS integration is 
quite elegant).


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87psu0i2bk.fsf@qrnik.zagroda>
Pascal Costanza <··@p-cos.net> writes:

> In my implementation, the place must indeed be prepared in some way.

Ok, this is what I expected (and this is similar to what my language
provides).

> In principle, it's possible to make this work with CAR and other
> places as well, by way of shadowing the respective symbols and
> reimplementing them.

It's impractical because for CAR you would have to redefine all
functions which can be used to access it, e.g. MAPCAR and MEMBER,
and recompile all libraries which use them with new definitions.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ipvveFmq97sU1@individual.net>
Marcin 'Qrczak' Kowalczyk wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>In my implementation, the place must indeed be prepared in some way.
> 
> Ok, this is what I expected (and this is similar to what my language
> provides).
> 
>>In principle, it's possible to make this work with CAR and other
>>places as well, by way of shadowing the respective symbols and
>>reimplementing them.
> 
> It's impractical because for CAR you would have to redefine all
> functions which can be used to access it, e.g. MAPCAR and MEMBER,
> and recompile all libraries which use them with new definitions.

Right. Would have been nice if all of Common Lisp were CLOS-based and/or 
had a metaobject protocol...


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u8y0o0ysa.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Marcin 'Qrczak' Kowalczyk wrote:
> > Pascal Costanza <··@p-cos.net> writes:
> >
> >>In my implementation, the place must indeed be prepared in some way.
> > Ok, this is what I expected (and this is similar to what my language
> > provides).
> >
> >>In principle, it's possible to make this work with CAR and other
> >>places as well, by way of shadowing the respective symbols and
> >>reimplementing them.
> > It's impractical because for CAR you would have to redefine all
> > functions which can be used to access it, e.g. MAPCAR and MEMBER,
> > and recompile all libraries which use them with new definitions.
> 
> Right. Would have been nice if all of Common Lisp were CLOS-based
> and/or had a metaobject protocol...

I never understand it when people say this.  What do you mean by 
"if all of Common Lisp were CLOS-based"?  The whole point of CLOS and
the meta-object protocol is to provide a coherent explanation of why
you cannot modify certain classes.

Are you saying "if all of Common Lisp were standard-class" or "if the
meta-object protocol were different than it is"?  

If I recall correctly...   Dylan went the route of making all classes
customizable, but in the process they decided there was not enough point
left to the meta-object protocol, so they never defined one.

My point is that sometimes you don't get changes in isolation.  You may
be annoyed that your favorite Chinese restaurant has slow service, and
you might cheer when one day they come under new management, but you
might think twice when the following day the management decides to change
it from Chinese to Indian food.   (Or tell the story the other way around
if you think that's an improvement.)

I wasn't right there in the design discussions for CLOS.  There were too
many committees for people to be on all of them, and I was on some of
the others.  So even though I was involved in CL's design, I can't say
how every detail got there.  But my impression was that a lot of the
reason there's a meta-object protocol at all was less to give you all the
operations there and more to explain why all classes don't have more 
uniform behavior.  Had they had the uniform behavior you wish for, my
personal guess (and I could be wrong) is that ANSI CL would have had 
no meta-object protocol at all, and that in spite of the uniform behavior
you might or might not have had the ability to customize the parts you 
wanted.  Maybe you would have, but then again, maybe another reason would
have been invented why you couldn't customize CAR, and all you'd have is
the same frustration about not changing that, but still no ability to
change other things.  

My meta-point (heh) is that history does not move in straight lines, and
the changes it brings never come in ones.  A different history would bring
many changes, not just this one you wish for.  (You are intended to be
comforted by this, but I suppose you might not be.)

In the specific, though, I also just find your terminology confusing.
All of CL _is_ CLOS exactly because the whole philosophy of
meta-objects is about having a heterogeneous, not a homogenous,
underlying representation. CLOS is not just standard class.

Further, the metaobject protocol we have, as it is written, is
carefully defined to accommodate, not to deny, such heterogeneity; 
as such, all of CL _does_ have a metaobject protocol; it just doesn't 
have the one you wish for.  (Think of it as if you'd said it's a pity
that all of the US doesn't have religion, which accidentally seems to
suggest that those who are not Christian have never thought about matters
of God, rather than choose to think about it in their own way.)

Freedom and uniformity are enemies of one another.  You can always try
to strike what seems to you a balance.  But if you push too much for
one, you tend to lose the other.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3iqc1hFmlja2U1@individual.net>
Kent M Pitman wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>Right. Would have been nice if all of Common Lisp were CLOS-based
>>and/or had a metaobject protocol...
> 
> I never understand it when people say this.  What do you mean by 
> "if all of Common Lisp were CLOS-based"?  The whole point of CLOS and
> the meta-object protocol is to provide a coherent explanation of why
> you cannot modify certain classes.
> 
> Are you saying "if all of Common Lisp were standard-class" or "if the
> meta-object protocol were different than it is"?

a) "if all of Common Lisp were CLOS-based":

It would have been possible to define cons cells as classes, like this.

(defclass cons ()
   ((car :initarg :car :accessor car)
    (cdr :initarg :cdr :accessor cdr)))

(defun cons (car cdr)
   (make-instance 'cons :car car :cdr cdr))

Similar things could have been done for other data structures. For 
example, Smalltalk has a way to even define array-style data structures 
as classes.

Yes, this would require a major redesign of Common Lisp and would 
probably be too costly, both in development time and possibly in runtime 
overhead.

b) "if all of Common Lisp had a metaobject protocol"

Note that the notion of a metaobject protocol is completely independent 
of CLOS, or of any object system for that matter. One could have a 
metaobject protocol, say, for the reader or REQUIRE, etc., without 
requiring the base constructs to be related to OOP at all.

So one could, say, specify that REQUIRE internally calls 
REQUIRE-USING-CLASS (for example, (require-using-class 
*current-module-system* module)) and let the actual module system be 
specified by a metaclass. You could then do something along the 
following lines.

(defclass asdf-modules ()
   ...)

(defmethod require-using-class
   ((modules asdf-modules) module)
   (asdf:oos 'asdf:load-op module))

(setf *current-module-system*
       (class-prototype (find-class 'asdf-modules)))

Here, REQUIRE is neither a generic function nor in other ways related to 
CLOS. CLOS would only be used at the metalevel.

Again, I don't claim this would be easy to achieve; there are probably 
lots of devils in the details.

> If I recall correctly...   Dylan went the route of making all classes
> customizable, but in the process they decided there was not enough point
> left to the meta-object protocol, so they never defined one.

Hm, I don't know enough about Dylan to be able to comment on this. What 
do you mean by customizable here? Do you mean that all functions are 
generic in Dylan? Then a metaobject protocol would still make sense...

> Had they had the uniform behavior you wish for, my
> personal guess (and I could be wrong) is that ANSI CL would not have had 
> a meta-object protocol at all, and that in spite of the uniform behavior
> you might or might not have had the ability to customize the parts you 
> wanted.  Maybe you would have, but then again, maybe another reason would
> have been invented why you couldn't customize CAR, and all you'd have is
> the same frustration about not changing that, but still no ability to
> change other things.

Sure. Please don't take my wishful thinking as an implicit 
criticism of Common Lisp. It's clearly a lot simpler to express a wish 
than to work on the details to make it a reality.

> In the specific, though, I also just find your terminology confusing.
> All of CL _is_ CLOS exactly because the whole philosophy of
> meta-objects is about having a heterogeneous, not a homogeneous,
> underlying representation. CLOS is not just standard class.

Well, I said "if all of Common Lisp had _a_ metaobject protocol", not 
"if all of Common Lisp were covered by _the_ metaobject protocol". Would 
"if all of Common Lisp had metaobject protocols" have been more appropriate?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uy88new5r.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Kent M Pitman wrote:
> 
> > Pascal Costanza <··@p-cos.net> writes:
> >
> >>Right. Would have been nice if all of Common Lisp were CLOS-based
> >>and/or had a metaobject protocol...
> > I never understand it when people say this.  What do you mean by "if
> > all of Common Lisp were CLOS-based"?  The whole point of CLOS and
> > the meta-object protocol is to provide a coherent explanation of why
> > you cannot modify certain classes.
> > Are you saying "if all of Common Lisp were standard-class" or "if the
> > meta-object protocol were different than it is"?
> 
> a) "if all of Common Lisp were CLOS-based":
> 
> It would have been possible to define cons cells as classes, like this.
> 
> (defclass cons ()
>    ((car :initarg :car :accessor car)
>     (cdr :initarg :cdr :accessor cdr)))
> 
> (defun cons (car cdr)
>    (make-instance 'cons :car car :cdr cdr))
> 
> Similar things could have been done for other data structures. For
> example, Smalltalk has a way to even define array-style data
> structures as classes.
> 
> Yes, this would require a major redesign of Common Lisp and would
> probably be too costly, both in development time and maybe runtime
> overhead.
> 
> b) "if all of Common Lisp had a metaobject protocol"
> 
> Note that the notion of a metaobject protocol is completely
> independent of CLOS, or of any object system for that matter. One
> could have a metaobject protocol, say, for the reader or REQUIRE,
> etc., without requiring the base constructs to be related to OOP at
> all.
> 
> So one could, say, specify that REQUIRE internally calls
> REQUIRE-USING-CLASS (for example, (require-using-class
> *current-module-system* module)) and let the actual module system be
> specified by a metaclass. You could then do something along the
> following lines.
> 
> (defclass asdf-modules ()
>    ...)
> 
> (defmethod require-using-class
>    ((modules asdf-modules) module)
>    (asdf:oos 'asdf:load-op module))
> 
> (setf *current-module-system*
>        (class-prototype (find-class 'asdf-modules)))
> 
> Here, REQUIRE is neither a generic function nor in other ways related
> to CLOS. CLOS would only be used at the metalevel.
> 
> Again, I don't claim this would be easy to achieve, there are probably
> lots of devils in the details.
> 
> > If I recall correctly...   Dylan went the route of making all classes
> > customizable, but in the process they decided there was not enough point
> > left to the meta-object protocol, so they never defined one.
> 
> Hm, I don't know enough about Dylan to be able to comment on
> this. What do you mean by customizable here? Do you mean that all
> functions are generic in Dylan? Then a metaobject protocol would still
> make sense...
> 
> > Had they had the uniform behavior you wish for, my
> > personal guess (and I could be wrong) is that ANSI CL would not have
> > had a meta-object protocol at all, and that in spite of the uniform
> > behavior
> > you might or might not have had the ability to customize the parts
> > you wanted.  Maybe you would have, but then again, maybe another
> > reason would
> > have been invented why you couldn't customize CAR, and all you'd have is
> > the same frustration about not changing that, but still no ability to
> > change other things.
> 
> Sure. Please don't understand my wishful thinking as an implicit
> criticism of Common Lisp. It's clearly a lot simpler to express a wish
> than to work on the details to make it a reality.
> 
> > In the specific, though, I also just find your terminology confusing.
> > All of CL _is_ CLOS exactly because the whole philosophy of
> > meta-objects is about having a heterogeneous, not a homogeneous,
> > underlying representation. CLOS is not just standard class.
> 
> Well, I said "if all of Common Lisp had _a_ metaobject protocol", not
> "if all of Common Lisp were covered by _the_ metaobject
> protocol". Would "if all of Common Lisp had metaobject protocols" have
> been more appropriate?

No, as I see it (and I'm just one individual, not the law), the problem is
that you have too narrow a view of the possible space of metaobject protocols.
That is, you presume that if there were one, it would be written to suit you.
But the space of possible meta-object protocols is bigger than you think.
It is possible to have a meta-object protocol that doesn't do what you want,
and that's what CLOS has.  It does not fail to have one, it has one that you
(apparently) wish were different.

You might think that if I had a computer with network protocols, you could
talk to me.  But I could write a network protocol that said that when you 
try to contact me, I refuse your connection.

Do you understand how atheism can be classified as a religion, and hence
protected under freedom of religion?  The belief that there is no God is
not an absence of faith, nor an areligious position.

Do you think anarchy is politics, and how anarchists are not apolitical?

The fact that you are told that you cannot subclass something is not
the absence of a meta-object protocol.  It is a meta-object protocol
that explains to you that some things are not permitted.  Without
meta-objects, you wouldn't be assured of a different behavior, you'd
just have no way to discuss the matter.

In fact, it's up to the implementation to say what system classes are
built-in classes, incidentally--the language doesn't even keep
implementors from experimenting with what you appear to want.  If you
think it should be easy to have more things be subclassable, you'll
not even be perturbing the standard to try.

My quibble is with your wording, not your desire.  You're free to want 
whatever makes sense to you. But CLOS defines certain terminology for
talking about things, and you're in conflict with that terminology in
your usage here, I feel.

- - - 

I have a similar issue with people wanting to "extend EQUAL" (and various
other operations) to be generic.  They often presume that because they
don't like or find useful the fact that EQUAL is defined as EQ on certain
classes that EQUAL is somehow undefined on those classes.  In fact, EQUAL
is a total function, that is well-defined on all args.  The desire to have
it do something else is a legitimate desire on the part of some, but it
is NOT an extension... it's not like EQUAL takes no position on the matter.
It's that EQUAL takes a position that they don't like, and that in some
odd sense is perhaps too trivial for them to consider as a "real" position.
But just like "atheism" may seem "too trivial" for some religious believers
to take seriously, the issue is not "the answer" but "the question".
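A few concrete cases make the point (this is standard Common Lisp 
behavior, straight from the spec):

```lisp
;; EQUAL descends conses and compares strings element-wise ...
(equal (list 1 2) (list 1 2))   ; => T
(equal "abc" (copy-seq "abc"))  ; => T

;; ... but on general vectors, hash tables, and structures it is
;; defined to fall back to identity (EQL).  It *does* have an answer
;; here -- the answer is just NIL for distinct objects.
(equal (vector 1 2) (vector 1 2))            ; => NIL
(equal (make-hash-table) (make-hash-table))  ; => NIL
```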

There are no religious answers, only religious questions.  That is,
you can't ask religious questions and then define some answers to be 
"about religion" and some answers to be "not involving religion".
Religion is about a search for answers, not about the answers you 
end up with.

There are no political answers, only political questions.  That is,
you can't ask political questions and then define some answers to be
"about politics" and some answers to be "not involving politics".
Politics is about a search for a way for people to interact, not about
how they end up interacting.

In the space of MOPs, there are no "MOP answers", only "MOP questions".
That is, you can't ask the linguistic design questions that lead to a MOP
(shall we allow users to access this, extend that, etc.) and then define
that some results are a "MOP" and some are "not a MOP".  A MOP is
about a set of answers that explain what you can AND CAN'T do, it is not
a promise that you'll be able to do everything on a list of "must do" 
operations in the mind of some person (or many people).  The MOP that CLOS
has tells you exactly that: what you can and can't do.  It doesn't say
"you can do this for these classes and we didn't think about the others",
it says "you can do this for these classes and we thought hard about the
other classes and decided to tell you that it was a bad idea for us to 
globally define what happens to the others (and here's why), but we did
not think it was a bad idea for individuals to try, we just thought it 
would be limiting for us to force them to try in a particular way".

- - - -

This comes back, again, to the issue of premature standardization discussed
recently in another thread.  Standards ought to codify common practice, not
innovate.  A lot of what CLOS did might seem new, but was common in one
or another of the dialects being unified.  There is certainly current
practice with what you want in other languages, but it would have been
quite dangerous for Common Lisp to suddenly try to be Smalltalk.  Besides,
Smalltalk had already staked out that space.  So Common Lisp picked a model
that explained ALL of what it did, not just something that explained some
of it and left the rest for later work.  CLOS is a total definition.
It just happens to be a total definition that tells you there are some
things you can't do at the same time as telling you there are other things
you can.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3iqin6Fmuni3U1@individual.net>
Kent M Pitman wrote:

> The fact that you are told that you cannot subclass something is not
> the absence of a meta-object protocol.  It is a meta-object protocol
> that explains to you that some things are not permitted.  Without
> meta-objects, you wouldn't be assured of a different behavior, you'd
> just have no way to discuss the matter.

OK. My terminology is unclear because it is not clear what 
"metaobjectifying" all of Common Lisp would actually mean. I don't think 
that just turning everything into instances of standard-class / 
standard-object would do the job. I rather think that parts that are 
underspecified in the CLOS MOP spec (built-in-class, structure-class, 
...) would need different protocols than the ones already available.

I think it's acceptable to speak of "many metaobject protocols". Even 
the CLOS MOP itself discusses several subprotocols.

You will probably object to my use of "underspecified" above. I am not 
using that term as a language lawyer though. Yes, one way to look at 
this issue is to say that the absence of a specification is also a 
specification. However, it is also just a fact that some things are 
simply not specified in the CLOS MOP, like how cons cells are 
structured, or how slots in structures are accessed, etc. This may well 
be intentional, but that still doesn't change that fact. 
Furthermore, it could indeed turn out that if they were specified they 
might be specified in a way such that I wouldn't be able to do what I 
want to do with them, but that still doesn't affect my position.

> My quibble is with your wording, not your desire.  You're free to want 
> whatever makes sense to you. But CLOS defines certain terminology for
> talking about things, and you're in conflict with that terminology in
> your usage here, I feel.

The original CLOS spec (as found in CLtL2) and the AMOP spec always use 
the qualified term "the CLOS metaobject protocol" or "the CLOS MOP" (or 
even "the Common Lisp Object System metaobject protocol"). They don't 
just say "the metaobject protocol", unless it is clear from the context 
that the one for CLOS is meant. Further, the notion of metaobject 
protocols is used outside of CLOS and Common Lisp. I intended to use 
the term in the latter, general sense, and I hope my use of "a 
metaobject protocol" instead of "the metaobject protocol" has made that 
clear.

I don't see in what way the inclusion of many metaobject protocols for 
different aspects of Common Lisp could do any harm, provided they don't 
contradict each other.

 > A MOP is
> about a set of answers that explain what you can AND CAN'T do, it is not
> a promise that you'll be able to do everything on a list of "must do" 
> operations in the mind of some person (or many people).  The MOP that CLOS
> has tells you exactly that: what you can and can't do.  It doesn't say
> "you can do this for these classes and we didn't think about the others",
> it says "you can do this for these classes and we thought hard about the
> other classes and decided to tell you that it was a bad idea for us to 
> globally define what happens to the others (and here's why), but we did
> not think it was a bad idea for individuals to try, we just thought it 
> would be limiting for us to force them to try in a particular way".

That's not quite true. Here is the preface to the actual CLOS MOP 
specification given in AMOP:

"In this part of the book, we provide the detailed specification of a 
metaobject protocol for CLOS.  Our work with this protocol has always 
been rooted in our own implementation of CLOS, PCL.  This has made it 
possible for us to have a user community, which in turn has provided us 
with feedback on this protocol as it has evolved.  As a result, much of 
the design presented here is well-tested and stable.  As this is being 
written, those parts have been implemented not only in PCL, but in at 
least three other CLOS implementations we know of.  Other parts of the 
protocol, even though they have been implemented in one form or another 
in PCL and other implementations, are less well worked out.  Work 
remains to improve not only the ease of use of these protocols, but also 
the balance they provide between user extensibility and implementor freedom.

"In preparing this specification, it is our hope that it will provide a 
basis for the users and implementors who wish to work with a metaobject 
protocol for CLOS.  This document should not be construed as any sort of 
final word or standard, but rather only as documentation of what has 
been done so far.  We look forward to seeing the improvements, both 
small and large, which we hope this publication will catalyze."


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u64vrd46r.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> > A MOP is
> > about a set of answers that explain what you can AND CAN'T do, it is not
> > a promise that you'll be able to do everything on a list of "must
> > do" operations in the mind of some person (or many people).  The MOP
> > that CLOS
> > has tells you exactly that: what you can and can't do.  It doesn't say
> > "you can do this for these classes and we didn't think about the others",
> > it says "you can do this for these classes and we thought hard about the
> > other classes and decided to tell you that it was a bad idea for us
> > to globally define what happens to the others (and here's why), but
> > we did
> > not think it was a bad idea for individuals to try, we just thought
> > it would be limiting for us to force them to try in a particular
> > way".
> 
> That's not quite true. Here is the preface to the actual CLOS MOP
> specification given in AMOP:
> 
> "In this part of the book, we provide the detailed specification of a
> metaobject protocol for CLOS [...].

As far as I can see, what you quoted is not in conflict with what I
said.  In fact, I think of AMOP these days as a pretty much de facto
layered standard on CL, even though a few conforming implementations
don't have it.  Many do.  CL has a subset of it, but implies that the
rest is there, just unpublished.  But neither its presence nor absence
implies that it would span built-in class, which is my real point.  As
I recall (and it's been a while since I looked, but nothing you quoted
seems to contradict what I was intending to say) AMOP does not
guarantee that everything in the language needs to support every
operation it has.  Even with AMOP (which most implementations tend to
support), it might still not be possible to subclass CONS, not
because of the failure to have a MOP but because built-in classes
don't have to participate.
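One way to see this concretely in a typical implementation (the exact
class printed and the condition signalled are implementation-dependent):

```lisp
;; CONS is a system class; implementations typically make it a
;; built-in class, and the MOP does not oblige them to let you
;; specialize it.
(class-of (cons 1 2))
;; e.g. => #<BUILT-IN-CLASS COMMON-LISP:CONS>

;; Attempting to use it as a superclass of a standard class is
;; therefore allowed to fail -- some implementations refuse at
;; DEFCLASS time, others when the class is finalized.
(handler-case
    (eval '(defclass my-cons (cons) ()))
  (error (c)
    (format t "subclassing CONS refused: ~A~%" c)))
```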

Anyway, I had a point to make, and I've made it.  Now I'm just
repeating myself.  So this is likely all I'll say on this subthread
unless there's some substantially new issue to comment on.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ir5ajFmiitkU1@individual.net>
Kent M Pitman wrote:

> As far as I can see, what you quoted is not in conflict with what I
> said.  In fact, I think of AMOP these days as a pretty much de facto
> layered standard on CL, even though a few conforming implementations
> don't have it.

A standard with some holes in it. Some of the stuff in the CLOS MOP is 
really not usable, as far as I can tell. (Gee, I hope that no one quotes 
this out of context! So let me stress this here: Most of the stuff in 
the CLOS MOP is indeed very useful! But this shouldn't make us blind 
towards the problems that it has, especially when even its authors say so.)

> Many do.  CL has a subset of it, but implies that the
> rest is there, just unpublished.  But neither its presence or absence
> implies that it would span built-in class, which is my real point.

OK, let me try one more time: I don't think it would be a good idea to 
extend the CLOS MOP to cover, say, cons cells by just turning CONS into 
a standard-class. I think such data structures would require a different 
kind of metaobject protocol. (And no, I haven't thought this through at 
all.)

My sketch of defining CONS with DEFCLASS has probably set this 
discussion on the wrong track. If so, sorry for that.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uacl34i7d.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> OK, let me try one more time: I don't think it would be a good idea to
> extend the CLOS MOP to cover, say, cons cells by just turning CONS
> into a standard-class. I think such data structures would require a
> different kind of metaobject protocol. (And no, I haven't thought this
> through at all.)

I don't really have any conflict with this and wouldn't have even chimed
in if I'd thought originally this was all you were saying.

I'd be interested to read a coherent proposal on this if you ever had one.
Don't let my other remarks dissuade you from either doing it or thinking
I'd be interested.

> My sketch of defining CONS with DEFCLASS has probably set this
> discussion on the wrong track. If so, sorry for that.

Or perhaps also my remarks.  If so, I'm sorry for that as well.
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uzmt32927.fsf@news.dtpq.com>
Pascal Costanza <··@p-cos.net> writes:
> Yes, this would require a major redesign of Common Lisp and would
> probably be too costly, both in development time and maybe runtime
> overhead.

Mainly, it would not have been "Common Lisp", which is a 
codification of current standard practices -- what you are
proposing is some all-new different Lisp language.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3iqip2Fmuni3U2@individual.net>
Christopher C. Stacy wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>Yes, this would require a major redesign of Common Lisp and would
>>probably be too costly, both in development time and maybe runtime
>>overhead.
> 
> Mainly, it would not have been "Common Lisp", which is a 
> codification of current standard practices -- what you are
> proposing is some all-new different Lisp language.

I don't think so. The user-level interface could (hopefully) remain the 
same.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uwto78y7q.fsf@news.dtpq.com>
Pascal Costanza <··@p-cos.net> writes:

> Christopher C. Stacy wrote:
> > Pascal Costanza <··@p-cos.net> writes:
> >
> >>Yes, this would require a major redesign of Common Lisp and would
> >>probably be too costly, both in development time and maybe runtime
> >>overhead.
> > Mainly, it would not have been "Common Lisp", which is a
> > codification of current standard practices -- what you are
> > proposing is some all-new different Lisp language.
> 
> I don't think so. The user-level interface could (hopefully) remain
> the same.

Standardizing existing practices means much more than just 
maintaining an API.  Common Lisp was about bringing existing
implementations together, not about new ways of doing things.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ir6b2FmvfatU1@individual.net>
Christopher C. Stacy wrote:

> Standardizing existing practices means much more than just 
> maintaining an API.  Common Lisp was about bringing existing
> implementations together, not about new ways of doing things.

Yet, those existing practices had been new at some stage before 
standardization. I don't believe you intend to suggest that it should be 
forbidden to think about new ways of doing things.

CLOS itself could have just remained as specified in the ANSI standard, 
without any MOP. Yet an attempt was made to design a MOP, although it 
doesn't seem to me that sufficient experience existed at that time. 
Clearly I wasn't there, but everything I have read about the history of 
CLOS and its MOP supports this conclusion.

If it had been purely about standardizing existing practice, one could 
have simply picked Flavors or LOOPS and been done with it.

Sidenote: This discussion is about some very loose and speculative 
remark that I have made. Consider that remark as a statement made in 
brainstorming mode. In my experience, this is an effective way of coming 
up with new ideas, with "new ways of doing things". I am really amazed 
how such speculation triggers such defenses of the status quo. What's 
the problem?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9j8r8p.fsf@news.dtpq.com>
Pascal Costanza <··@p-cos.net> writes:

> Christopher C. Stacy wrote:
> 
> > Standardizing existing practices means much more than just
> > maintaining an API.  Common Lisp was about bringing existing
> > implementations together, not about new ways of doing things.
> 
> Yet, those existing practices had been new at some stage before
> standardization. I don't believe you intend to suggest that it should
> be forbidden to think about new ways of doing things.

I suggest that Common Lisp was forbidden by its charter from 
thinking up and introducing those kinds of new ideas.

Of course I think it's fine for someone to think of them
for some other Lisp dialect; I encourage people to do so!
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uekaf4ibc.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Christopher C. Stacy wrote:
> 
> > Standardizing existing practices means much more than just
> > maintaining an API.  Common Lisp was about bringing existing
> > implementations together, not about new ways of doing things.
> 
> Yet, those existing practices had been new at some stage before
> standardization. I don't believe you intend to suggest that it should
> be forbidden to think about new ways of doing things.

Forbidden?  No.  But Chris is absolutely right that the purpose of 
standardization is NOT to do design (except insofar as is necessary
to consolidate current practice).

> CLOS itself could have just remained as specified in the ANSI
> standard, without any MOP. Yet it was attempted to design a MOP
> although it doesn't seem to me that sufficient experience existed at
> that time. Clearly I haven't been there, but all I read about the
> history of CLOS and its MOP support this conclusion.

The MOP was designed and proposed as part of CLOS in its entirety.
The basic stuff was accepted because it was like other object systems
with which there was current practice experience and it was possible
to evaluate the impact on implementations based on the design of other
object systems already deployed.  The MOP was not accepted because
there was not adequate current practice experience, and the committee
members were worried that they didn't understand what the effect of it
would be.  It appeared to constrain implementations, even when applied
just to standard-class, and there was a worry that the flexibility it
offered would slow down benchmarks even in (and, actually, especially
in) situations where the MOP's capabilities were not being applied.

You make your remark as if it contradicts what Chris said, but (a)
your remark falsely suggests that the MOP design was a separate step,
which it really wasn't, and (b) the MOP's non-acceptance is more of a
confirmation of what Chris said about what is and is not acceptable in
a standard.
 
> If it had been purely about standardizing existing practice, one could
> have simply picked Flavors or LOOPS and been done with it.

There was a political rift between the Flavors and LOOPS communities.
There was no way to solve the problem by accepting one or the other of
these without giving away the marketplace to one or the other of these
communities.  One of the biggest fears ANSI has, and it makes this
issue VERY clear to the committee members, is that standards are
potentially a way of ramming one implementor's preferred way of doing
things down the throat of another because it can be economically
injurious to the party that doesn't have to pay for retooling.  Fear
of lawsuit over this is the basis of a large number of ANSI policies.
So it is hardly surprising that neither of these was taken intact.  It
was in some ways lamentable and actively lamented (at least at the
time) that X3J13 went out on a limb to design a whole new system.  But
there didn't seem any other way to arrive at a coherent design.  PCL
was distributed and heavily tested as a way of easing that transition,
but not everyone used it, which is why the underlying assumptions the
MOP makes were rejected.  Symbolics Genera, then quite the dominant
player, did not use PCL as its basis, so an attempt to explain the
Symbolics underlying implementation by the addition of a MOP was not
practical.  In some ways, it's only since the "fortunate coincidence"
of Symbolics' demise (a bit of dark humor on my part--I don't really
celebrate it) that the MOP can be said to have an increased market
share.

> Sidenote: This discussion is about some very loose and speculative
> remark that I have made. Consider that remark as a statement made in
> brainstorming mode. In my experience, this is an effective way of
> coming up with new ideas, with "new ways of doing things". I am really
> amazed how such speculation triggers such defenses of the status
> quo. What's the problem?

The problem is that your casual comments neglect (or treat with what
some of us think undue casualness or imprecision) the effects of
history, and seem to portray, whether you intended it or not, a sense
that both another way is obvious, when it was not; and, in some cases,
you additionally seem to portray, again whether intended or not, an
incorrect view of the current state.  There is appropriate terminology
for dealing with this complex issue, and when you blur that
terminology, we are reduced to long paragraphs to describe what we
want instead of referring to crisply defined terms.  Those crisply
defined terms encourage short-form discussion.  Deviation from them
means we either suffer a terminology update then and there or a future
of confusion caused by allowing blurry terminology to mean nothing.
In my experience, at least, this leads to many debates over nothing
and it's best to nip the problem in the bud.

Now, I said I was going to stop posting on this subthread, and I will
go back to doing that.  I violated my promise only because you raised
new issues not on the table before--that is, IMO, you mischaracterized
history, at least as far as I recall it.  I would welcome input from 
people like Haflich, Margolin, and others who were there concurrently
and often supply supplementary, complementary, or even rebuttal remarks
about my ability to remember.

But I think it's useful for those of us who were there to chime in for
the sake of global memory, at least for the limited few years we're
around to do so actively.  After that, the burden's on Google to remember
for us.
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <dOadndZWKZZXQ13fRVn-sw@speakeasy.net>
Pascal Costanza  <··@p-cos.net> wrote:
+---------------
| A further comment: A macro is a local source-to-source transformation, 
| which means that global issues cannot be taken into account which can be 
| a limitation. (One attempt to break out of these limitations was 
| "Macros that reach out and touch somewhere" - see 
| http://www2.parc.com/csl/groups/sda/publications/papers/Kiczales-Macros91/for-web.pdf 
| )
+---------------

Another attempt, although it may not be immediately obvious that it's
related to that, is recent work on access to lexical environments such
as the tutorial that Duane Rettig just gave at ILC 2005
<http://www.international-lisp-conference.org/tutorials.html#making_environments_accessible_in_common_lisp>.
When macros can reliably examine their
lexical environment, a number of the global issues can be directly
addressed. And if *write* access to the lexical environment is provided,
especially the ability to mutate portions of the environment "above" one,
then many [though not all] of the other global issues can be addressed
as well.
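A small standard-CL illustration of the read side of this: a macro can
capture its lexical environment with &ENVIRONMENT and hand it to
MACROEXPAND, so that even local MACROLET definitions are visible to it
(EXPANSION-OF is a made-up name for the sketch):

```lisp
(defmacro expansion-of (form &environment env)
  ;; Expand FORM in the macro's own lexical environment and
  ;; return the expansion as a quoted form.
  `',(macroexpand form env))

(macrolet ((twice (x) `(* 2 ,x)))
  (expansion-of (twice 3)))
;; => (* 2 3)  -- the local TWICE is seen through ENV
```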


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Lars Brinkhoff
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <85wtoe22tp.fsf@junk.nocrew.org>
> Pascal Costanza wrote:
>> I know that you are working on such a beast. Have you already taken
>> a look at 3-Lisp? You definitely should...

Google helped me find these articles with some information about 3-Lisp:

http://crpit.com/confpapers/CRPITV37Miyoshi.pdf
http://www2.parc.com/csl/groups/sda/projects/reflection96/docs/malenfant/ref96/ref96.html

Did I miss anything?

Ray Dillinger <····@sonic.net> writes:
> It looks a lot like how what I've been working on is, um, *starting*
> to look.  I wouldn't have seen the similarity when I began, but I
> feel like I've been sort of propelled along this road of design
> inevitability ... and 3-lisp looks a lot like it's in the
> neighborhood of where I'm at now.

Other than your last few articles, which were very interesting, is
there any chance you'll write a paper or something about your work?
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3icr5gFku1r9U1@individual.net>
Lars Brinkhoff wrote:

>>Pascal Costanza wrote:
>>
>>>I know that you are working on such a beast. Have you already taken
>>>a look at 3-Lisp? You definitely should...
> 
> Google helped me find these articles with some information about 3-Lisp:
> 
> http://crpit.com/confpapers/CRPITV37Miyoshi.pdf
> http://www2.parc.com/csl/groups/sda/projects/reflection96/docs/malenfant/ref96/ref96.html
> 
> Did I miss anything?

Check out 
http://library.readscheme.org/servlets/search.ss?pattern=Smith+Brian+Cantwell

There is also an excellent paper by Jim des Rivieres, "Control-Related 
Meta-Level Facilities in LISP" in Maes/Nardi (eds.), "Meta-Level 
Architectures and Reflection" (North-Holland). Unfortunately, this 
doesn't seem to be available online. (Does someone know how to contact 
Jim des Rivieres?)



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ur7emlfzh.fsf@nhplace.com>
Lars Brinkhoff <·········@nocrew.org> writes:

[I'm not following the context of this thread so am responding to this post
 out of context, but...]

> > Pascal Costanza wrote:
> >> I know that you are working on such a beast. Have you already taken
> >> a look at 3-Lisp? You definitely should...
> 
> Google helped me find these articles with some information about 3-Lisp:
> 
> http://crpit.com/confpapers/CRPITV37Miyoshi.pdf
> http://www2.parc.com/csl/groups/sda/projects/reflection96/docs/malenfant/ref96/ref96.html
>
> Did I miss anything?

3-Lisp itself is pretty ... thick.  Very well thought out, but quite a
big first read.  I've read some of it and it's tough going, though
interesting.  I've been told that there are some papers that summarize
it that are more approachable.  If memory serves, John Batali wrote a
readable summary that's only maybe 60 pages (compared to Smith's
3-inch-thick thesis).  I'd google for this string:

  Batali lisp reflection OR reflective

In general, the compound search term "reflective tower" will probably
find you much of the literature on the topic.
From: Lars Brinkhoff
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <853br11qve.fsf@junk.nocrew.org>
Kent M Pitman <······@nhplace.com> writes:
> 3-Lisp itself is pretty ... thick.  Very well thought out, but quite
> a big first read.  I've read some of it and it's tough going, though
> interesting.  I've been told that there are some papers that
> summarize it that are more approachable.  If memory serves, John
> Batali wrote a readable summary that's only maybe 60 pages (compared
> to Smith's 3-inch-thick thesis).

Thanks.  In case anyone else is interested, here's Batali's paper:

ftp://publications.ai.mit.edu/ai-publications/500-999/AIM-701.ps
ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-701.pdf
From: Ray Dillinger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <zBewe.2321$p%3.15169@typhoon.sonic.net>
Lars Brinkhoff wrote:
> Ray Dillinger <····@sonic.net> writes:

>> It looks a lot like how what I've been working on is, um, *starting*
>> to look.  I wouldn't have seen the similarity when I began, but I
>> feel like I've been sort of propelled along this road of design
>> inevitability ... and 3-lisp looks a lot like it's in the
>> neighborhood of where I'm at now.


> Other than your last few articles, which were very interesting, is
> there any chance you'll write a paper or something about your work?


What I've done, I think, is something the cognoscenti knew could
be done.  I mean, lazy semantics are pretty well understood;
promises are well-understood as code plus an environment in which
that code is to be evaluated;  I think the only original thing in
my system is being able to get the code and environments from
the promises using accessors, with the code expressed as lisp
data -- and that's not actually hard to do when that's the form
in which it appears in the source at the call site. Or at
least, nowhere near as hard as implementing lazy semantics
for function calls in the first place.

What this allows you to do is make promises that *WILL* throw an
error if they're evaluated - for example a cond clause - and
pass them as arguments to cond (which is a function), which
does not evaluate them.  Instead it rips them apart and builds
a nested-if statement out of the bits, which it then evaluates
in the caller environments using eval.

So "cond" isn't a macro, or even syntax - it's just a function.

The only reason nobody has done this before, though, is that
it's insane.  "cond" is a very _EXPENSIVE_ function compared
to "if" unless you find some way to avoid doing the code
manipulation each time it's called.

Where cond is inlined and the forms of its arguments are
immutable (source code), partial evaluation can do most of
it when the function is built.  But you have to attach an
invalidation daemon to that function from cond's binding,
so that if someone ever mutates cond, it will burn the code
that took advantage of inlining and partial evaluation of
cond.

The part of all this that *I* consider to be worthwhile work
is the effort to find ways to make it run efficiently without
giving up the fully dynamic semantics.  That's the actual
tough part.
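
[A rough Common Lisp approximation of the mechanism Bear describes,
with an explicit DELAY*/FORCE* pair standing in for his implicit lazy
argument passing. All names here are mine and purely illustrative;
Bear's system builds the promises automatically at every call site:]

```lisp
;; Each promise carries its source form as Lisp data (Bear's
;; accessor idea) plus a closure over the lexical environment.
(defstruct promise code thunk)

(defmacro delay* (form)
  `(make-promise :code ',form :thunk (lambda () ,form)))

(defun force* (p) (funcall (promise-thunk p)))

;; COND as an ordinary function: nothing is evaluated until a test
;; promise is forced, and only the matching consequent ever runs.
(defun cond* (&rest clauses)
  (loop for (test then) in clauses
        when (force* test) return (force* then)))

(cond* (list (delay* (> 2 1)) (delay* 'yes))
       (list (delay* (error "never runs")) (delay* 'no)))
;; => YES
```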


				Bear
From: Lars Brinkhoff
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <85wtoawtt4.fsf@junk.nocrew.org>
Ray Dillinger wrote:
> Lars Brinkhoff wrote:
> > Other than your last few articles, which were very interesting, is
> > there any chance you'll write a paper or something about your work?
> What I've done, I think, is something the cognoscenti knew could
> be done.

Yes, it doesn't seem conceptually complicated.  I think you
described it quite well in your posts.  However, I still believe a
paper(-ish thing) could be of benefit to the incognoscenti, and/or an
inspiration for would-be language experimenters.

Also, it seems as if the implications for compilation could have some
novel aspects.  And you could describe benefits you've found when
actually using the new language.  I don't think you said much about
that so far.

The paper could be as simple as concatenating your (perhaps slightly
edited) news articles.  That would still be more accessible to
researchers/hackers/hobbyists than sifting through the Google archives.

> The part of all this that *I* consider to be worthwhile work is the
> effort to find ways to make it run efficiently without giving up the
> fully dynamic semantics.  That's the actual tough part.

So maybe a paper should focus on that.

Not that I'm trying to pressure you!  Just trying to point out that
there could be more interest in this than you think.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <r7ekvhpz.fsf@comcast.net>
Lars Brinkhoff <·········@nocrew.org> writes:

>> Pascal Costanza wrote:
>>> I know that you are working on such a beast. Have you already taken
>>> a look at 3-Lisp? You definitely should...
>
> Google helped me find these articles with some information about 3-Lisp:
>
> http://crpit.com/confpapers/CRPITV37Miyoshi.pdf
> http://www2.parc.com/csl/groups/sda/projects/reflection96/docs/malenfant/ref96/ref96.html
>
> Did I miss anything?

It would also be worthwhile to look at the papers that disagree with
the 3-Lisp viewpoint:

  Mitchell Wand. "The Mystery of the Tower Revealed: a Non-Reflective
  Description of the Reflective Tower".  
  Proceedings of the 1986 ACM Symposium on LISP and Functional
  Programming.
  August 1986.

 ------

  AI Memo 946
  Reification without Evaluation -- Alan Bawden

  Constructing self-referential systems, such as Brian Smith's 3-lisp
  language, is actually more straightforward than you think.  Anyone
  can build an infinite tower of processors (where each processor
  implements the processor at the next level below) by employing some
  common sense and one simple trick.  In particular, it is *not*
  necessary to re-design quotation, take a stand on the relative
  merits of evaluation vs. normalization, or treat continuations as
  meta-level objects.  This paper presents a simple programming
  language interpreter that illustrates how this can be done.  By
  keeping expression evaluation entirely separate from the mechanisms
  that implement its infinite tower, this interpreter avoids many
  troublesome aspects of previous self-referential programming
  languages.  Given these basically straightforward techniques,
  processor towers might be easily constructed for a wide variety of
  systems to enable them to manipulate and reason about themselves.

 ------

Alan Bawden is one smart cookie.


-- 
~jrm
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BF3231.F5597C81@freenet.de>
Ray Dillinger schrieb:

> Okay...  indulge me for a minute here then.  Let's say you
> have a lisp with lazy semantics (promises get passed as arguments
> instead of computed values).

Hallo

There is a text on my hd from:

 Colin Matheson Thu Mar 9 22:38:05 GMT 1995 

about: (last sentence)

 ... The key feature of a planning system that this program lacks is the
ability to reason properly about changes in the world that result from
performing an action and how these interact with the preconditions and
effects of other planned actions.

stefan
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87r7es2p6w.fsf@qrnik.zagroda>
Kenny Tilton <·······@nyc.rr.com> writes:

> Please get back to us when you have some application functionality you
> cannot express in Common Lisp.

Running two arbitrary threads of Lisp code concurrently can only be
expressed by writing a concurrent Lisp interpreter in Lisp. This is a
lot of work for someone who just wants to use threads.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119634205.d8d325fc3702a03cc4674c98c339c560@teranews>
On Fri, 24 Jun 2005 09:23:35 +0200, <······@knm.org.pl> wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
>
>> Please get back to us when you have some application functionality you
>> cannot express in Common Lisp.
>
> Running two arbitrary threads of Lisp code concurrently can only be
> expressed by writing a concurrent Lisp interpreter in Lisp. This is a
> lot of work for someone who just wants to use threads.

Isn't it more than merely an interpreter?  You want this setup because
you want shared objects, so what happens when one thread no longer
references an object?

If it was easy, it would be done by now.  Generally people find that
threads are a nice idea for someone else.


-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87wtoha27y.fsf@qrnik.zagroda>
GP lisper <········@CloudDancer.com> writes:

>> Running two arbitrary threads of Lisp code concurrently can only be
>> expressed by writing a concurrent Lisp interpreter in Lisp. This is a
>> lot of work for someone who just wants to use threads.
>
> Isn't it more than merely an interpreter?  You want this setup because
> you want shared objects, so what happens when one thread no longer
> references an object?

Nothing; but when all threads no longer reference an object, it's
garbage collected.

I thought it was obvious, so perhaps I misunderstood the question.

> If it was easy, it would be done by now.  Generally people find that
> threads are a nice idea for someone else.

It is being done by some implementations (sometimes poorly). It's not
standardized.

Emacs doesn't support threads, and as a result I can't use Gnus while
it's downloading articles.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119807028.35a90f0253d225865ebfe9e52695bba7@teranews>
On Sun, 26 Jun 2005 11:37:05 +0200, <······@knm.org.pl> wrote:
> GP lisper <········@CloudDancer.com> writes:
>
>>> Running two arbitrary threads of Lisp code concurrently can only be
>>> expressed by writing a concurrent Lisp interpreter in Lisp. This is a
>>> lot of work for someone who just wants to use threads.
>>
>> Isn't it more than merely an interpreter?  You want this setup because
>> you want shared objects, so what happens when one thread no longer
>> references an object?
>
> Nothing; but when all threads no longer reference an object, it's
> garbage collected.
>
> I thought it was obvious, so perhaps I misunderstood the question.

It is the effort behind that statement that is important...



-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87wtohtpz4.fsf@qrnik.zagroda>
GP lisper <········@CloudDancer.com> writes:

>> Running two arbitrary threads of Lisp code concurrently can only be
>> expressed by writing a concurrent Lisp interpreter in Lisp. This is a
>> lot of work for someone who just wants to use threads.
>
> Isn't it more than merely an interpreter?  You want this setup because
> you want shared objects, so what happens when one thread no longer
> references an object?

Nothing; but when all threads no longer reference an object, it's
garbage collected.

I thought it was obvious, so perhaps I misunderstood the question.

> If it was easy, it would be done by now.  Generally people find that
> threads are a nice idea for someone else.

It is being done by some implementations (sometimes poorly). It's not
standardized. By standardization I mean the ability to use the same code
on multiple language implementations, not necessarily formal ANSI papers.

Emacs doesn't support threads, and as a result I can't use Gnus while
it's downloading articles. This shows that threads in Lisp are
sometimes useful.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Juliusz Chroboczek
Subject: Re: de threadibus [was: ILC2005...]
Date: 
Message-ID: <7izmtbyieb.fsf_-_@lanthane.pps.jussieu.fr>
GP lisper <········@CloudDancer.com>:

> If it [were] easy, it would be done by now.

It's not easy.

> Generally people find that threads are a nice idea for someone else.

As Barbie famously said, programming is hard.  Programming batch
programs is hard enough, but we haven't got the least idea about how
to go about programming concurrent systems.

Threads are the best we've got for programming concurrent systems.
The problem is not with threads -- it's about concurrency itself.

                                        Juliusz
From: Rob Warnock
Subject: Re: de threadibus [was: ILC2005...]
Date: 
Message-ID: <JdudnTNaXKBTfl3fRVn-2Q@speakeasy.net>
Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
+---------------
| GP lisper <········@CloudDancer.com>:
| > Generally people find that threads are a nice idea for someone else.
| 
| As Barbie famously said, programming is hard.  Programming batch
| programs is hard enough, but we haven't got the least idea about how
| to go about programming concurrent systems.
| 
| Threads are the best we've got for programming concurrent systems.
| The problem is not with threads -- it's about concurrency itself.
+---------------

This is not entirely correct. A *few* rigorous steps have been taken,
going back as far as 1976, e.g.:

    http://www.cs.utexas.edu/users/EWD/ewd05xx/EWD554.PDF
    EWD 554 "A personal summary of the Gries-Owicki Theory" (March 1976)
    [Published in Edsger W. Dijkstra, "Selected Writings on Computing:
    A Personal Perspective", Springer-Verlag, 1982. ISBN 0 387 90652 5.]


-Rob

p.s. Reading this might be a bit difficult if one has never seen
the Dijkstra/Gries "guarded command language" notation:

    Edsger W. Dijkstra: "Guarded Commands, Nondeterminacy and Formal
    Derivation of Programs", Comm. ACM 18(8): 453-457 (1975)

An earlier version (June 1974) is archived at:

    http://www.cs.utexas.edu/users/EWD/ewd04xx/EWD418.PDF
    EWD 418: "Guarded commands, non-determinacy and a calculus
    for the derivation of programs".

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Juliusz Chroboczek
Subject: Re: de threadibus
Date: 
Message-ID: <7iekamro5u.fsf@lanthane.pps.jussieu.fr>
me:

> > Programming batch programs is hard enough, but we haven't got the
> > least idea about how to go about programming concurrent systems.

····@rpw3.org (Rob Warnock):

> This is not entirely correct. A *few* rigorous steps have been taken,
> going back as far as 1976, e.g. [Dijkstra-Gries-Owicki]

We've got plenty of formal models and theories for concurrency and
parallelism -- Dijkstra's guarded sums, Gries-Owicki, Hoare's models
for CSP, Milner's bisimilarity for CCS and pi, modal mu-calculus,
Petri nets, Mazurkiewicz traces, event structures, event categories,
and plenty more.

At the same time, we don't have the inkling of an idea about how to
write a concurrent program.  We're still stuck with thread libraries
that cannot make up their mind about whether they were designed for
concurrency, for parallelism or for realtime and end up being
inadequate for any of them.  We don't have any sound principles for
designing systems that are free from deadlocks and race conditions
(the one exception being putting a total ordering on critical
sections), let alone ensuring reactivity.

Exciting times lie ahead.  Hope to see you there.

                                        Juliusz
From: Rob Warnock
Subject: Re: de threadibus
Date: 
Message-ID: <9sadnTPtGvMykF_fRVn-gw@speakeasy.net>
Juliusz Chroboczek  <···@pps.jussieu.fr> wrote:
+---------------
| We don't have any sound principles for designing systems that
| are free from deadlocks and race conditions (the one exception
| being putting a total ordering on critical sections)...
+---------------

Why doesn't enforcing a partial ordering suffice to avoid deadlock?
I'll agree that a total ordering may be easier to *administer*,
but I didn't think it was strictly necessary.
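
[For what it's worth, the standard argument: deadlock requires a cycle
in the wait-for graph, and any acquisition discipline consistent with a
fixed ordering of the locks makes such a cycle impossible.  A sketch
using the de-facto bordeaux-threads portability layer (not ANSI CL);
the ranking scheme and all names besides the BT: API are mine:]

```lisp
;; Each lock carries a rank; sorting before acquisition means every
;; thread takes any pair of locks in the same relative order, so no
;; cycle in the wait-for graph -- and hence no deadlock -- can form.
(defvar *lock-a* (cons 1 (bt:make-lock "A")))
(defvar *lock-b* (cons 2 (bt:make-lock "B")))

(defun call-with-ordered-locks (ranked-locks thunk)
  (let ((sorted (sort (copy-list ranked-locks) #'< :key #'car)))
    (labels ((acquire (remaining)
               (if (null remaining)
                   (funcall thunk)
                   (bt:with-lock-held ((cdr (first remaining)))
                     (acquire (rest remaining))))))
      (acquire sorted))))

;; Callers may list the locks in any order; every thread
;; nonetheless acquires A before B.
(call-with-ordered-locks (list *lock-b* *lock-a*)
                         (lambda () 'critical-section))
```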


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: A.L.
Subject: Re: de threadibus
Date: 
Message-ID: <e144c15bbgqnfd94qest9emp2e4h2iudi2@4ax.com>
On Wed, 29 Jun 2005 02:40:13 +0200, Juliusz Chroboczek
<···@pps.jussieu.fr> wrote:

>me:
>
>> > Programming batch programs is hard enough, but we haven't got the
>> > least idea about how to go about programming concurrent systems.
>
>····@rpw3.org (Rob Warnock):
>
>> This is not entirely correct. A *few* rigorous steps have been taken,
>> going back as far as 1976, e.g. [Dijkstra-Gries-Owicki]
>
>We've got plenty of formal models and theories for concurrency and
>parallelism -- Dijkstra's guarded sums, Gries-Owicki, Hoare's models
>for CSP, Milner's bisimilarity for CCS and pi, modal mu-calculus,
>Petri nets, Mazurkiewicz traces, event structures, event categories,
>and plenty more.
>
>At the same time, we don't have the inkling of an idea about how to
>write a concurrent program...

"We".. This means, who?... Talking about yourself?...

A.L.
From: Ray Dillinger
Subject: Re: de threadibus
Date: 
Message-ID: <0Gnwe.2417$p%3.15534@typhoon.sonic.net>
Juliusz Chroboczek wrote:

> At the same time, we don't have the inkling of an idea about how to
> write a concurrent program.  We're still stuck with thread libraries
> that cannot make up their mind about whether they were designed for
> concurrency, for parallelism or for realtime and end up being
> inadequate for either.  We don't have any sound principles for
> designing systems that are free from deadlocks and race conditions
> (the one exception being putting a total ordering on critical
> sections), let alone ensuring reactivity.

This is the truth.

And it's becoming more and more true as hardware gets more and more
parallel.  Ever try writing code that takes advantage of dataflow
hardware?  Where you've got thousands of parallel and concurrent
computations running at the same time, using the same code, and
very little control over which of them completes first?

The language that does the best job of capturing parallel/concurrent
semantics in a form that doesn't make programmers tear their hair
out, will capture the future.

			Bear
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uaclgbq7v.fsf@news.dtpq.com>
Kenny Tilton <·······@nyc.rr.com> writes:
> * McCarthy actually meant that very little code lasts ten years.

That would suggest a serious disconnect with reality;
it's a little hard to believe.
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <daKdnV6NQ4VQ1CbfRVn-tQ@dls.net>
Christopher C. Stacy wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> 
>>* McCarthy actually meant that very little code lasts ten years.
> 
> 
> That would suggest a serious disconnect with reality;
> it's a little hard to believe.

I think he said 20 years, not 10, and I'm not sure he was
entirely serious.

	Paul
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <iuNue.13899$XB2.3623548@twister.nyc.rr.com>
Paul F. Dietz wrote:

> Christopher C. Stacy wrote:
> 
>> Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>> * McCarthy actually meant that very little code lasts ten years.
>>
>>
>>
>> That would suggest a serious disconnect with reality;
>> it's a little hard to believe.
> 
> 
> I think he said 20 years, not 10,

Fantastic. We have a new entry for examples of "stupid quibble". Rahul 
said ten, OK? (As if it fucking matters.)

> and I'm not sure he was
> entirely serious.

I think this is the difference between a yobbo and an intellect.

While everyone was laughing at "you do not look old enough...", and 
Rahul was protesting that he meant "in the future", McCarthy slipped in 
the mumble making clear that his point was simply that very little code 
(from anyone!) lasts long enough to justify freezing a language. Your 
correspondent can confirm this from <gasp!> actual production code 
experience.

Anyone with an iota of an experience in production code knows how fast 
systems get swapped out, and that was the trivial yet telling point 
McCarthy made in teasing Rahul and that particular defense of 
standardization.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Friedrich Dominicus
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <878y10qmxe.fsf@flarge.here>
Kenny Tilton <·······@nyc.rr.com> writes:

> While everyone was laughing at "you do not look old enough...", and
> Rahul was protesting that he meant "in the future", McCarthy slipped
> in the mumble making clear that his point was simply that very little
> code (from anyone!) lasts long enough to justify freezing a
> language. 

Well, exactly that is not the case. Why was there a year 2000 problem?
Because nobody expected software to survive longer than a few
years. How many billions of dollars were spent on fixing that
"could-not-survive-so-long" software?

And every standard is just the snapshot in time. How many standards do
exist for Fortran? How many for C?

Friedrich

-- 
Please remove just-for-news- to reply via e-mail.
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9gbnd$d18$1@domitilla.aioe.org>
Friedrich Dominicus wrote:
> Well exactly that is not the case. Why was there a year 2000 problem?
What problem?  I mean, did anybody see the problem (aside from a couple of COBOL
programs)?
> Because nobody expected software to survice longer then a few
> years. How many billions of dollars were spend on fixing those
> "could-not-survive-so-long" software.
Well, the money spent in solving the Y2K problem is known to be one of the
biggest wastes in the IT market (has anyone evaluated upgrading-to-XP
yet?)

> And every standard is just the snapshot in time. How many standards do
> exist for Fortran? How many for C?
personally I see lots of projects without a standard evolve faster than
Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
dropping the standard (it doesn't help portability a lot, because if you use
multithreading or sockets or one of those things not standardized *yet*, you
are screwed). But I'm just a newbie.
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Friedrich Dominicus
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <874qboqima.fsf@flarge.here>
Pupeno <······@pupeno.com> writes:

> Friedrich Dominicus wrote:
>> Well exactly that is not the case. Why was there a year 2000 problem?
> What problem?  I mean, did anybody see the problem (aside from a couple of COBOL
> programs)?
>> Because nobody expected software to survive longer than a few
>> years. How many billions of dollars were spent on fixing those
>> "could-not-survive-so-long" software.
> Well, the money spent in solving the Y2K problem is known to be one of the
> biggest wastes in the IT market (has anyone evaluated upgrading-to-XP
> yet?)
Well, this is a statement which I doubt very much. What would have
happened if the money had not been spent? We don't know, because
nobody dared to ignore it. Now let us hope that security gets at
least that much attention....

>
>> And every standard is just the snapshot in time. How many standards do
>> exist for Fortran? How many for C?
> personally I see lots of projects without a standard evolve faster than
> Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
> dropping the standard (it doesn't help portability a lot because if you use
> multithreading or sockets or one of those things not standarized *yet*, you
> are screwed. But, I'm just a newbie.
Well, have you tried to run a Perl 3 program with Perl 5?

Have you read about the trouble going from Perl 4 to Perl 5? Why do
you think those branches are still in use? Because most people
do not like to rewrite their code over and over again because of
changes in some implementation.

I have no
problem compiling fifteen-year-old C programs. And the same is true
for quite some Lisp code out there which was explicitly written to be
portable.

Standards are, IMHO, one of the sharpest tools the "users" have against
the "vendors". I have seen in at least one other programming language
what happens if the users do not have something like a standard.

And you can see how happily the big ones ignore standards. Just check
the MSVC documentation on the C language. 

Of course that is just a problem if you have more than one
implementation. But fortunately we do have the choice. If you do think
standards are good, I suggest you check back to the thread about the
"modern lisp" mode from Franz.

I would not care as much about standards if backward compatibility
were one of the highest priorities. I can say that for C and
Common Lisp, at least, I found this high priority. Guess what:
Franz even has a library to cope with Flavors!

In a language with such a long history as Lisp, standards are a
blessing.

Friedrich

-- 
Please remove just-for-news- to reply via e-mail.
From: Kirk Job Sluder
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87y88zg2kv.fsf@debian.kirkjobsluder.is-a-geek.net>
Pupeno <······@pupeno.com> writes:

> Friedrich Dominicus wrote:
> personally I see lots of projects without a standard evolve faster than
> Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
> dropping the standard (it doesn't help portability a lot because if you use
> multithreading or sockets or one of those things not standardized *yet*, you
> are screwed. But, I'm just a newbie.

Well, I would argue that these languages do have de-facto
standardization due to the fact that one distribution dominates.  Almost
all python projects use cpython which so far has made some effort to
maintain backwards compatibility.  (This might change with version 3.)
Perl5 likewise is a de-facto standard.  

> -- 
> Pupeno <······@pupeno.com> (http://pupeno.com)
> Reading ? Science Fiction ? http://sfreaders.com.ar

-- 
Kirk Job-Sluder
"The square-jawed homunculi of Tommy Hilfinger ads make every day an
existential holocaust."  --Scary Go Round
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hjid$msv$1@domitilla.aioe.org>
Kirk Job Sluder wrote:
> Well, I would argue that these languages do have de-facto
> standardization due to the fact that one distribution dominates.  Almost
> all python projects use cpython which so far has made some effort to
> maintain backwards compatibility.  (This might change with version 3.)
> Perl5 likewise is a de-facto standard.
Yes, I agree, and it was discussed on other sub-threads. My point is that *I*
prefer the one-implementation/de-facto-standard model to the ANSI
model.
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86ll4z1tt3.fsf@drjekyll.mkbuelow.net>
Pupeno <······@pupeno.com> writes:

>Yes, I agree and it was discussed on other sub-threads. My point is that *I*
>prefer the one-implementation/de-facto-standard model to the ANSI
>model.

I don't think that python and perl constitute de-facto standards at
all, since even for a de-facto standard, you have to consider
long-term feature stability, long, well defined standardization
cycles, accountability and the existence of alternative providers. If
van Rossum and whoever runs Perl these days desire to change their
languages completely, who's going to stop them? Contrast that with
Sun, who basically are constrained by their large customer base, and
are actually providing a de-facto standard (which other vendors, like
IBM, are following). I think there's a difference in quality here,
about what constitutes a de-facto standard, and what is "just" a
single implementation.

mkb.
From: Friedrich Dominicus
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87psuakey2.fsf@flarge.here>
Kirk Job Sluder <····@jobsluder.net> writes:

> Pupeno <······@pupeno.com> writes:
>
>> Friedrich Dominicus wrote:
>> personally I see lots of projects without a standard evolve faster than
>> Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
>> dropping the standard (it doesn't help portability a lot because if you use
>> multithreading or sockets or one of those things not standardized *yet*, you
>> are screwed. But, I'm just a newbie.
I would appreciate it if you cited me correctly. I did not write that
paragraph, and I strongly argued *against* it.

Friedrich
-- 
Please remove just-for-news- to reply via e-mail.
From: Ulrich Hobelmann
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i2jmsFjkbv6U1@individual.net>
Pupeno wrote:
> personally I see lots of projects without a standard evolve faster than
> Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
> dropping the standard (it doesn't help portability a lot because if you use
> multithreading or sockets or one of those things not standardized *yet*, you
> are screwed. But, I'm just a newbie.

Then have fun porting your stuff from PHP3 to 4, and now to PHP5 etc.

I'm happy I never used those languages!

-- 
Don't let school interfere with your education. -- Mark Twain
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hig6$lba$1@domitilla.aioe.org>
Ulrich Hobelmann wrote:

> Pupeno wrote:
>> personally I see lots of projects without a standard evolve faster than
>> Common Lisp (Perl, Python, Ruby, PHP). I would be all on the side of
>> dropping the standard (it doesn't help portability a lot because if you
>> use multithreading or sockets or one of those things not standardized
>> *yet*, you are screwed. But, I'm just a newbie.
> 
> Then have fun porting your stuff from PHP3 to 4, and now to PHP5 etc.
As a PHP developer I prefer porting stuff to PHP 5 and gaining a better
object model to continuing with PHP 4.

> I'm happy I never used those languages!
You are lucky; for me, they are the only source of income.
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Espen Vestre
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <kwy88zq0yq.fsf@merced.netfonds.no>
Ulrich Hobelmann <···········@web.de> writes:

> Then have fun porting your stuff from PHP3 to 4, and now to PHP5 etc.
>
> I'm happy I never used those languages!

Ouch. Now you've reminded me of the painful first 5.* versions of perl.
Some of the 0.01 versions were like completely new languages :-(
-- 
  (espen)
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119633307.a3adb67a2107562e1f45e015739d51b0@teranews>
On Fri, 24 Jun 2005 16:33:01 +0200, <·····@vestre.net> wrote:
>
>
> Ulrich Hobelmann <···········@web.de> writes:
>
>> Then have fun porting your stuff from PHP3 to 4, and now to PHP5 etc.
>>
>> I'm happy I never used those languages!
>
> Oouch. Now you reminded me of the painful first 5.* versions of perl.
> Some of the 0.01 versions were like completely new languages :-(


This is exactly what McCarthy was arguing FOR.  But it wasn't an
audience from the ivory tower, it was an audience that either made
money from lisp or wanted to do so.  The ivory tower runs on
publishing papers; big changes give them more opportunities for
papers.  Not the right audience for that point.


"Elephants never forget"

Elephants are trained at a young age with a heavy, unbreakable chain
tying them down.  They strain against it for a long time, then stop
and never try again.  Afterwards they can be bound with a rope.

'Never forgetting' is not how humans operate, a funny thing to hear
from a 'human-class AI proponent'.

-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119633304.314c95541108dcb60ca1810595e7c73b@teranews>
On Fri, 24 Jun 2005 08:38:37 +0200, <···················@q-software-solutions.de> wrote:
>
> Well exactly that is not the case. Why was there a year 2000 problem?

Y2K was the biggest joke of the century.  Really showed how poor the
educational system is nowadays, and how willingly people believe that
burning money fixes the world.  The wolves separated many $$ from the
sheep with that one.


NEXT UP:  The "666 problem" AKA June 6, 2006.  lol
From: ···············@yahoo.com
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <2b0a7c2a2d4d48c747bf2c380240@news.chi.sbcglobal.net>
Gl> On Fri, 24 Jun 2005 08:38:37 +0200,
Gl> <···················@q-software-solutions.de> wrote:
[snip]
Gl> Y2K was the biggest joke of the century.  Really showed how poor the
Gl> educational system is nowdays, and how willingly people believe that
Gl> burning money fixes the world.  The wolves separated many $$ from
Gl> the sheep with that one.

Messages like this in this thread have made me almost (but not quite) speechless. 
:-)

I worked in administrative data processing for 20 years at a major private 
university. I was the programmer responsible for maintaining what I think 
was the first admin system there to encounter the Y2K problem and require 
partial re-writing. The admin system for a school associated with the university 
used the high-school graduating year as the way to classify the students, 
it used a two-digit year, and it had a pre-school. So some time around 1985 
the little toddlers enrolling were the class of 2000, but if you sorted
by year you'd find them graduating ahead of the class of 1985, zero being 
less than 85.
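The failure mode Clark describes can be sketched in a few lines of Python (a hypothetical reconstruction, not the university's actual code): with two-digit years, the string "00" sorts before "85".

```python
# Hypothetical sketch of the two-digit-year bug: graduating classes
# stored as two-digit strings put the class of 2000 first.
classes = ["85", "90", "95", "00"]
print(sorted(classes))  # ['00', '85', '90', '95'] -- the toddlers "graduate" first

# Widening to four digits (here with an assumed 1950 pivot) restores the order.
fixed = sorted("19" + y if y >= "50" else "20" + y for y in classes)
print(fixed)            # ['1985', '1990', '1995', '2000']
```

The same ordering bug appears whether the field is a string or a two-digit integer; only the width of the stored year matters.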

From there on until 2000, administrative system after administrative system 
encountered the problem and had to have significant work done to fix the 
problem. For instance, an academic personnel system couldn't handle faculty 
appointment termination dates after 12/31/1999. Such cases kept coming up 
in system after system. Because of a lot of good work by a lot of people 
over a decade and a half, there weren't any significant Y2K problems in the 
administrative systems at that university.

Please don't say that Y2K was a joke or a fraud.

Clark Wilson

P.S.  Separate topic: As for the longevity of realworld systems, I personally 
know of an accounting application (run only once a year) that was in production 
into the early 1990s at a major private university, running on an IBM 1401 
machine being emulated on the current IBM mainframe at that time. (The IBM 
1401 mainframe was announced in 1959 and withdrawn in 1971. It had 4,000 
bytes of main storage, expandable up to a lordly 16,000 bytes in later models.)
From: Greg Menke
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3psu9g4yy.fsf@athena.pienet>
···············@yahoo.com writes:

> Gl> On Fri, 24 Jun 2005 08:38:37 +0200,
> Gl> <···················@q-software-solutions.de> wrote:
> [snip]
> Gl> Y2K was the biggest joke of the century.  Really showed how poor the
> Gl> educational system is nowdays, and how willingly people believe that
> Gl> burning money fixes the world.  The wolves separated many $$ from
> Gl> the sheep with that one.
> 
> Messages like this in this thread have made me almost (but not quite)
> speechless. :-)
> 
> I worked in administrative data processing for 20 years at a major
> private university. I was the programmer responsible for maintaining
> what I think was the first admin system there to encounter the Y2K
> problem and require partial re-writing. The admin system for a school
> associated with the university used the high-school graduating year as
> the way to classify the students, it used a two-digit year, and it had a
> pre-school. So some time around 1985 the little toddlers enrolling were
> the class of 2000 and but if you sorted by year you'd find them
> graduating ahead of the class of 1985, zero being less than 85.


Y2K was pretty much irrelevant as far as problems go for the big
financial systems I worked on.  The killer was leap years -- gotta account
for the varying number of days properly.  1999-2000 was a no-brainer.
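The leap-year trap Greg mentions is worth spelling out: 2000 was a leap year even though century years usually are not, so naive shortcuts broke in both directions. A minimal sketch of the full Gregorian rule:

```python
def is_leap(year):
    # Full Gregorian rule: every 4th year is a leap year,
    # except centuries, except every 400th year.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# 1900: no (century), 2000: yes (divisible by 400), 1999: no.
print(is_leap(1900), is_leap(2000), is_leap(1999))  # False True False
```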

Gregm
From: Hartmann Schaffer
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1jFve.4846$EP2.23707@newscontent-01.sprint.ca>
···············@yahoo.com wrote:
> ...
> I worked in administrative data processing for 20 years at a major 
> private university. I was the programmer responsible for maintaining 
> what I think was the first admin system there to encounter the Y2K 
> problem and require partial re-writing. The admin system for a school 
> associated with the university used the high-school graduating year as 
> the way to classify the students, it used a two-digit year, and it had a 
> pre-school. So some time around 1985 the little toddlers enrolling were 
> the class of 2000 and but if you sorted by year you'd find them 
> graduating ahead of the class of 1985, zero being less than 85.

lots of administrative systems were affected, but there was also quite a 
lot of unnecessary hype and scaremongering about Y2K, and i wouldn't be 
surprised if some money was wasted because of that.  remember the dire 
predictions of elevators crashing and planes falling down in mid-flight? 
  i am still trying to figure out how an incorrect date algorithm could 
affect those

> ...

hs
From: Paul Wallich
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9nfrp$e1r$1@reader1.panix.com>
Hartmann Schaffer wrote:
> ···············@yahoo.com wrote:
> 
>> ...
>> I worked in administrative data processing for 20 years at a major 
>> private university. I was the programmer responsible for maintaining 
>> what I think was the first admin system there to encounter the Y2K 
>> problem and require partial re-writing. The admin system for a school 
>> associated with the university used the high-school graduating year as 
>> the way to classify the students, it used a two-digit year, and it had 
>> a pre-school. So some time around 1985 the little toddlers enrolling 
>> were the class of 2000 and but if you sorted by year you'd find them 
>> graduating ahead of the class of 1985, zero being less than 85.
> 
> 
> lots of administrative systems were effected, but there was also quite a 
> lot of unnecessary hype and scaremongering about Y2K, and i wouldn't be 
> surprised if some money was wasted because of that.  remember the dire 
> predictions of elevators crashing and planes falling down in mid-flight? 
>  i am still trying to figure out how an incorrect date algorithm could 
> effect those

Well, there were the stories about fly-by-wire aircraft flipping 
upside down when they crossed the equator...

I imagine that any serious navigation system probably wants to know the 
exact time and date, and although that should be stored in some 
unbreakable format, time and date entries might well be converted in a 
way that led to bad results without proper checking (see Ariane for an 
example of improper checking). Elevators would be harder to crash, 
unless some idiot had one CPU controlling the whole cab, including the 
(about-to-go-haywire) clock display. Obviously in carefully written 
modular software there would be no interactions among such modules, and 
no race conditions triggered if the time/date module went flooey, but 
lots of software ain't like that. (Remember all the web pages that 
displayed dates of 19100.)
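The "19100" dates Paul mentions came from code that concatenated a literal "19" onto a year field that counts years since 1900 (as C's struct tm does); a minimal sketch:

```python
years_since_1900 = 100  # tm_year-style field: 100 means the year 2000

# The buggy pattern: string concatenation instead of arithmetic.
buggy = "19" + str(years_since_1900)
print(buggy)     # 19100

# The fix: add, don't concatenate.
correct = 1900 + years_since_1900
print(correct)   # 2000
```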

paul

From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87mzpcurb8.fsf@plato.moon.paoloamoroso.it>
Paul Wallich <··@panix.com> writes:

> Well, there were the stories about fly-by-wire aircraft flipping
> upsidedown when they crossed the equator...

I heard a similar story, but the aircraft flipped upside down *in a
simulation*.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <AZWdnTptGOEumlzfRVn-hw@speakeasy.net>
Paolo Amoroso  <·······@mclink.it> wrote:
+---------------
| Paul Wallich <··@panix.com> writes:
| > Well, there were the stories about fly-by-wire aircraft flipping
| > upsidedown when they crossed the equator...
| 
| I heard a similar story, but the aircraft flipped upside down *in a
| simulation*.
+---------------

These may all be urban legend variations of a different story
altogether: What is certainly true is that very-low-altitude
terrain-following autopilots have code in them to, in certain cases,
roll the plane upside down when going over the top of a large hill
or mountain, so that as the plane dives back down the far side of the
hill the pilot pulls +G's [of which the human body and airframe are
both more tolerant] rather than -G's, and then rolls upright again
at the bottom of the back of the hill, of course.

The [possibly urban legend] story is that an early version of the
software would, when flying across a completely flat prairie, roll
the plane inverted then immediately upright again when it passed
over a cow.  ;-}


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Fred Gilham
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u71x6ma4r4.fsf@snapdragon.csl.sri.com>
····@rpw3.org (Rob Warnock) writes:

> Paolo Amoroso  <·······@mclink.it> wrote:
> +---------------
> | Paul Wallich <··@panix.com> writes:
> | > Well, there were the stories about fly-by-wire aircraft flipping
> | > upsidedown when they crossed the equator...
> | 
> | I heard a similar story, but the aircraft flipped upside down *in a
> | simulation*.
> +---------------
> 
> These may all be urban legend variations of a different story
> altogether: What is certainly true is that very-low-altitude
> terrain-following autopilots have code in them to, in certain cases,
> roll the plane upside down when going over the top of a large hill
> or mountain, so that as the plane dives back down the far side of the
> hill the pilot pulls +G's [of which the human body and airframe are
> both more tolerant] rather than -G's, and then rolls upright again
> at the bottom of the back of the hill, of course.
> 
> The [possibly urban legend] story is that an early version of the
> software would, when flying across a completely flat prairie, roll
> the plane inverted then immediately upright again when it passed
> over a cow.  ;-}


Well, since I sit a couple doors down from the person (Peter Neumann)
who moderates the Computer Risks forum, I naturally looked this up in
the computer risks archive.  Here's the article:


----------------------------------------
 F-16 Problems (from Usenet net.aviation)
Bill Janssen <·······@mcc.com>
Wed, 27 Aug 86 14:31:45 CDT

A friend of mine who works for General Dynamics here in Ft. Worth
wrote some of the code for the F-16, and he is always telling me about
some neato-whiz-bang bug/feature they keep finding in the F-16:

o Since the F-16 is a fly-by-wire aircraft, the computer keeps the
pilot from doing dumb things to himself. So if the pilot jerks hard
over on the joystick, the computer will instruct the flight surfaces
to make a nice and easy 4 or 5 G flip. But the plane can withstand a
much higher flip than that.  So when they were 'flying' the F-16 in
simulation over the equator, the computer got confused and instantly
flipped the plane over, killing the pilot [in simulation].  And since
it can fly forever upside down, it would do so until it ran out of
fuel.

(The remaining bugs were actually found while flying, rather than in 
simulation):

o One of the first things the Air Force test pilots tried on an early
F-16 was to tell the computer to raise the landing gear while standing
still on the runway. Guess what happened? Scratch one F-16. (my friend
says there is a new subroutine in the code called 'wait_on_wheels'
now...)  [weight?]

o The computer system onboard has a weapons management system that
will attempt to keep the plane flying level by dispersing weapons and
empty fuel tanks in a balanced fashion. So if you ask to drop a bomb,
the computer will figure out whether to drop a port or starboard bomb
in order to keep the load even. One of the early problems with that
was the fact that you could flip the plane over and the computer would
gladly let you drop a bomb or fuel tank. It would drop, dent the wing,
and then roll off.

There are some really remarkable things about the F-16. And some even
more remarkable things in the new F-16C and D models:

o They are adding two movable vents called 'canards' that will be
installed near the engine intake vent under where the pilot sits. By
doing some fancy things with the flight surfaces and slick
programming, they can get the F-16 to fly almost sideways through the
air. Or flat turns (no banking!). Or fly level with the nose pointed
30 degrees down or up (handy for firing the guns at the ground or
other aircraft).

I figured this stuff can't be too classified, since I heard the almost
same thing from two different people who work at GD. I hope the Feds
don't get too upset...

George Moore (··@trsvax.UUCP)


-- 
Fred Gilham                                        ······@csl.sri.com
The PTPL (People's Trotskyist Programming League) believes that
hackers are elitist and that all software should be created by the
masses flailing away at millions of keyboards.  I didn't have the
heart to tell them that this has already been tried and the result is
called Linux.
From: Rob Warnock
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <DdKdnSRKm4dqjV_fRVn-oA@speakeasy.net>
Fred Gilham  <······@snapdragon.csl.sri.com> wrote:
+---------------
| ····@rpw3.org (Rob Warnock) writes:
| > Paolo Amoroso  <·······@mclink.it> wrote:
| > +---------------
| > | Paul Wallich <··@panix.com> writes:
| > | > Well, there were the stories about fly-by-wire aircraft flipping
| > | > upsidedown when they crossed the equator...
| > | 
| > | I heard a similar story, but the aircraft flipped upside down *in a
| > | simulation*.
| > +---------------
| > 
| > These may all be urban legend variations of a different story
| > altogether: What is certainly true is that very-low-altitude
| > terrain-following autopilots have code in them to, in certain cases,
| > roll the plane upside down when going over the top of a large hill...
...
| > The [possibly urban legend] story is that an early version of the
| > software would, when flying across a completely flat prairie, roll
| > the plane inverted then immediately upright again when it passed
| > over a cow.  ;-}
| 
| Well, since I sit a couple doors down from the person (Peter Neumann)
| who moderates the Computer Risks forum, I naturally looked this up in
| the computer risks archive.  Here's the article:
| ----------------------------------------
|  F-16 Problems (from Usenet net.aviation)
| Bill Janssen <·······@mcc.com>
| Wed, 27 Aug 86 14:31:45 CDT
...
| o Since the F-16 is a fly-by-wire aircraft, the computer keeps the
| pilot from doing dumb things to himself. So if the pilot jerks hard
| over on the joystick, the computer will instruct the flight surfaces
| to make a nice and easy 4 or 5 G flip. But the plane can withstand a
| much higher flip than that.  So when they were 'flying' the F-16 in
| simulation over the equator, the computer got confused and instantly
| flipped the plane over, killing the pilot [in simulation].  And since
| it can fly forever upside down, it would do so until it ran out of
| fuel.
+---------------

O.k., so that validates the "crossing the equator...in simulation" story,
but says nothing about nap-of-the-earth terrain-following autopilots
[and "flip...over cow" stories"], which I believe came much later.
Maybe even as late as the F-111?


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Fred Gilham
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u7vf3x95dm.fsf@snapdragon.csl.sri.com>
····@rpw3.org (Rob Warnock) writes:

> Fred Gilham  <······@snapdragon.csl.sri.com> wrote:
> +---------------
> | ····@rpw3.org (Rob Warnock) writes:
> | > Paolo Amoroso  <·······@mclink.it> wrote:
> | > +---------------
> | > | Paul Wallich <··@panix.com> writes:
> | > | > Well, there were the stories about fly-by-wire aircraft flipping
> | > | > upsidedown when they crossed the equator...
> | > | 
> | > | I heard a similar story, but the aircraft flipped upside down *in a
> | > | simulation*.
> | > +---------------
> | > 
> | > These may all be urban legend variations of a different story
> | > altogether: What is certainly true is that very-low-altitude
> | > terrain-following autopilots have code in them to, in certain cases,
> | > roll the plane upside down when going over the top of a large hill...
> ...
> | > The [possibly urban legend] story is that an early version of the
> | > software would, when flying across a completely flat prairie, roll
> | > the plane inverted then immediately upright again when it passed
> | > over a cow.  ;-}
> | 
> | Well, since I sit a couple doors down from the person (Peter Neumann)
> | who moderates the Computer Risks forum, I naturally looked this up in
> | the computer risks archive.  Here's the article:
> | ----------------------------------------
> |  F-16 Problems (from Usenet net.aviation)
> | Bill Janssen <·······@mcc.com>
> | Wed, 27 Aug 86 14:31:45 CDT
> ...
> | o Since the F-16 is a fly-by-wire aircraft, the computer keeps the
> | pilot from doing dumb things to himself. So if the pilot jerks hard
> | over on the joystick, the computer will instruct the flight surfaces
> | to make a nice and easy 4 or 5 G flip. But the plane can withstand a
> | much higher flip than that.  So when they were 'flying' the F-16 in
> | simulation over the equator, the computer got confused and instantly
> | flipped the plane over, killing the pilot [in simulation].  And since
> | it can fly forever upside down, it would do so until it ran out of
> | fuel.
> +---------------
> 
> O.k., so that validates the "crossing the equator...in simulation" story,
> but says nothing about nap-of-the-earth terrain-following autopilots
> [and "flip...over cow" stories"], which I believe came much later.
> Maybe even as late as the F-111?

The F-111 is a Viet Nam War era bomber[*] --- not fly-by-wire.  It
antedates the "teenagers" --- the F-14, F-15 and F-16 which were
developed in the 70s.  The F-111 did have a terrain-following system.
I don't know what cow-flipping problems it may have had. :-)


[*] To everyone but F-111 pilots, who greatly resent being called
bomber pilots.

-- 
Fred Gilham                                        ······@csl.sri.com
The PTPL (People's Trotskyist Programming League) believes that
hackers are elitist and that all software should be created by the
masses flailing away at millions of keyboards.  I didn't have the
heart to tell them that this has already been tried and the result is
called Linux.
From: Joe Marshall
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ll4svhho.fsf@comcast.net>
Fred Gilham <······@snapdragon.csl.sri.com> writes:

> The F-111 did have a terrain-following system.
> I don't know what cow-flipping problems it may have had. :-)

So cow-flipping is as anecdotal as cow-tipping?


-- 
~jrm
From: Larry Clapp
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <slrndbuerq.iq.larry@theclapp.ddts.net>
In article <····················@newscontent-01.sprint.ca>, Hartmann Schaffer wrote:
> lots of administrative systems were effected, but there was also
> quite a lot of unnecessary hype and scaremongering about Y2K, and i
> wouldn't be surprised if some money was wasted because of that.
> remember the dire predictions of elevators crashing and planes
> falling down in mid-flight? 
>   i am still trying to figure out how an incorrect date algorithm
> could effect those

Really?  Why?  I don't know a lot of nitty-gritty details behind the
Y2K problem (so this specific example might be bogus) but consider an
elevator program that calculates velocity like so:

  v = (d1 - d0) / (t1 - t0)

The time is 31 Dec, 1999, 23:59:59.  A second from now, t1 will = 0,
leaving v negative and very small.  The elevator program either
crashes, leaving you stranded, or applies a great deal of thrust the
other way, or freaks out in some other way.  Perhaps the next second
it will recover -- but perhaps not.  It's almost by definition badly
written, or the Y2K bug wouldn't've caused a problem for it.  So who
knows?

Consider any computer program that suddenly thinks ONE HUNDRED YEARS
have elapsed since its last action.  Fuel regulators.  Intravenous
drip monitors.  Consider financial programs.  Consider anything that
calculates change over time.

Maybe nothing would have happened if no one had done anything.  But I
doubt it.  Maybe only a few people would have *died* -- but I bet *a
lot* of businesses would have gotten *sued*.

Anyway, I can easily imagine a bad date algorithm affecting lots of
stuff.
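Larry's formula can be made concrete (a hypothetical controller, not any real elevator): if the timestamp embeds a two-digit year, then at the 1999->2000 rollover the computed interval, and with it the velocity estimate, goes hugely negative.

```python
SECONDS_PER_YEAR = 365 * 24 * 3600   # leap days ignored in this sketch

def timestamp(two_digit_year, second_of_year):
    # Hypothetical timestamp that embeds a two-digit year.
    return two_digit_year * SECONDS_PER_YEAR + second_of_year

t0 = timestamp(99, SECONDS_PER_YEAR - 1)  # 31 Dec 1999, last second
t1 = timestamp(0, 0)                      # one second later, year "00"

d0, d1 = 100.0, 101.5                     # car positions in metres
v = (d1 - d0) / (t1 - t0)                 # Larry's v = (d1 - d0) / (t1 - t0)
print(t1 - t0 < 0)  # True: the "elapsed" time is roughly minus 100 years
print(v < 0)        # True: velocity comes out negative and very small
```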

-- L
From: Hartmann Schaffer
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <xBYve.4976$EP2.24258@newscontent-01.sprint.ca>
Larry Clapp wrote:
> ...
>>lots of administrative systems were effected, but there was also
>>quite a lot of unnecessary hype and scaremongering about Y2K, and i
>>wouldn't be surprised if some money was wasted because of that.
>>remember the dire predictions of elevators crashing and planes
>>falling down in mid-flight? 
>>  i am still trying to figure out how an incorrect date algorithm
>>could effect those
> 
> Really?  Why?  I don't know a lot of nitty-gritty details behind the
> Y2K problem (so this specific example might be bogus) but consider an
> elevator program that calculates velocity like so:
> 
>   v = (d1 - d0) / (t1 - t0)

most embedded systems (the ones used to run those devices) have a timer 
that is not date related.  i would expect that to be used

> ...
> Consider any computer program that suddenly thinks ONE HUNDRED YEARS
> have elapsed since its last action.  Fuel regulators.  Intravenous
> drip monitors.

again, use clock ticks rather than date/time difference
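The distinction hs is drawing is the one between a wall clock and a monotonic tick counter; in Python terms (a sketch of the safe pattern, not any embedded system's actual code):

```python
import time

# time.time() follows the wall clock and can jump (NTP steps, DST,
# date bugs); time.monotonic() only ever moves forward, so it is the
# right source for elapsed-time measurements in control code.
start = time.monotonic()
time.sleep(0.01)
elapsed = time.monotonic() - start
print(elapsed > 0)  # True, no matter what the wall clock does meanwhile
```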

>  Consider financial programs.  Consider anything that
> calculates change over time.

agreed, and my comment didn't address them

> ...

hs
From: Larry Clapp
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <slrndc1619.iq.larry@theclapp.ddts.net>
In article <····················@newscontent-01.sprint.ca>, Hartmann Schaffer wrote:
> Larry Clapp wrote:
>> ...
>>>lots of administrative systems were effected, but there was also
>>>quite a lot of unnecessary hype and scaremongering about Y2K, and i
>>>wouldn't be surprised if some money was wasted because of that.
>>>remember the dire predictions of elevators crashing and planes
>>>falling down in mid-flight? 
>>>  i am still trying to figure out how an incorrect date algorithm
>>>could effect those
>> 
>> Really?  Why?  I don't know a lot of nitty-gritty details behind
>> the Y2K problem (so this specific example might be bogus) but
>> consider an elevator program that calculates velocity like so:
>> 
>>   v = (d1 - d0) / (t1 - t0)
> 
> most embedded systems (the ones used to run those devices) have a
> timer that is not date related.  i would expect that to be used

Do you mean that "it *should* use the clock" or "it *does* use the
clock"?

>> ...
>> Consider any computer program that suddenly thinks ONE HUNDRED
>> YEARS have elapsed since its last action.  Fuel regulators.
>> Intravenous drip monitors.
> 
> again, use clock ticks rather than date/time difference

Same as above: are you saying that such systems *should* use a clock
and count ticks, or that they *do*?  I'm not trying to be a smart-ass,
I honestly can't tell which point you want to make.

If you mean that they *do*, and that they always have, then I'd agree
with you that the elevator and airplane predictions seem curious.  If
you mean that they *should* but *didn't*, well, that's the point,
isn't it?  :)

>> Consider financial programs.  Consider anything that calculates
>> change over time.
> 
> agreed, and my comment didn't address them

Okay.

-- Larry
From: Hartmann Schaffer
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <W2Jwe.5343$EP2.26018@newscontent-01.sprint.ca>
Larry Clapp wrote:
> ...
>>>> i am still trying to figure out how an incorrect date algorithm
>>>>could effect those
>>>
>>>Really?  Why?  I don't know a lot of nitty-gritty details behind
>>>the Y2K problem (so this specific example might be bogus) but
>>>consider an elevator program that calculates velocity like so:
>>>
>>>  v = (d1 - d0) / (t1 - t0)
>>
>>most embedded systems (the ones used to run those devices) have a
>>timer that is not date related.  i would expect that to be used
> 
> 
> Do you mean that "it *should* use the clock" or "it *does* use the
> clock"?

let's put it this way:  they should, and i have never seen one that 
doesn't (of course, i haven't seen all of them, but i would claim that 
such a system wouldn't be a Y2K problem but a poor design problem)

> ..

hs
From: GP lisper
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119634203.6e62e1c9e7ba09fbd266ffb6da3a5a66@teranews>
On 24 Jun 2005 11:21:43 +0100, <······@hexapodia.net> wrote:
>
> Kenny Tilton <·······@nyc.rr.com> writes:
>
> [ SNIP ]
>> Anyone with an iota of an experience in production code knows how fast
>> systems get swapped out, and that was the trivial yet telling point
>> McCarthy made in teasing Rahul and that particular defense of
>> standardization.
>
> Anyone with experience of production code? My experience is that
> anything that gets installed runs until the circumstances have
> changed enough that it is no longer viable keeping the code in
> production. I saw PDP-11 machines running physiological monitors as
> late as 1992, but they were scheduled for throwing out, not because
> the code (about 12 years old at that point) was bad or that massive
> advancements in sensors had been done in the last 12 years, but
> because it was at a point that the hardware itself was no longer
> sensibly supportable.

That is the routine in a money-poor environment.  A money-rich
environment, such as a trading floor, adopts new tech the moment that a
potential gain exists.

-- 
The LOOP construct is really neat, it's got a lot of knobs to turn!
Don't push the yellow one on the bottom.
From: Espen Vestre
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <kw1x6srqkl.fsf@merced.netfonds.no>
Kenny Tilton <·······@nyc.rr.com> writes:

> Anyone with an iota of an experience in production code knows how fast
> systems get swapped out, 

Fast? I had to interface to telco systems from the mid-seventies as 
late as in 1998.
-- 
  (espen)
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <APUue.26721$IX4.8623@twister.nyc.rr.com>
Espen Vestre wrote:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Anyone with an iota of an experience in production code knows how fast
>>systems get swapped out, 
> 
> 
> Fast? I had to interface to telco systems from the mid-seventies as 
> late as in 1998.

<g> Look, guys, you cannot win! In some cases these systems are just 
running the same binary. Hell, the source has likely been lost. So the 
larger point of "let languages evolve continually" stands. If these apps 
/are/ being changed or at least recompiled, well, someone either was 
bright enough to keep the /compiler/ binaries around, or the evolution 
of the language has not made it impossible to run systems for twenty 
years-- what language has not evolved since the seventies or even 
eighties? Even COBOL has changed a lot over the years, and I seem to 
recall dragging my C system across more than one interesting development 
in that tiny language.


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Espen Vestre
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <kwu0jnq0ij.fsf@merced.netfonds.no>
Kenny Tilton <·······@nyc.rr.com> writes:

> <g> Look, guys, you cannot win! In some cases these systems are just
> running the same binary. Hell, the source has likely been lost. So the
> larger point of "let languages evolve continually" stands.

Well, I take your point, and partially agree. However, if you
introduce the kind of backwards incompatibility that perl was plagued
with in the 5.0.* versions (5.0.3? I can't remember, must be my
selective memory :)), you're creating a mess, especially for a
language like perl where you have to keep umpteen versions of the
interpreter around for all those little scripts that you never will
have time to upgrade to the next 0.01 version.

I agree that large systems usually will be maintained all the time, so
most reasonable incremental changes to the language will be easy to
adapt to, since you're upgrading your development tools anyway (I
guess that seventies telco system was compiled with a much more recent
cobol implementation than the one used by the original developers...).
-- 
  (espen)
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uu0jnspkj.fsf@news.dtpq.com>
Espen Vestre <·····@vestre.net> writes:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> > Anyone with an iota of an experience in production code knows how fast
> > systems get swapped out, 
> 
> Fast? I had to interface to telco systems from the mid-seventies as 
> late as in 1998.

2000, here.
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <9hNue.13898$XB2.3622619@twister.nyc.rr.com>
Christopher C. Stacy wrote:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
>>* McCarthy actually meant that very little code lasts ten years.
> 
> 
> That would suggest a serious disconnect with reality;
> it's a little hard to believe.

Oh, please. You have no knowledge or experience of production code. It 
/always/ gets thrown away when it needs changing. By the time the 
corpolopolis acknowledges change is needed, the old code is too rotten 
to refactor.

i.e., No, I was just having fun, JMcC did not really slam Rahul, he just 
made an easy point: production code regularly gets tossed, because it is 
so much easier to rewrite than salvage. And if you are re-salvaging, you 
may as well change syntax here and there.


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uoe9w9ss1.fsf@news.dtpq.com>
Kenny Tilton <·······@nyc.rr.com> writes:

> Christopher C. Stacy wrote:
> 
> > Kenny Tilton <·······@nyc.rr.com> writes:
> >
> >>* McCarthy actually meant that very little code lasts ten years.
> > That would suggest a serious disconnect with reality;
> > it's a little hard to believe.
> 
> Oh. please. You have no knowledge or experience of production code.

I've been delivering production code since about 1976,
so I guess if I haven't figured anything out by now,
I'm not likely to ever figure it out.  (So you should
probably stop wasting time lecturing me about it.)
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <VoUue.26712$IX4.18707@twister.nyc.rr.com>
Christopher C. Stacy wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Christopher C. Stacy wrote:
>>
>>
>>>Kenny Tilton <·······@nyc.rr.com> writes:
>>>
>>>
>>>>* McCarthy actually meant that very little code lasts ten years.
>>>
>>>That would suggest a serious disconnect with reality;
>>>it's a little hard to believe.
>>
>>Oh. please. You have no knowledge or experience of production code.
> 
> 
> I've been delivering production code since about 1976,

Then you know that bad systems get thrown out and good systems change 
over time because requirements change over time. If they are being 
thrown out, there you go. If they are being revised, there you go: a 
substantial new effort in which any change to the language probably only 
helps by making the language more expressive. This is the same reason 
that code quality matters and the fact that code works does not: any 
successful system grows to take on new functionality and at least lasts 
long enough to see requirements change. Good code means it will see more 
work.


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <upsubspiy.fsf@news.dtpq.com>
Kenny Tilton <·······@nyc.rr.com> writes:

> Christopher C. Stacy wrote:
> > Kenny Tilton <·······@nyc.rr.com> writes:
> >
> >>Christopher C. Stacy wrote:
> >>
> >>
> >>>Kenny Tilton <·······@nyc.rr.com> writes:
> >>>
> >>>
> >>>>* McCarthy actually meant that very little code lasts ten years.
> >>>
> >>>That would suggest a serious disconnect with reality;
> >>>it's a little hard to believe.
> >>
> >>Oh. please. You have no knowledge or experience of production code.
> > I've been delivering production code since about 1976,
> 
> Then you know that bad systems get thrown out

Remind me which planet you're from, again?
From: Kenny Tilton
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <XzWue.26730$IX4.22696@twister.nyc.rr.com>
Christopher C. Stacy wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Christopher C. Stacy wrote:
>>
>>>Kenny Tilton <·······@nyc.rr.com> writes:
>>>
>>>
>>>>Christopher C. Stacy wrote:
>>>>
>>>>
>>>>
>>>>>Kenny Tilton <·······@nyc.rr.com> writes:
>>>>>
>>>>>
>>>>>
>>>>>>* McCarthy actually meant that very little code lasts ten years.
>>>>>
>>>>>That would suggest a serious disconnect with reality;
>>>>>it's a little hard to believe.
>>>>
>>>>Oh. please. You have no knowledge or experience of production code.
>>>
>>>I've been delivering production code since about 1976,
>>
>>Then you know that bad systems get thrown out
> 
> 
> Remind me which planet you're from, again?

One where "system life cycle" includes death.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"If you plan to enter text which our system might consider to be 
obscene, check here to certify that you are old enough to hear the 
resulting output." -- Bell Labs text-to-speech interactive Web page
From: Jimka
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119647013.250505.5950@g47g2000cwa.googlegroups.com>
Unfortunately far too much code stays around for a lot longer than 10
years.  It is usually the code that should have been rewritten a long
time ago.
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87mzpf3bzb.fsf@thalassa.informatimago.com>
······@news.dtpq.com (Christopher C. Stacy) writes:

> Kenny Tilton <·······@nyc.rr.com> writes:
>> * McCarthy actually meant that very little code lasts ten years.
>
> That would suggest a serious disconnect with reality;
> it's a little hard to believe.

Indeed, code lasts much longer.  
Some _development_ _projects_ last longer!
It would not be too good if you had to rewrite these big projects
entirely before they're even finished.


However, it's possible that the only stultifying feature of Common
Lisp is the commandment that "Thou shalt not modify the definition of
COMMON-LISP symbols".

But since it is not too difficult to modify the behavior of Common
Lisp by defining a new package in which the modified functions are
defined and from which they're exported, why don't we see more
packages using some such extended Lisp package instead of
COMMON-LISP?


(defpackage "EXPERIMENTAL-LISP-USER"
   (:use "EXPERIMENTAL-LISP"))
(in-package "EXPERIMENTAL-LISP-USER")

(defun fact (n)
  (specified-as
     (= (fact 0) 1)
     (= (fact n) (* n (fact (1- n))))))
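Pascal's EXPERIMENTAL-LISP (and its SPECIFIED-AS construct) is hypothetical, but the package mechanism he alludes to is standard Common Lisp. A minimal sketch of how such an extended-lisp package could be set up, with all names made up for illustration:

```lisp
;; Sketch: a package that behaves like COMMON-LISP except for the
;; symbols it deliberately redefines.  Shadowing IF creates a fresh
;; symbol EXPERIMENTAL-LISP::IF, so redefining it does not touch
;; CL:IF and so does not violate the standard's prohibition.
(defpackage "EXPERIMENTAL-LISP"
  (:use "COMMON-LISP")
  (:shadow "IF")
  (:export "IF"))   ; a real version would also re-export the untouched CL symbols

(in-package "EXPERIMENTAL-LISP")

;; Example modification: an IF that allows any number of else-forms.
(defmacro if (test then &body elses)
  `(cl:if ,test ,then (progn ,@elses)))
```

Any package that does (:use "EXPERIMENTAL-LISP") instead of "COMMON-LISP" then sees the modified IF, while everything not shadowed falls through to the standard definitions.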


In conclusion, I bet you've just been prey to Prof. McCarthy's sense of humor.


-- 
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
__Pascal Bourguignon__                     http://www.informatimago.com/
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wy890w3y5.fsf@hundertwasser.ti.uni-mannheim.de>
Kenny Tilton <·······@nyc.rr.com> writes:

> [apologies if this has materialised in similar form or does so soon
> unbeknownst to me, but from where I sit it appears Google ate a
> similar report posted yesterday via google groups.]
> 
> Dr. McCarthy joined with Henry Baker, his predecessor at the
> microphone, in bemoaning the standardization of Common Lisp as
> stultifying if not mortifying, in that it ended innovation.

Well, other languages were standardized by the same heavyweight
process as CL (C, C++, SQL, Fortran).  While these languages sure
evolve slower than Perl/Python/Ruby/etc there is development.  The C++
standard is revised and extended every 5 years or so.  The current
version of Fortran is from 1995 and I believe there is a revision
process going on right now.  

For other languages there is, despite standardization, _no_ guarantee
that 20year old code will run or compile with a current
implementation.  Changes to any standard usually break some code (in
the simplest case by introducing a new keyword).  So, this notion that
"standard" implies "if my code runs now it will run forever" is wrong.
From: Stefan Nobis
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87acleai9t.fsf@snobis.de>
Matthias <··@spam.please> writes:

> Well, other languages were standardized by the same heavyweight
> process as CL (C, C++, SQL, Fortran).  While these languages sure
> evolve slower than Perl/Python/Ruby/etc there is development.  The C++
> standard is revised and extended every 5 years or so.

That's not quite correct. C++, for example, has had no new
standard version since the initial one of 1998 (the next one is
expected no earlier than 2006 or even 2007, and the changes seem to be
(very) small -- some clarifications, nearly no changes in the base
language but some additions to the library, mostly copied from
boost.org; at least as far as I know).

So all we need to be comparable to C++ in this regard is
more libraries with a peer review system like boost.org.

Another example: Ada. Last standards version 1995, next not before
2007 or 2008 (main changes are extended standards libraries: at
least a collection classes library will be added).

So it's all about libraries.

-- 
Stefan.
From: Alexander Kjeldaas
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42bd4ba7$1@news.broadpark.no>
Stefan Nobis wrote:
> Matthias <··@spam.please> writes:
> 
> 
>>Well, other languages were standardized by the same heavyweight
>>process as CL (C, C++, SQL, Fortran).  While these languages sure
>>evolve slower than Perl/Python/Ruby/etc there is development.  The C++
>>standard is revised and extended every 5 years or so.
> 
> 
> That's not quite correct. C++ for example has till now no new
> standard version after the initial one of 1998 (the next one is
> expected not before 2006 or even 2007 and changes seems to be
> (very) small -- some clarifications, nearly no changes in base
> language but some additions to the library, mostly copied from
> boost.org; at least as far as i know).
> 

Look at ftp://ftp.nuug.no/pub/dist/20050415-stroustrup.pdf for some 
slides from Stroustrup about where c++0x is going.  Stroustrup gave this 
talk after the Lillehammer meeting of the C++ standards committee.

Because of the popularity of template metaprogramming, c++0x will 
probably include major changes to make template metaprogramming easier. 
Auto-typing, concepts (a type system for types), simple compile-time 
reflection, etc. are improvements in this area.
Concepts alone is a major change IMO.

> So everything we need to be comparable to C++ in this regard is
> more libraries with a peer review system like boost.org.
> 
> Another example: Ada. Last standards version 1995, next not before
> 2007 or 2008 (main changes are extended standards libraries: at
> least a collection classes library will be added).
> 
> So it's all about libraries.
> 

The new c++0x standard is all about making it *easier* to write 
libraries like the ones made by boost.org.

astor
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87psuc5c0l.fsf@plato.moon.paoloamoroso.it>
Kenny Tilton <·······@nyc.rr.com> writes:

> Dr. McCarthy joined with Henry Baker, his predecessor at the
> microphone, in bemoaning the standardization of Common Lisp as
> stultifying if not mortifying, in that it ended innovation.

Okay, let's file this, we will arrange for a vote at ILC 2006:

  Issue:          COMMON-LISP-STANDARDIZATION-SUCKS

  References:	  ILC 2005

  Related issues: STANDARDIZATION-SUCKS

  Category:       DELETION

  Edit history:   Version 1, 24-Jun-05 by Paolo Amoroso
  
  Status:	  For discussion and evaluation; not proposed for
		  inclusion in the standard at this time.

  Problem description:

    The standardization of Common Lisp is stultifying if not
    mortifying, in that it ended innovation.

  Proposal COMMON-LISP-STANDARDIZATION-SUCKS:KILL-STANDARD

   Overview:

    From http://lispmeister.com/blog/lisp-news/ILC05-rep-1.html:
    "If someone was to drop a bomb on this building, it would wipe out
    50 percent of the Lisp community.  That would probably be a good
    thing.  It would allow Lisp to start over."

   These passages are relevant to or affected by X3J13 Cleanup Issue
   COMMON-LISP-STANDARDIZATION-SUCKS:

     * The whole ANSI Common Lisp specification

   Note: It is possible that there are other passages affected by this
   cleanup issue.  This list is not part of the specification, and has
   not been formally audited for completeness by X3J13.


> * McCarthy actually meant that very little code lasts ten years.

Be sure not to invite McCarthy and VisiCalc co-inventor Dan Bricklin
to the same event:

  Software That Lasts 200 Years
  http://www.danbricklin.com/200yearsoftware.htm

  The structure and culture of a typical prepackaged software company
  is not attuned to the long-term needs of society for software that
  is part of its infrastructure.  This essay discusses the ecosystem
  needed for development that better meets those needs.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Pascal Bourguignon
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87is033b06.fsf@thalassa.informatimago.com>
Paolo Amoroso <·······@mclink.it> writes:
> Be sure not to invite McCarthy and VisiCalc co-inventor Dan Bricklin
> at the same event:
>
>   Software That Lasts 200 Years
>   http://www.danbricklin.com/200yearsoftware.htm
>
>   The structure and culture of a typical prepackaged software company
>   is not attuned to the long-term needs of society for software that
>   is part of its infrastructure.  This essay discusses the ecosystem
>   needed for development that better meets those needs.

The JVM is a good step toward long-term survival of (Java) software.
Hardware will change, but a simple virtual machine is all you need to
keep your software running.
Perhaps all software should always run on a virtual machine.

The survival of Lisp software would be improved if more
implementations used a Lisp virtual machine.  
Look how OpenMCL became obsolete on June 6th ;-)
Or how OpenGenera is insignificant for lack of Alpha processors.


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Cats meow out of angst
"Thumbs! If only we had thumbs!
We could break so much!"
From: Zaninsk
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119635723.a45ce7fc815a9cefb2a6dc05225c747e@teranews>
> JVM is a good step toward long-term survival of (Java) software.
> Hardware will change, but a simple virtual machine is all you need to
> keep your software running.
> Perhaps all software should always run on a virtual machine.
>
> The survival of lisp software would be improved if more
> implementations used a lisp virtual machine.
> Look how openmcl became caduc on June 6th ;-)
> Or how OpenGenera is insignificant for lack of Alpha processors.
>

Yes, but if you are serious about performance you'll likely implement a JIT 
engine for X, Y, Z processor. Why not cut out the middle man and compile 
directly to your real machine? If performance is not a concern then yes, 
virtual machines are a good advantage. 
From: Steve
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119664829.870545.142600@g43g2000cwa.googlegroups.com>
Zaninsk wrote:
> ...
> Yes but if you are serious about performance you'll likely implement a JIT
> engine for X, Y, Z processor. Why not cut out the midle man and compile
> directly to your real machine? If performance is not a point then yes,
> virtual machines are a good advantage.

ANSI C essentially defines a UNIX virtual machine and is relatively
easy to port to new architectures.  A Lisp which compiles to C like GCL
gains both portability and performance.
From: Jack Unrue
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <0s6pb15v5ie1bq8q1m5ic9f6cc0sfeptim@4ax.com>
On Fri, 24 Jun 2005 19:44:41 +0200, Pascal Bourguignon <···@informatimago.com> wrote:
>
> JVM is a good step toward long-term survival of (Java) software.
> Hardware will change, but a simple virtual machine is all you need to
> keep your software running.

I don't know how much of a step the JVM really is (and I don't mean to
bash Java just because I'm posting to c.l.l., because I like it and
programming in Java pays my bills).

I think the survival of Java software is as much or more a function of
how hard Sun keeps striving for backwards compatibility in the class file
format, the JVM, and the standard libraries, as anything else.  I would
cite:

- the longstanding existence of @deprecated and the fact that many if not
  most so-called "deprecated" standard library functions have not yet
  actually been removed even since the 1.1 days (check out the list of
  deprecated interfaces/classes/etc in the JDK docs and see how many are
  hold-overs from 1.1)

- the compromises made in implementing new language features in Java 5
  (cf. type erasure, which has led some people to question just how
  "generic" Java 5 generics really are)

- javac's -source and -target options as well as java.exe's -version:<value>
  option, which are certainly good features but which I think support my
  point that Sun has put effort into this

- the sheer weight of the Java community that has developed over time

One could imagine an alternate universe where backwards compatibility
was not such a major underlying tenet of Java, and one could imagine
the Java in that alternate universe not being so popular.  Sun had
a chance to make major changes (in the 1.0.x - 1.1 transition) and they
took it.  I don't think they believe they could get away with that
nowadays. 

Also, Java as a platform is highly dependent on native code libraries to
supply functionality that real-world apps need.

> The survival of lisp software would be improved if more
> implementations used a lisp virtual machine.  

Here's a question -- does anyone think that the development of a Lisp
VM (whether based on the one in clisp or not) would proceed any differently
from the development of Lisp itself, taking into account the history of Lisp
dialects and the Lisp culture?

I don't think there would be much difference (not that that's necessarily bad,
it's just speculation on my part).

-- 
Jack
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <871x6qi4oh.fsf@plato.moon.paoloamoroso.it>
Pascal Bourguignon <···@informatimago.com> writes:

> Perhaps all software should always run on a virtual machine.

Such a virtual machine is already in widespread use, and it's called
x86 or something like that :)


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Ulrich Hobelmann
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i7btfFk5mbqU1@individual.net>
Paolo Amoroso wrote:
> Pascal Bourguignon <···@informatimago.com> writes:
> 
> 
>>Perhaps all software should always run on a virtual machine.
> 
> 
> Such a virtual machine is already in widespread use, and it's called
> x86 or something like that :)

Well, a much nicer VM is Unix, and it has multithreading and 
communication channels and interfaces nicely with the internet ;)

-- 
Don't let school interfere with your education. -- Mark Twain
From: Onay Urfalioglu
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hc3t$6mn$1@newsserver.rrzn.uni-hannover.de>
Kenny Tilton wrote:

> [apologies if this has materialised in similar form or does so soon
> unbeknownst to me, but from where I sit it appears Google ate a similar
> report posted yesterday via google groups.]
> 
> Dr. McCarthy joined with Henry Baker, his predecessor at the microphone,
> in bemoaning the standardization of Common Lisp as stultifying if not
> mortifying, in that it ended innovation.
> 
> When rahul defended standardization as allowing his code to run ten
> years from now, McCarthy indicated that (paraphrasing) by the looks of
> Rahul it was unlikely he would produce code that anyone would want to
> run ten years from now.*
> 
> XML had the honor of having McCarthy stop in the middle of a meandering
> bit of reflection to mention how much he disliked XML.
> 
> And when your correspondent asked why he had chosen such a crappy name
> for such a great language and whether he regretted, in what is becoming
> an annual rite of humiliation, he pretty much ignored my question, but
> did mention that his preference had been FLPL, for Fortran List
> Processing Language, because he liked Fortran.
> 
> Intriguingly, there is a Fortran package with that exact name and
> acronym and function, created in 1960 as far as I can make out from some
> light googling.
> 
> * McCarthy actually meant that very little code lasts ten years.
> 

If there were only ONE CL implementation, there would be no need for
a standard... this would lead to

* a much higher quality codebase, since the user base would be much bigger
* more, and more bug-free, extensions and libraries
* each extension, once accepted, would automatically be a de facto standard,
since there is only one implementation
* an overall speedup of the development and progress of CL itself, since all
developers would HAVE TO work TOGETHER (in harmony) 


IMO, this is one of the strengths of languages like Python, Ruby ....
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86r7erzo7d.fsf@drjekyll.mkbuelow.net>
Onay Urfalioglu <·····@gmx.net> writes:

>If there were only ONE CL-implementation, there would be no requirement for
>a standard... this would lead to
[...]

IMHO there are two models. The python/perl/ruby etc. model lends itself
to hobbyist, amateur and general "hackish" programming (in the
positive sense), while fixed standards benefit industrial programming
and the development of large systems by large teams, over several
years, especially when several high-quality, commercial
implementations are available. With standards come planning security
and choice (of competing vendors), something that is more important in
an industrial environment than the neat features of a single
implementation.

mkb.
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hhpd$kbn$1@domitilla.aioe.org>
Matthias Buelow wrote:
> Onay Urfalioglu <·····@gmx.net> writes:
> 
>>If there were only ONE CL-implementation, there would be no requirement
>>for a standard... this would lead to
> [...]
> 
> IMHO there're two models. The python/perl/ruby etc. model lends itself
> to hobbyist, amateur and general "hackish" programming (in the
> positive sense). While fixed standards benefit industrial programming
> and the development of large systems by large teams, over several
> years, especially when several high-quality, commercial
> implementations are available. With standards comes planning security
> and choice (of competing vendors), something that is more important in
> an industrial environment than neat features of a single
> implementation.
As I said in another message: yes, ok, you are right, but then, I can't
understand why Python or Perl is much more used in industry than Lisp (maybe
I'm seeing a different industry than you). The industry I see is not so
worried about the long-term problem (be it bad or good) but about the Just
Work(tm) part, or the amount of libraries, or the amount of documentation. For
example, if I want a web application server for Python, the answer is Zope,
which just runs on Windows, MacOS X and almost all other variations of
Unix, Linux, BSD. If I tell my boss: "oh, there's a solution A for doing B
in CL, but it is developed and run on OpenMCL, we must first check if it'd
work on our currently deployed SBCL on Linux, and maybe port it if necessary",
my boss will go crazy and conclude that this is not the way (and I kinda
agree with him). At the companies where I worked we never worried about ten
years from now or running our code on a different platform (we were doing
mainly server things), but we did worry a lot about how hard it was to use
others' code (like libraries).
Thanks.
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Kirk Job Sluder
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87u0jnfv67.fsf@debian.kirkjobsluder.is-a-geek.net>
Pupeno <······@pupeno.com> writes:
> As I said in another message: yes, ok, you are right, but then, I can't
> understand why Python or Perl is much more used in industry than Lisp (maybe
> I'm seeing a different industry than you). 

Well, I think a lot of it has to do with historical evolution.  Perl was
written in such a way that it could be easily adopted by people
experienced with Bourne, sed and awk.  The expansion of regular
expressions also helped.  This led to its quick adoption as the system
administration scripting language to be used after Bourne and bash.  So
in a rather quick period of time, there was a pretty large volume of
perl scripts running around.

So perl was, in many ways, the right language for a large number of
tasks, at the right place, in the right time of UNIX systems
development, and built on earlier systems.  

Later on, the limitations and redundancy of perl became a problem.
Meanwhile, you have this large base of education and industry using
"object oriented" as a buzzword, and perl's object orientation was not
up to the task.  So then, cpython and ruby entered the scene as
"scripting languages" with a strong "object oriented" approach to
design, and similarities to C++ and java.  

I don't think that the standardization is as much of a problem.  After
all, C (and I think C++) are also ANSI standardized.  And yet, we
still see tons of innovative development built on C and C++
libraries. To my memory those standards are pretty minimal, with many of
the issues discussed in this thread bootstrapped from a core of the
language.


-- 
Kirk Job-Sluder
"The square-jawed homunculi of Tommy Hilfinger ads make every day an
existential holocaust."  --Scary Go Round
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <86psub1u38.fsf@drjekyll.mkbuelow.net>
Pupeno <······@pupeno.com> writes:

>As I said in another message: yes, ok, you are right, but then, I can't
>understand why Python or Perl is much more used in industry than Lisp (maybe
>I'm seeing a different industry than you). The industry I see is not so
>worried about the long-term problem (be it bad or good) but about the Just
>Work(tm) part, or the amount of libraries, or the amount of documentation. For

Well, it is my impression that Perl and Python are restricted to the
scripting and glue-language role, or for one-shot hacks. I'd doubt
that someone would chose Perl for a traffic routing or telecom
switching system, or somesuch, typically projects which take years in
the making and occupy larger programmer teams. It's here where
standards are beneficial (or pseudo-standards like with Sun acting as
a de-facto standards body for Java).

mkb.
From: Pupeno
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9hk6d$nef$1@domitilla.aioe.org>
Matthias Buelow wrote:
> Well, it is my impression that Perl and Python are restricted to the
> scripting and glue-language role, or for one-shot hacks. I'd doubt
> that someone would chose Perl for a traffic routing or telecom
> switching system, or somesuch, typically projects which take years in
> the making and occupy larger programmer teams. It's here where
> standards are beneficial (or pseudo-standards like with Sun acting as
> a de-facto standards body for Java).
As I said, I may be seeing a different industry than you (because where I am
located none of these developments ever happen).
-- 
Pupeno <······@pupeno.com> (http://pupeno.com)
Reading ? Science Fiction ? http://sfreaders.com.ar
From: Karl A. Krueger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9l6gb$99l$1@baldur.whoi.edu>
Matthias Buelow <···@incubus.de> wrote:
> Well, it is my impression that Perl and Python are restricted to the
> scripting and glue-language role, or for one-shot hacks.

For better or for worse, this is quite erroneous.  Both are used rather
more broadly than scripting and glue.

Some examples in Python include the Zope Web application server, the
Twisted high-performance network application server, the Mailman
mailing-list manager, and the BitTorrent distributed file server.


> I'd doubt that someone would chose Perl for a traffic routing or
> telecom switching system, or somesuch, typically projects which take
> years in the making and occupy larger programmer teams.

These days, telecom systems are written in hodge-podges of C and various
external scripting languages:  http://www.asterisk.org/

-- 
Karl A. Krueger <········@example.edu> { s/example/whoi/ }
From: Zachery Bir
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <2005062811204616807%zbir@urbanapecom>
On 2005-06-24 14:35:23 -0400, Matthias Buelow <···@incubus.de> said:

> Pupeno <······@pupeno.com> writes:
> 
>> As I said in another message: yes, ok, you are rigth, but then, I can't
>> understand why Python er Perl is much used on the industry than Lisp (maybe
>> I'm seeing a different industry than you). The industry I see is not so
>> worried in the long term problem (be it that bad er good) but in the Just
>> Work(tm) part, or the amount of libraries, or amount of documentation. For
> 
> Well, it is my impression that Perl and Python are restricted to the
> scripting and glue-language role, or for one-shot hacks.

(with-delurk """Well, our business (Zope Corp.) is pretty much Python 
only, and the projects and products we sell and write go well beyond 
"scripts" or "glue". Given the rest of your post, it seems you're 
ignoring (even if just for the sake of argument) a huge swath of 
potentially profitable (and sometimes even interesting) software. The 
spectrum's quite a bit more variegated than scripts/glue/one-shot hacks 
on the one hand and air traffic control systems on the other.""")

> I'd doubt
> that someone would chose Perl for a traffic routing or telecom
> switching system, or somesuch, typically projects which take years in
> the making and occupy larger programmer teams. It's here where
> standards are beneficial (or pseudo-standards like with Sun acting as
> a de-facto standards body for Java).

Zac
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-636F49.20501624062005@news-europe.giganews.com>
In article <············@domitilla.aioe.org>,
 Pupeno <······@pupeno.com> wrote:

> Matthias Buelow wrote:
> > Onay Urfalioglu <·····@gmx.net> writes:
> > 
> >>If there were only ONE CL-implementation, there would be no requirement
> >>for a standard... this would lead to
> > [...]
> > 
> > IMHO there're two models. The python/perl/ruby etc. model lends itself
> > to hobbyist, amateur and general "hackish" programming (in the
> > positive sense). While fixed standards benefit industrial programming
> > and the development of large systems by large teams, over several
> > years, especially when several high-quality, commercial
> > implementations are available. With standards comes planning security
> > and choice (of competing vendors), something that is more important in
> > an industrial environment than neat features of a single
> > implementation.
> As I said in another message: yes, ok, you are rigth, but then, I can't
> understand why Python er Perl is much used on the industry than Lisp (maybe
> I'm seeing a different industry than you).

Cars are more used than trucks.

CL and Python are very different. Different purpose, different audience, ...

> The industry I see is not so
> worried in the long term problem (be it that bad er good) but in the Just
> Work(tm) part, or the amount of libraries, or amount of documentation. For
> example if I want a web application server for Python, the answer is Zope,
> which just runs on Windows, MacOS X and almost all other variations of
> Unix, Linux, BSD. If I tell my boss: "oh there's a solution A for doing B
> en CL, but it is developed and run on OpenCML, we just first check if it'd
> work on our current deployed SBCL on Linux, and maybe port it if necessary"
> my boss will ge crazy and conclude that this is not the way (and I kinda
> agree with him). At the companies where I worked we never worried about ten
> years from now or running our code on a different platform (we were doing
> mainly server things) but we did worry a lot about how hard was to use
> others code (like libraries).
> Thanks.
From: Matthew D Swank
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <pan.2005.06.24.21.04.09.338366@c.net>
On Fri, 24 Jun 2005 20:50:16 +0200, Rainer Joswig wrote:

> 
> CL and Python are very different. Different purpose, different audience, ...
> 

Though it sounds like McCarthy is not in CL's audience (anymore), and
perhaps (with respect to Python) neither is Peter Norvig.

Matt
-- 
"You do not really understand something unless you can
 explain it to your grandmother." — Albert Einstein.
From: Rainer Joswig
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <joswig-A2178F.08184525062005@news-europe.giganews.com>
In article <······························@c.net>,
 Matthew D Swank <·······································@c.net> wrote:

> On Fri, 24 Jun 2005 20:50:16 +0200, Rainer Joswig wrote:
> 
> > 
> > CL and Python are very different. Different purpose, different audience, ...
> > 
> 
> Though it sounds like McCarthy is not in CL's audience (anymore), and
> perhaps (with respect to Python) neither is Peter Norvig.
> 
> Matt

Right.
From: Karl A. Krueger
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <d9pfqa$hbk$2@baldur.whoi.edu>
Rainer Joswig <······@lisp.de> wrote:
> In article <······························@c.net>,
> Matthew D Swank <·······································@c.net> wrote:
>> On Fri, 24 Jun 2005 20:50:16 +0200, Rainer Joswig wrote:
>> > 
>> > CL and Python are very different. Different purpose, different audience, ...
>> 
>> Though it sounds like McCarthy is not in CL's audience (anymore), and
>> perhaps (with respect to Python) neither is Peter Norvig.

	"Many readers will be surprised that Lisp allows you to do all
	this with conciseness similar to scripting languages such as
	Python, efficiency similar to C++, and unparalleled flexibility
	in designing your own language extensions."

		-- Peter Norvig, quoted on the back of Peter Seibel's
		   _Practical Common Lisp_

-- 
Karl A. Krueger <········@example.edu> { s/example/whoi/ }
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BC8E15.A91EF7D2@freenet.de>
Onay Urfalioglu schrieb:


> If there were only ONE CL-implementation, there would be no requirement for
> a standard... this would lead to

* a compatibility package as a new standard

stefan
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uvf432f8b.fsf@nhplace.com>
Onay Urfalioglu <·····@gmx.net> writes:

> If there were only ONE CL-implementation, there would be no requirement for
> a standard... this would lead to

There are numerous points in time in the past where if there had been
only one implementation, the bottom would have fallen out of the
market.  In the days where there were only commercial implementations,
"second sourcing" was ESSENTIAL to customer willingness to invest a large
amount of money in Lisp.

Now you might argue that times have changed (and I might make my usual
lamentations about it having happened) due to Free Software.  But even
with all that, you're left with the practical fact that there now ARE
multiple implementations.  What are you advocating?  That all but one
lay down their arms and die?  It's hard to kill a free implementation,
and it's hard to tell a commercial implementation to stop caring.

Of course, ANY implementation can simply elect to DIVERGE and hope
others will follow.  That's the only REAL mechanism for making a
single-implementation Lisp.  I just don't see any implementors with
that kind of hubris.

> * a much higher quality codebase since much bigger user base

- only if people follow
- only if there's funding to match what the community does

> * more and more bugfree extensions and libraries

- only because you define away bugs
- you probably lose portability or at least gain uneven support
  across implementations because you rely on people "caring" for
  compliance

> * each extension, once accepted, would be a de facto standard automatically
> since there is only one impl.

- you lose competition over implementation strategy
- you lose discussion about whether there are multiple ways to read
  definitions in the language spec
- you permit opportunistic controllers of the source to change things on
  a whim, perturbing the base of customers who have no recourse but to follow

> * an overall speedup of development and progress of CL itself since all
> developers would HAVE TO work TOGETHER (in harmony) 

- or use another language. you assume a captive audience, which is far
  from true. you're banking the whole future of lisp on quite a gamble.

> IMO, this is one of the strength of PL like Python, Ruby ....

i once heard the following two rules offered for how to win in the market:

 - go with your gut
 - be right

it's the second of these two that is problematic.

by some accounts, python and ruby followed both rules.  (by some
accounts, so did lisp.)  but supposing you disagree that lisp did, you
have laid out clearly how to do the first part in getting to a new
language ("eschew compatibility" -> "go with your gut"), you're more
sketchy on how to achieve the second part ("be right").
From: Coby Beck
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <fhDve.77306$HI.53240@edtnps84>
"Kent M Pitman" <······@nhplace.com> wrote in message 
··················@nhplace.com...
> Onay Urfalioglu <·····@gmx.net> writes:
>
>> If there were only ONE CL-implementation, there would be no requirement 
>> for
>> a standard... this would lead to
...

>> * more and more bugfree extensions and libraries
>
> - only because you define away bugs

I think this is a very insightful point Kent has made here.  It is the same 
process as happens when you are writing a library for your own use (usually 
in a particular application) versus trying to write one you can release for 
public consumption.  The temptation is great and the cost is zero to solve a 
sticky problem by just ensuring it can't happen, or documenting that it is 
not supported (or closing your eyes and making a wish ;) etc.  When you 
are standardizing or packaging for release you are forced to consider the 
hard problems and solve them conceptually, not just in a particular piece of 
code.

"There are the known unknowns, and then there are the unknown unknowns...." 
to borrow from the accidental wisdom of Donald Rumsfeld.

-- 
Coby Beck
(remove #\Space "coby 101 @ bigpond . com")
From: Matthias
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <36wis00w4k3.fsf@hundertwasser.ti.uni-mannheim.de>
Kent M Pitman <······@nhplace.com> writes:

> Onay Urfalioglu <·····@gmx.net> writes:
> 
> > If there were only ONE CL-implementation, there would be no requirement for
> > a standard... this would lead to
> 
> > * more and more bugfree extensions and libraries
> 
> - only because you define away bugs

No, also because everybody is using the same libraries.  More users means
more people to notice and squash bugs.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i9usuFkjmagU1@individual.net>
Matthias wrote:

> Kent M Pitman <······@nhplace.com> writes:
> 
>>Onay Urfalioglu <·····@gmx.net> writes:
>>
>>>If there were only ONE CL-implementation, there would be no requirement for
>>>a standard... this would lead to
>>
>>>* more and more bugfree extensions and libraries
>>
>>- only because you define away bugs
> 
> No, also because everybody is using the same libraries.  More users means
> more people to notice and squash bugs.

This is a very hypothetical discussion. We don't know what would have 
been the case if there didn't exist a Common Lisp standard. Maybe Lisp 
would have completely ceased to be during the 90's (including Scheme - 
keep in mind that Scheme was mostly a reaction to the Common Lisp 
standardization).

I think it is a strength that there exist many different implementations 
of Common Lisp, and I also think it's a strength that the standard 
evolves very slowly (to put it mildly ;). These are characteristics that 
can be very important under some circumstances. Few languages provide 
these characteristics, so this can be a competitive advantage.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Greg Menke
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3k6kgf4iz.fsf@athena.pienet>
Pascal Costanza <··@p-cos.net> writes:

> Matthias wrote:
> 
> > Kent M Pitman <······@nhplace.com> writes:
> 
> I think it is a strength that there exist many different implementations
> of Common Lisp, and I also think it's a strength that the standard
> evolves very slowly (to put it mildly ;). These are characteristics that
> can be very important under some circumstances. Few languages provide
> these characteristics, so this can be a competitive advantage.
> 

I think it would be hard to overstress this point.  Even the frenetic
pace of development of gcc causes me a fair measure of configuration
management grief.  That in itself isn't so bad, but once you include the
even more frenetic pace of development in operating systems to host (and
be targeted by) the compiler, it starts to make project management
people a little twitchy.  While CL won't help much with the latter, it
helps A LOT with the former.

Gregm
From: Jon Boone
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m3ekan8ees.fsf@amicus.local>
Pascal Costanza <··@p-cos.net> writes:

> This is a very hypothetical discussion. We don't know what would
> have been the case if there didn't exist a Common Lisp
> standard. Maybe Lisp would have completely ceased to be during the
> 90's (including Scheme - keep in mind that Scheme was mostly a
> reaction to the Common Lisp standardization).

    Do you mean that the rise of Scheme (for some value of rise) was
  largely due to the CL standard?  

    I recall that Scheme predates the CLtL1 book and was cited as a
  source of ideas for CLtL (and was, presumably, to be subsumed [for
  some definition of subsumed] in the CLtL effort).

    I recall that Scheme was developed as a replacement/extension of
  planner.  By which I think is meant that it was originally intended
  to be a domain specific language, rather than a competitor to
  MacLisp (or any other variant of lisp).

    This all predates my time, so I'm trying to understand it from a
  purely historical perspective, with much of my information deriving
  from written publications by Steele and others.

--jon

From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3ic4kgFknq95U1@individual.net>
Jon Boone wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>This is a very hypothetical discussion. We don't know what would
>>have been the case if there didn't exist a Common Lisp
>>standard. Maybe Lisp would have completely ceased to be during the
>>90's (including Scheme - keep in mind that Scheme was mostly a
>>reaction to the Common Lisp standardization).
> 
>     Do you mean that the rise of Scheme (for some value of rise) was
>   largely due to the CL standard?  
> 
>     I recall that Scheme predates the CLtL1 book and was cited as a
>   source of ideas for CLtL (and was, presumably, to be subsumed [for
>   some definition of subsumed] in the CLtL effort).

Yes, the original Scheme language was defined before Common Lisp (but 
looked quite different from "modern" Scheme). The first two reports on 
Scheme are from the 70's. The reaction to Common Lisp starts with R2RS 
from 1985, all the reports since then contain implicit or explicit 
criticism of Common Lisp. (For example, R2RS says "An UnCommon Lisp" in 
the subtitle, and from R3RS on, the reports include the infamous 
"languages should be designed not by piling feature on top feature" 
quote.) It is especially interesting to see the timeline of 
publications: CLtL1 - 1984, R2RS - 1985, R3RS - 1986, CLtL2 - 1989, R4RS 
- 1991, ANSI Common Lisp - 1994/95, R5RS - 1998.

>     I recall that Scheme was developed as a replacement/extension of
>   planner.  By which I think is meant that it was originally intended
>   to be a domain specific language, rather than a competitor to
>   MacLisp (or any other variant of lisp).

The original Scheme was a device / the result of trying to better 
understand object-oriented programming as in Hewitt's actors model. See 
the very first report on Scheme.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u8y0u7wba.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Jon Boone wrote:
> 
> > Pascal Costanza <··@p-cos.net> writes:
> >
> >>This is a very hypothetical discussion. We don't know what would
> >>have been the case if there didn't exist a Common Lisp
> >>standard. Maybe Lisp would have completely ceased to be during the
> >>90's (including Scheme - keep in mind that Scheme was mostly a
> >>reaction to the Common Lisp standardization).
> >     Do you mean that the rise of Scheme (for some value of rise) was
> >   largely due to the CL standard?      I recall that Scheme predates
> > the CLtL1 book and was cited as a
> >   source of ideas for CLtL (and was, presumably, to be subsumed [for
> >   some definition of subsumed] in the CLtL effort).
> 
> Yes, the original Scheme language was defined before Common Lisp (but
> looked quite different from "modern" Scheme). The first two reports on
> Scheme are from the 70's. The reaction to Common Lisp starts with R2RS
> from 1985, all the reports since then contain implicit or explicit
> criticism of Common Lisp. (For example, R2RS says "An UnCommon Lisp"
> in the subtitle, and from R3RS on, the reports include the infamous
> "languages should be designed not by piling feature on top feature"
> quote.) It is especially interesting to see the timeline of
> publications: CLtL1 - 1984, R2RS - 1985, R3RS - 1986, CLtL2 - 1989,
> R4RS - 1991, ANSI Common Lisp - 1994/95, R5RS - 1998.

Having worked on those drafts of the Scheme spec, I can quite assure
you that Common Lisp had little to do with either the design or the
timetable.  They came out really more in conjunction with the time
available to the editors, and they couldn't have been more
disinterested in the state of Common Lisp.

The original Scheme design was done by Steele and Sussman.  Sussman
opened it to more people as a result of the popularity of the early
papers, but a lot of the design push in the early days was around
Sussman's desire to evolve S&ICP (and to a lesser extent, other Scheme
texts) into a widespread curriculum push.  In that regard, I think it
reacted as much to Pascal as anything else, since Pascal was the
dominant choice of something to teach in theoretical classrooms before
S&ICP.  Scheme did gain some market share at that time, though I think
lost the Big Battle in many ways to Java, probably because Java was
better positioned to address the web market, something Scheme has 
tended to explicitly avoid in many ways (at least up to a few years 
back, when a new, more user-oriented regime took over Scheme design).
This more recent trend was the result of a long-standing tension in the
Scheme community between the theorists and the implementors.  I tried
on many occasions to get changes made to Scheme to clarify vaguenesses
such as portability issues, branch cuts, etc, and was rebuffed by those
who liked the "smallness" of the spec (caused by not specifying these
things).  That's fine for classroom stuff, but not practical for business.
Scheme implementors lived in a private little world of custom extensions
for a long time, unable to get the highly conservative Scheme community to
make progress forward to "useful" stuff, but that logjam has finally been
broken, I am led to believe.  (For some reason, probably the ever-more-flakey
ISP I changed from a couple years ago, I didn't get Scheme design mail for
the last couple of years, so am somewhat out of it as far as recent politics.)

Scheme was, of course, also a conceptual child of Algol, so was very
influenced by that.

But CL?  I don't think really much at all.  Only a few Schemers (Jonathan
Rees, Will Clinger, and myself come to mind) seemed to really care much.
For the most part, I think the Scheme community just ignored CL.  Its goals
were just too different from Scheme's for there to be much overlap.

> >     I recall that Scheme was developed as a replacement/extension of
> >   planner.  By which I think is meant that it was originally intended
> >   to be a domain specific language, rather than a competitor to
> >   MacLisp (or any other variant of lisp).

Well, in its very, very early days, sort of.  But no.

Sussman was stuck using a bizarre dialect of lisp on Delphi operating
system for the PDP11 when I went through his class.  (I don't know the
name of the dialect; I wrote an emulator for the PDP10 which I called
ULISP, and cleaned up the dialect slightly in the process.)  They used
my emulator for one term of teaching Sussman's class, but I
misunderstood his passion for tail calling, and didn't implement that
when he asked me.  He became frustrated by the lack of tail calling in
my emulator and grabbed at Scheme for subsequent terms because it
allowed him the expressive capability to talk about iteration using
recursive expressiveness.  
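[Editorial aside: Sussman's point about tail calls can be sketched in a few
lines of Scheme.  This is a hypothetical illustration, not code from the
thread: with proper tail calls, a "recursive" definition like the one below
consumes constant stack space, i.e. it simply *is* iteration.]

```scheme
;; Factorial written recursively, but iterative in behavior.
;; The call to LOOP is in tail position (no pending multiplication),
;; so a Scheme implementation reuses the stack frame: this is a loop.
(define (factorial n)
  (let loop ((i n) (acc 1))
    (if (zero? i)
        acc
        (loop (- i 1) (* acc i)))))  ; tail call: no work left after it

(display (factorial 5))  ; prints 120
```

Without guaranteed tail-call elimination (as in the ULISP emulator
mentioned above), the same definition would grow the stack on each call,
which is exactly what frustrated Sussman.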

At that point, and since, he was not trying to teach domain specific
languages, he was trying to teach programming in general.  And, more
generally, how to think and how to talk about process.

Sussman spoke at a Lisp conference in the mid 80's sometime and said
he thought the key contribution to the world of computer science in
the 20th century was not "computation" but "language", that is, the
ability to describe process (i.e., algorithms) clearly.  He remarked about
how when he was a kid, the only teaching technique for algorithms was to,
for example, do a long division in front of the student and then say "do you
see what I did?" and then if the student said no to do another problem and
ask "do you see now?"  He alleged that the ability to describe more clearly
how to describe such things was much more important than all the computation
actually performed by computers in that era.  I tend to think this was right.

But again, my point is that he was not limiting himself to planning
and other AI-ish things.  He was focusing on the general purpose
nature of Scheme.

> The original Scheme was a device / the result of trying to better
> understand object-oriented programming as in Hewitt's actors
> model. See the very first report on Scheme.

I think such things were just "hooks" to get people to pay attention to your
paper.  No one did anything in the AI Lab that didn't have a cute name.
The cute name was sometimes helpful but just as often a way of attracting
buzz so that people would come to your talks.  I think he was just talking
about computation ... as was Actors, actually.

The above is all just my opinion and speculation, of course.  (But I was
there at the time, so it's at least founded in some degree of personal
experience...)
From: Bulent Murtezaoglu
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87zmtavr8o.fsf@p4.internal>
>>>>> "KMP" == Kent M Pitman <Kent> writes:
[...]
    KMP> Sussman spoke at a Lisp conference in the mid 80's sometime
    KMP> and said he thought the key contribution to the world of
    KMP> computer science in the 20th century was not "computation"
    KMP> but "language", that is, the ability to describe process
    KMP> (i.e., algorithms) clearly.  He remarked about how when he
    KMP> was a kid, the only teaching technique for algorithms was to,
    KMP> for example, do a long division in front of the student and
    KMP> then say "do you see what I did?" and then if the student
    KMP> said no to do another problem and ask "do you see now?"  He
    KMP> alleged that the ability to describe more clearly how to
    KMP> describe such things was much more important than all the
    KMP> computation actually performed by computers in that era.
[...]

Well, he seems to have held on to this idea into the 00's too.  A movie of 
a (rather good, IMHO) 2001 talk where he says what KMP outlines above 
is available:

http://www.aduni.org/colloquia/sussman/

cheers,

BM
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3icrbbFk9otfU1@individual.net>
Kent M Pitman wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
> 
>>Jon Boone wrote:
>>
>>
>>>Pascal Costanza <··@p-cos.net> writes:
>>>
>>>
>>>>This is a very hypothetical discussion. We don't know what would
>>>>have been the case if there didn't exist a Common Lisp
>>>>standard. Maybe Lisp would have completely ceased to be during the
>>>>90's (including Scheme - keep in mind that Scheme was mostly a
>>>>reaction to the Common Lisp standardization).
>>>
>>>    Do you mean that the rise of Scheme (for some value of rise) was
>>>  largely due to the CL standard?      I recall that Scheme predates
>>>the CLtL1 book and was cited as a
>>>  source of ideas for CLtL (and was, presumably, to be subsumed [for
>>>  some definition of subsumed] in the CLtL effort).
>>
>>Yes, the original Scheme language was defined before Common Lisp (but
>>looked quite different from "modern" Scheme). The first two reports on
>>Scheme are from the 70's. The reaction to Common Lisp starts with R2RS
>>from 1985, all the reports since then contain implicit or explicit
>>criticism of Common Lisp. (For example, R2RS says "An UnCommon Lisp"
>>in the subtitle, and from R3RS on, the reports include the infamous
>>"languages should be designed not by piling feature on top feature"
>>quote.) It is especially interesting to see the timeline of
>>publications: CLtL1 - 1984, R2RS - 1985, R3RS - 1986, CLtL2 - 1989,
>>R4RS - 1991, ANSI Common Lisp - 1994/95, R5RS - 1998.
> 
> Having worked on those drafts of the Scheme spec, I can quite assure
> you that Common Lisp had little to do with either the design or the
> timetable.  They came out really more in conjunction with the time
> available to the editors, and they couldn't have been more
> disinterested in the state of Common Lisp.

OK, of course I should better believe you. ;)

However, why the subtitle ("UnCommon Lisp") and why the remarks wrt 
language size? (I recall reading also some other remarks wrt Common Lisp 
in those reports.)



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <umzpalext.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Kent M Pitman wrote:
> > [...] Having worked on those drafts of the Scheme spec, I can
> > quite assure you that Common Lisp had little to do with either the
> > design or the timetable. [...]
> 
> OK, of course I should better believe you. ;)
> 
> However, why the subtitle ("UnCommon Lisp") and why the remarks wrt
> language size? (I recall reading also some other remarks wrt Common
> Lisp in those reports.)

A passion for silly puns and other easter eggs.

Certainly there was a desire to distinguish the _document_ from CL.
But CL just didn't come up in discussion of the language itself,
and in fact I'd go out on a limb and say many if not most of the 
Scheme authors were and probably are still simply unaware of what's
in CL.  They seemed so put off either by CL's size or its lack of
formal semantics or some anecdote they heard from someone that they
didn't go fishing for good ideas.

On the former point, though--easter eggs--if you had a copy of the
original Algol report, you'd realize that huge energy went into making
the report look like the Algol report--in layout, typography, document
sectionization, and even in the dedication.  My copy of the Algol
report isn't handy to me here, but I believe it was dedicated to the
memory of one of the participants, or at least to someone influential.
One of the original Scheme reports is dedicated to the memory of
Algol.  Such detailings were cute jokes actually, lost on
many--probably most--readers, since the Algol report is not required
reading for students any more.  (Probably a shame that history is lost
that way.)

But in this context, my meta-point is that they did these things because
they were fun, not always because there was a deep reason.  The language
reflects a set of affirmatively held views on how a language should be,
not a reaction to other languages.  The presentation and promotion of the
language, by contrast, is sometimes political.

Think of it like the Republicans vs the Democrats in the purest form
(ignoring the underlying hypocrisy and monetary influence that might
make this a less-good analogy).  That is, imagine for a moment that
the Republicans are seriously committed to small government and
personal liberty for the kind of pure reasons Rush Limbaugh
advances--that they are simply more principled.  Those principles were
not devised by looking at Democrats and doing otherwise.  Those
principles are based on the Constitution and the Bible and other
affirmative influences.  And yet, even in this cartoon world, you can
still understand that Rush might _present_ this philosophy as a
reaction to "Democrats and Socialists and other such Commies".  That is,
the message is presented in context that the listener will understand--it
just wasn't developed that way.  It was developed by Purity of Heart and
with Belief in God and Country and all of that.  But what is important
about such Purity is that it will stand up to all attacks, because it is
simply Right, and Right will always compare favorably to Wrong.

Now go back and replace "Republicans" with "Schemers" and "Democrats"
with "Common Lispers" in the previous paragraph and you'll understand
how "An UnCommon Lisp" got onto the title... ;)
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3id8ieFkqiotU1@individual.net>
Kent M Pitman wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>Kent M Pitman wrote:
>>
>>>[...] Having worked on those drafts of the Scheme spec, I can
>>>quite assure you that Common Lisp had little to do with either the
>>>design or the timetable. [...]
>>
>>OK, of course I should better believe you. ;)
>>
>>However, why the subtitle ("UnCommon Lisp") and why the remarks wrt
>>language size? (I recall reading also some other remarks wrt Common
>>Lisp in those reports.)
> 
> A passion for silly puns and other easter eggs.

OK, thanks a lot for the clarifications.

> Think of it like the Republicans vs the Democrats in the purest form
> (ignoring the underlying hypocrisy and monetary influence that might
> make this a less-good analogy).  That is, imagine for a moment that
> the Republicans are seriously committed to small government and
> personal liberty for the kind of pure reasons Rush Limbaugh
> advances--that they are simply more principled.  Those principles were
> not devised by looking at Democrats and doing otherwise.  Those
> principles are based on the Constitution and the Bible and other
> affirmative influences.  And yet, even in this cartoon world, you can
> still understand that Rush might _present_ this philosophy as a
> reaction to "Democrats and Socialists and other such Commies".  That is,
> the message is presented in context that the listener will understand--it
> just wasn't developed that way.  It was developed by Purity of Heart and
> with Belief in God and Country and all of that.  But what is important
> about such Purity is that it will stand up to all attacks, because it is
> simply Right, and Right will always compare favorably to Wrong.
> 
> Now go back and replace "Republicans" with "Schemers" and "Democrats"
> with "Common Lispers" in the previous paragraph and you'll understand
> how "An UnCommon Lisp" got onto the title... ;)

This is fun, because that change would also keep "Commies" as a nickname 
for Common Lispers. Hmmm... :)


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87wtoeih6r.fsf@plato.moon.paoloamoroso.it>
Kent M Pitman <······@nhplace.com> writes:

> Pascal Costanza <··@p-cos.net> writes:
[...]
>> However, why the subtitle ("UnCommon Lisp") and why the remarks wrt
>> language size? (I recall reading also some other remarks wrt Common
>> Lisp in those reports.)
>
> A passion for silly puns and other easter eggs.

Now that I think about it, the name of Marco Baringer's
continuation-based web framework, UnCommon Web, may be an even subtler
pun :)


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Björn Lindberg
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42c3e653$1@news.cadence.com>
Kent M Pitman wrote:
> Onay Urfalioglu <·····@gmx.net> writes:
> 
> 
>>If there were only ONE CL-implementation, there would be no requirement for
>>a standard... this would lead to
> 
> 
> There are numerous points in time in the past where if there had been
> only one implementation, the bottom would have fallen out of the
> market.  In the days where there were only commercial implementations,
> "second sourcing" was ESSENTIAL to customer willingness to invest a large
> amount of money in Lisp.

One other point with regards to a proper standard that I haven't seen 
mentioned so far is that a standard makes the separation between 
intended and incidental behaviour clear. For a language such as Perl, 
since what counts as the "standard" may (and will) shift at any given time, no 
third party can hope to compete with the single existing implementation. 
And even if it did, making other implementational choices becomes 
extremely difficult, because you do not know which behaviour in Perl is 
intended and which is incidental, and in any case you can never know 
whether some users rely on a certain behaviour or not.


Björn
From: ···@telent.net
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1119873589.958295.199380@g43g2000cwa.googlegroups.com>
Onay Urfalioglu wrote:

> all developers would HAVE TO work TOGETHER (in harmony)

The beatings will continue until morale improves
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uzmtf2fqw.fsf@nhplace.com>
Kenny Tilton <·······@nyc.rr.com> writes:

...
> Dr. McCarthy joined with Henry Baker, his predecessor at the
> microphone, in bemoaning the standardization of Common Lisp as
> stultifying if not mortifying, in that it ended innovation.
...

I wish I could have been there to debate this with him.

Nothing about the standardization of Common Lisp has precluded in the
slightest way other innovation.  Indeed, the very size and fixed
nature of Common Lisp has been sufficient barrier to new entrants into
the implementation camp that the opportunity has been wide open for
someone to come in with something either smaller or different, IF CL
were either too big to implement or too inflexible to accommodate change.

1. Since 1985, when the ANSI process began, the market has changed
   substantially.  The business climate itself has changed substantially.
   Companies used to do lots of "long arc" planning and now are focused
   on the very next quarter's bottom line.  The market just doesn't have
   the free cash to be taking as many flights of fancy as it once did.

2. The Scheme language, by staying small, has attracted lots of people who
   have wanted to make new implementations and dialects.  But this has 
   arguably just frittered away valuable talent re-implementing past ideas
   rather than pressing on to new ones.  After a while, some of us had seen
   a LOT of Lisp dialects and came to think that perhaps there was more to
   life than merely making new ones.  Like actually writing programs.

3. The modern world is very heterogeneous. Expanding your own design by
   seeking the perfect language through continuous refinement is a barrier
   both to new entry to the language, and also to connecting the language
   to new systems.  A lot of recent effort has been on connecting Lisp to
   other technologies.  I don't think that effort is wasted.  I like McCarthy
   and enjoy his talks, but my sense is that he would make these same comments
   unless there was a "return" to a quest for a pure or novel dialect.
   And I don't think that's what you'd get if you did a re-implementation.
   You'd more likely get more dialects that were C- or C++ or Perl or Java
   compatible, and that actually LOST rather than GAINED flexibility because
   the pressure would be to add features from those languages that don't mix
   well with Lisp.  That, at least, was the fate of Dylan.  I'm not saying 
   that there's no value in being able to intermix with those languages, but
   I'm saying that making Lisp more C++-ish or Java-ish is not the way to do
   that.  And the pressure would be there, if there were a re-design, to do
   that.

4. I think McCarthy places too much blame at the door of the language.  The
   days of "large government" (where implementors outnumber users) are gone,
   and the burden is no longer on implementors alone to drive things forward.
   We are now in the era of "the people" (where users outnumber implementors).
   For things to move in this or that direction, there must be users who want
   it to happen.  You can't blame CL for the lack of user-commitment to other
   things.  Build another language (ARC?) and see if people flock to it.  At
   some point, you have to blame the implementors (or would-be implementors)
   of those other dialects (or would-be dialects) for not attracting users.

Obligatory Political Analogy: Blaming Common Lisp for there not being
other Lisp dialects is like blaming the Republicans for Kerry having
lost to Bush in the last US presidential election.  The Democrats lost
because they ran a lousy campaign.  I'd even push the analogy to say
that the election results don't even prove that Bush was a good
candidate as much as they prove that Kerry didn't offer a credible
alternative; likewise, even if you don't like Common Lisp, the burden
is still on you to provide something more credible.  I like the
Democrats but feel they have poorly articulated their message; Howard
Dean is doing good things fixing that.  You can like or dislike CL,
but at least it's got a coherent message that implementors can use as
a guide and users can do likewise, yielding productive programs.
Other attempts have been made, e.g., ISLISP, but it's not CL's fault
they have fallen short.

I wish McCarthy would spend more time articulating what he wants to
see happen and how it might be done, than talking about how he doesn't
like what does happen.  This is not because I want him to shut up.  I
just don't know how to respond to his criticism.  Should we
unstandardize CL?  Should we stop selling existing implementations?
That makes no sense.  The only thing one can do is blunder forth.  So
McCarthy ought, IMO, be offering us suggestions about what he wants in
a new Lisp that would make the effort and cost of discarding CL
worthwhile... (In fact, I think the burden is even on him to show that
it's even NECESSARY to discard CL; that is, that the features and
effects he wants can't simply be accommodated by CL.  Of course, it's
easy to design a feature incompatible with CL, but it's harder to
design a legitimate need not served by CL that can't be addressed by
adding something to CL.  I'm not saying it's impossible, but I think
CL has stood for a while exactly because it is "good enough" for a lot
of what needs doing.)

I basically don't want to get in the position of putting McCarthy
down.  I have great respect for the man.  He has contributed greatly
to us.  And I think he still has things of merit to say.  I try to
listen carefully when he speaks because I hate to waste that resource.
BUT I find the remarks, at least as paraphrased here, as falling
short, in that they offer non-constructive criticism (by which I mean
"criticism that cannot be acted upon", NOT "criticism spoken in a
mean-spirited way") rather than a roadmap ahead or even a picture of
what we'd find at the end of the road if we would just hack our way
through the jungle without a map.  And so I feel like he's somewhat
asking me to waste the availability of the wisdom he might have to
share if it's not framed in a way I can figure out how to act upon.

(But then, maybe he did say more about this and it just got lost in
translation when summarized here?  One can always hope...)
From: Don Geddis
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <873br6gqxp.fsf@sidious.geddis.org>
> Kenny Tilton <·······@nyc.rr.com> writes:
>> Dr. McCarthy joined with Henry Baker, his predecessor at the
>> microphone, in bemoaning the standardization of Common Lisp as
>> stultifying if not mortifying, in that it ended innovation.

Kent M Pitman <······@nhplace.com> wrote on Sat, 25 Jun 2005:
> I wish McCarthy would spend more time articulating what he wants to
> see happen and how it might be done, than talking about how he doesn't
> like what does happen.

I wasn't there either, but I suspect McCarthy is just nostalgic, and perhaps
a bit proud, for the advances in programming language concepts served up by
the Lisp community in the '60s and '70s.  With major competitors like Maclisp
and Interlisp, and the building of things like Symbolics Lisp Machines, it
must have seemed (I wasn't there) like all sorts of exciting new things were
being invented all the time.

Garbage collection, then incremental garbage collection.  Lexical scoping.
Bignums.  Interpreter/compiler equivalence.  CLOS.  Etc.  The rest of the
programming language community needed to look to the Lisp community to see
what was possible in programming languages.

For the last 10-20 years, this has seemed less true.  McCarthy observed that
ANSI standardization occupied a good portion of the Lisp community's effort
during that time.  Perhaps it's just a coincidence (and the real reason is
something about funding or other economics), but McCarthy suspects that
standardization led to a decrease in language experimentation in Lisp.

You can argue that the decrease is for a good cause.  You've said many times
that there is great value in actually using a language to build interesting
programs, separate from the value in altering the base language itself.

But McCarthy doesn't care about most of that.  He's interested in human-level
artificial intelligence.  ANSI Common Lisp is a remarkable achievement, but
everyone realizes that it is perhaps a kind of "local maximum" in the
language design space.  McCarthy cares about long-run maximum, and isn't
concerned if you have to go downhill for awhile in the short run.

ANSI CL has a lot of resistance to these kinds of changes, especially if the
initial directions make the language worse.  This perhaps wasn't true decades
ago.

I happen to appreciate the benefits of the ANSI CL standard, even while
recognizing that McCarthy may be wistful for what was given up (rapid
innovation).

At least, that's my guess about what he meant...

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
When they ship styrofoam, what do they pack it in?
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u4qblnnyt.fsf@nhplace.com>
Don Geddis <···@geddis.org> writes:

> > Kenny Tilton <·······@nyc.rr.com> writes:
> >> Dr. McCarthy joined with Henry Baker, his predecessor at the
> >> microphone, in bemoaning the standardization of Common Lisp as
> >> stultifying if not mortifying, in that it ended innovation.
> 
> Kent M Pitman <······@nhplace.com> wrote on Sat, 25 Jun 2005:
> > I wish McCarthy would spend more time articulating what he wants to
> > see happen and how it might be done, than talking about how he doesn't
> > like what does happen.

[A good analysis here.  I'll interpose some counterpoint, but I don't mean
 to take away from your points in most cases--just to add more 
 three-dimensionality to some of the things that I think have different ways
 of looking at things.]

> I wasn't there either, but I suspect McCarthy is just nostalgic, and perhaps
> a bit proud, for the advances in programming language concepts served up by
> the Lisp community in the '60s and '70s.  With major competitors like Maclisp
> and Interlisp, and the building of things like Symbolics Lisp Machines, it
> must have seemed (I wasn't there) like all sorts of exciting new things were
> being invented all the time.
> 
> Garbage collection, then incremental garbage collection.  Lexical scoping.
> Bignums.  Interpreter/compiler equivalence.  CLOS.  Etc.  The rest of the
> programming language community needed to look to the Lisp community to see
> what was possible in programming languages.

The "innovation" here largely preceded the standardization.  That is,
there was a period in the late 1970's and early 1980's in which nearly
all of this was thought up, even though it started to be rolled out in
"standards" form preliminarily in CLTL (1984) and finally in ANSI CL
(1994).  There were some legitimately new and innovative technical
advances in CLOS per se, but a lot of it was a pick and choose process
from earlier, disparate object systems.

> For the last 10-20 years, this has seemed less true.  McCarthy observed that
> ANSI standardization occupied a good portion of the Lisp community's effort
> during that time.  Perhaps it's just a coincidence (and the real reason is
> something about funding or other economics), but McCarthy suspects that
> standardization led to a decrease in language experimentation in Lisp.

This is a great example, I think, of the harshness of Darwinian
evolution.  I personally think Common Lisp did an outstanding job of
its first and most important priority: not being killed off.  I think
many of the things that McCarthy decries were 100% essential in the
context of the time to keep the language from being offed.  The funny
thing about Darwinian strengths is that often their deployment is on
an LRU basis... that is, you use a survival characteristic and then
you don't need it again for a long time.  It needs to be in the
arsenal but it may not have day-to-day deployment.  (What makes a
thing attractive to the market may be different than what keeps a
thing from being killed by a market, in other words.  So you have to
work at both ends.)

> You can argue that the decrease is for a good cause.  You've said many times
> that there is great value in actually using a language to build interesting
> programs, separate from the value in altering the base language itself.

Yes.  I assume you mean I've compared this to things like the
Betamax/VHS issue, where there was more value in the ability to make a
decision and move forward than there was in perpetually haggling over
which to take.  The market didn't mind losing whatever quality it would
have gained from Betamax because it got that same "value" back in other
ways by just killing that tiny debate in time to make VCRs earlier.

And while programming is qualitatively different from mere, inactive
hardware, such that it may not ever want to be marked as a "done deal"
(never to be improved upon), it can be and is still measured by business
according to a goodness metric of money needed to do a task, and in
those realms a sense of "good enough" continues to rule.

> But McCarthy doesn't care about most of that.  He's interested in human-level
> artificial intelligence.  ANSI Common Lisp is a remarkable achievement, but
> everyone realizes that it is perhaps a kind of "local maximum" in the
> language design space.  McCarthy cares about long-run maximum, and isn't
> concerned if you have to go downhill for awhile in the short run.

Yes, and this is where my point about money comes back into play.  I
think McCarthy's "real" message is that there is more to programming
than money.  But, in essence, he's saying that people who are not 
money-obsessed should be continuing the push.  

But even such efforts require money.  So it seems to me to be up to
the community that is supplied by infinite, unfettered money (by which
I mean Academia, where McCarthy lives, or "free software land", where
an infinite supply of labor waits merely for someone to designate a
"cause").  Standards are largely about business, and operate in a
different circle.  It seems misplaced for McCarthy to blame others and
not his own grad students or members of the free software movement for
letting things down.
 
(I personally think historically Free Software doesn't innovate as
much as it copies, but I don't know that that's intrinsic to its
nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.
It made Open Office, like MS/Office.  It has yet to make something
that competes with Apple for offering something major and novel that
business didn't already think up...  McCarthy is offering a hint that
they could do that...)

> ANSI CL has a lot of resistance to these kinds of changes, especially if the
> initial directions make the language worse.  This perhaps wasn't true decades
> ago.

ANSI CL was about insulating the commercial community from the whims of
the people who wanted the language to change daily.  Business could not
keep up, and useful business could not be built on continual change.
ANSI CL was explicitly intended to ENABLE variation by making sure that
business would have no reason to argue against such innovation.  The thought
was that by freezing something, business would have something it could rely
on while innovation proceeded ahead.

But I think what happened was that ANSI CL was good enough that its users
felt empowered to get on about writing programs.  Maybe it's a hill-climbing
problem, but all solutions to the hill-climbing problem are fraught with
some risk of failure, and people seem not to have been willing to take that
leap from the cliffs of CL.
 
> I happen to appreciate the benefits of the ANSI CL standard, even while
> recognizing that McCarthy may be wistful for what was given up (rapid
> innovation).

Me, too. But I'm not threatened by people making other projects that 
try other things.  Dylan was a good experiment; I think it just went awry
when it incorporated infix syntax.  Goo and Arc are not necessarily my
personal cup of tea, but they seemed like excellent ways to go.  ISLISP
and Eulisp still offer other paths.  Even Gnu Emacs lisp offers opportunities.
Blaming CL for the failures of all of these efforts to exceed Common Lisp
seems as inappropriate as blaming Lisp for AI winter.   There ARE other
innovation sandboxes than CL.  CL has not inhibited them.  But none have
made what McCarthy wants.  

Here's my bottom line beef:  Rather than pitch this as a sadness, why not
pitch it as an opportunity?  There are lots of people eager to dive in and
make a mark on the world.  If McCarthy can articulate what needs to be done,
those people would, I think, willingly do it.  If he can't articulate what
needs to be done well enough for them to do that, then maybe THAT is the
place that resources need to be applied first.

> At least, that's my guess about what he meant...

Well, he's welcome to join in here, of course.  But I think absent that,
we're forced to take our own analyses and to run with them.  At least he
can be satisfied, and should be pleased, that he's provoked discussion
on the matter.  That seems to be at least part of his intent.
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1998534.fVMvnPux20@yahoo.com>
Kent M Pitman wrote:

>  
> (I personally think historically Free Software doesn't innovate as
> much as it copies, but I don't know that that's intrinsic to its
> nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.
> It made Open Office, like MS/Office.  It has yet to make something
> that competes with Apple for offering something major and novel that
> business didn't already think up...  McCarthy is offering a hint that
> they could do that...)

Not really innovation, but is CMUCL still the fastest Lisp, or did the
commercial Lisps catch up? Is anyone keeping score?
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <otSdnSFDNMgdaiPfRVn-jw@dls.net>
alex goldman wrote:

> Not really innovation, but is CMUCL still the fastest Lisp, or did the
> commercial Lisps catch up? Is anyone keeping score?

This kind of claim is meaningless without specifying the
benchmark.  For example, if you choose the ACL2 regression
test suite, gcl comes out on top, beating Allegro and CMUCL
by a good margin.

	Paul
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <1294349.TXcGU6BujH@yahoo.com>
Paul F. Dietz wrote:

> alex goldman wrote:
> 
>> Not really innovation, but is CMUCL still the fastest Lisp, or did the
>> commercial Lisps catch up? Is anyone keeping score?
> 
> This kind of claim is meaningless without specifying the
> benchmark.  For example, if you choose the ACL2 regression
> test suite, gcl comes out on top, beating Allegro and CMUCL
> by a good margin.

God, I hate nitpickers. A while ago, some genius told me that 
"for_any X, Y: car(cons(X,Y), X)"  is not valid Prolog (duh!) because it's
missing a period at the end of the line!

Everyone knows the relative performance on one benchmark will not
necessarily be reproduced on another. Nevertheless, it's still possible to
see and speak of general patterns, if such can be found. Or is it now
verboten to say that Python is slower than CMUCL?
From: Paul F. Dietz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <hqadnUHQFqB0myLfRVn-1g@dls.net>
alex goldman wrote:

> God, I hate nitpickers.

I'm not too fond of careless and sloppy whiners myself.

	Paul
From: alex goldman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3315878.9kpHO12YkU@yahoo.com>
Paul F. Dietz wrote:

> alex goldman wrote:
> 
>> God, I hate nitpickers.
> 
> I'm not too fond of careless and sloppy whiners myself.

If you stopped whining, imagine what it could do for your self-esteem!
From: Paolo Amoroso
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87is01xga5.fsf@plato.moon.paoloamoroso.it>
Kent M Pitman <······@nhplace.com> writes:

> Don Geddis <···@geddis.org> writes:
[...]
> (I personally think historically Free Software doesn't innovate as
> much as it copies, but I don't know that that's intrinsic to its
> nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.

I have heard similar comments often.  But RPG's latest book seems to
associate open source with innovation:

  Innovation Happens Elsewhere
  http://lemonodor.com/archives/001171.html


[about McCarthy]
>> At least, that's my guess about what he meant...
>
> Well, he's welcome to join in here, of course.  But I think absent that,

McCarthy is listed in CLtL2 (page xxi) among the contributors to the
design and implementation of Common Lisp, and to the polishing of
CLtL1.  What was his contribution to the standardization of Common
Lisp?


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <uis00qyp7.fsf@nhplace.com>
Paolo Amoroso <·······@mclink.it> writes:

> Kent M Pitman <······@nhplace.com> writes:
> 
> [about McCarthy]
> >> At least, that's my guess about what he meant...
> >
> > Well, he's welcome to join in here, of course.  But I think absent that,
> 
> McCarthy is listed in CLtL2 (page xxi) among the contributors to the
> design and implementation of Common Lisp, and to the polishing of
> CLtL1.  What was his contribution to the standardization of Common
> Lisp?

I don't personally know what his involvement was.

Rather than address that, let me just say that I wouldn't put a lot of
stock in the mere presence of someone's name in a list of
participants.  There are people who may have participated indirectly
who are not listed.  There are people who are listed as participants
because they simply attended meetings.  There are people who made no
in-meeting contributions yet made substantial contributions on the
side by influencing this or that person's ideas, agendas,
presentations, or votes.

The ANSI CL standard was delayed several months because one committee
member uselessly challenged the presence of another committee member
on one of the lists of participants.  (This was at a time when only
technical criticisms were permissible, but ANSI had no mechanism for
distinguishing between technical and non-technical criticisms, and as
a result they advised us to just treat this as a technical criticism.)
It took many months to run the full process of having a meeting to
discuss this "criticism" and dismissing it.  The basis of the
dismissal was not a judgment that the person had contributed or not up
to a certain standard of usefulness (which appeared to be the nature
of the criticism) but that the formal records of the meeting showed
the person to be on the attendance list, etc.  That is, participation
is objectively, not subjectively, judged.  

In general, I have always advised people creating "credits" or
"acknowledgments" to prefer inclusion rather than exclusion, since it
rarely injures one to point to more people who were included as much
as it hurts if you make a mistake and fail to include someone you
should have.  However, the potential downside of this approach is that
sometimes those who you list as having contributed will not be easily
distinguished among themselves--that is, it's hard to tell the "major"
contributions from the "minor" ones merely from the fact of being
listed.  

I entered the design of CLTL late, and most of my participation was in
reading proposals and voting.  I paid little attention at the time to
the full set of players, so I can't say who was there or not.  For all
I know, McCarthy might have had regular meetings with the key players,
or might never have done anything beyond invent Lisp originally.
Either would be adequate reason to include him in the credits for
CLTL.

I know he's in the CLHS credits and I can tell you as an objective
fact that the reason is that he attended meetings because that's how I
put that list together and because I know him to have attended
meetings and because I probably have written records in my basement to
back up this claim.

But having now hopefully disposed of the red herring issue of whether
being listed as a participant implies participation, let me get to a
more important related point: Any person who WAS involved in the
standard still did not make the whole standard, and is therefore not
obliged to stand by it and take responsibility for it in silent agony
if they don't believe in it.  Allowing for the possibility that
McCarthy were a huge backroom player in the process, he's still
entitled to be upset with it and still entitled to criticize it.
Perhaps in that case, a possible option is for me to say "you should
have exercised your influence better", but I personally know how hard
that was because of how many times I tried to exercise my influence
and how far I got.  (Sometimes, I think, because the committee didn't
do the right thing, but just as often if not moreso, perhaps, because
there is superior wisdom in a group that no individual can have.)

In the end, what I'm saying, is that what McCarthy (and anyone else
with an axe to grind) has most control of is what they do with their
own time.  We all make choices.  If he thinks the biggest problem with
Lisp is lack of a good dialect, he should be spending his time
designing that dialect.  On the other hand, if he thinks the dialect
is fine and the biggest problem is lack of tools, applications, goals,
or something else, then that's fair too.  At some point, we must all
own what we are able to affect.  It's one thing to say "it's too bad
that not everyone in the world is honest/caring/happy/whatever"
because no single individual can make a change in himself that makes
the world honest.  But it's another thing to say "it's too bad no one
in the world is honest/caring/happy/whatever" because existentials are
things that an individual can influence...  

I THINK McCarthy is complaining of lack of existence of an alternative
to Lisp, and if so, he's in a position to change that.  Maybe he is
complaining of lack of everyone liking the alternatives that are
there, or complaining of lack of dislike in a large number of people
for Common Lisp.  Those may or may not be things he can change--maybe
he can lead by example, or maybe not...
From: Don Geddis
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87hdfjg132.fsf@sidious.geddis.org>
Kent M Pitman <······@nhplace.com> wrote on Mon, 27 Jun 2005:
> I THINK McCarthy is complaining of lack of existence of an alternative
> to Lisp, and if so, he's in a position to change that.  Maybe he is
> complaining of lack of everyone liking the alternatives that are
> there, or complaining of lack of dislike in a large number of people
> for Common Lisp.

Actually, none of those things were my own interpretation.

I thought he was straightforwardly complaining that there seems to be less
programming language innovation in the Lisp community these days, than in
the past.

I don't think he pines for any specific alternative, which is why creating
one himself wouldn't help.  It's more the overall effect of the community,
which in the past fostered innovation in language features.  For example,
he's probably happy about things like the Franz work on first-class
environments.  But when was the last time anybody actually cared about SERIES
being an alternative to LOOP?  LOOP made it into the ANSI standard, SERIES
didn't, and for the vast majority of the Lisp community that kind of answered
the question.
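[For readers who never saw the contrast, here is a minimal sketch of the same computation written both ways.  The second form assumes Richard Waters's SERIES package has been loaded; SERIES is not part of ANSI CL, which is rather the point.]

```lisp
;; Sum the squares of the even numbers in a list.

;; LOOP: part of the ANSI CL standard.
(loop for x in '(1 2 3 4 5 6)
      when (evenp x)
        sum (* x x))            ; => 56

;; SERIES: the alternative that didn't make the standard
;; (requires the SERIES package to be loaded).
(collect-sum
 (mapping ((x (choose-if #'evenp (scan '(1 2 3 4 5 6)))))
   (* x x)))                    ; => 56
```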

Perhaps it would be clearer by analogy.  Do you remember the early days of
the web and Netscape?  During the battle with Microsoft and IE, when the
result still seemed to be up in the air?  Every three months or so, a major
new version of the software was released, with a huge collection of new
features.  First HTML was extended, with things like TABLEs, and style sheets.
Then additional modules were added, like email and newsgroups.

This continued for years, every few months, until IE won and Netscape was
crushed.  Since the late '90s, such innovation has screeched to a halt.  The
winner, IE, has been essentially unchanged for years now (compared to the
pace of innovation in the past).

I can imagine someone wishing that Microsoft had not won that battle, and
that Netscape and IE still struggled to innovate in order to win new customers.

It seems to me that McCarthy has similar feelings about Lisp.  Not that ANSI CL
needs a specific new feature; but more that the whole Lisp community used to
be about language innovation, and it isn't so much anymore.

(Again, I personally think that society has received enormous benefit from
ANSI CL standardization, but perhaps in areas different than language
innovation.  So it all depends on what outcomes you value.)

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
If you go through a lot of hammers each month, I don't think it necessarily
means you're a hard worker.  It may just mean that you have a lot to learn
about proper hammer maintenance.  -- Deep Thoughts, by Jack Handey
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <868y0vy7f0.fsf@drjekyll.mkbuelow.net>
Don Geddis <···@geddis.org> writes:

>I can imagine someone wishing that Microsoft had not won that battle, and
>that Netscape and IE still struggled to innovate in order to win new customers.

To characterize the "browser wars" as "innovative" is a really
deranged retrospect of the situation, imho.

Do you really think it is a good idea to need several browsers just
because some sites won't display properly with one, and some other
sites with the other? I also don't see how this maps to programming
languages, since with programming languages, the end user who is just
using the compiled product normally isn't affected, and couldn't care
less about which language the software was written in. Whereas with
the Web, it had direct effect on the user's nerves, time and
productivity. Thankfully, most web developers seem to value the W3C
standards higher today than they did back then and one browser is
usually sufficient.

mkb.
From: Raffael Cavallaro
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <2005062715073316807%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-06-27 14:35:31 -0400, Matthias Buelow <···@incubus.de> said:

> To characterize the "browser wars" as "innovative" is a really
> deranged retrospect of the situation, imho.

This depends entirely on whether one values stability or change. 
Academic researchers like McCarthy value change - a.k.a. innovation - 
more highly than stability. After all they are more or less in the 
change business - they're paid to think up changes in the way we do 
things.

Users tend to value stability and the resultant interoperability more highly.

So to characterize the browser wars as innovative is not "deranged" but 
rather a reflection of researcher values, not user values.

N.B. I am not Don Geddis, but I do agree with what he was saying.
From: Robert Uhl
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <m31x6o39jy.fsf@4dv.net>
Kent M Pitman <······@nhplace.com> writes:
>
> The "innovation" here largely preceded the standardization.  That is,
> there was a period in the late 1970's and early 1980's in which nearly
> all of this was thought up, even though it started to be rolled out in
> "standards" form preliminarily in CLTL (1984) and finally in ANSI CL
(1994).  There were some legitimately new and innovative technical
> advances in CLOS per se, but a lot of it was a pick and choose process
> from earlier, disparate object systems.

That's not necessarily a bad thing, since that allows the standardisers
to pick and choose the best of the lot, right?

> (I personally think historically Free Software doesn't innovate as
> much as it copies, but I don't know that that's intrinsic to its
> nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.
> It made Open Office, like MS/Office.  It has yet to make something
> that competes with Apple for offering something major and novel that
> business didn't already think up...  McCarthy is offering a hint that
> they could do that...)

Wasn't emacs free software from the get-go?  Granted, there were other
extensible editors (e.g. TECO, on which it was first coded), but
oughtn't that still count for something?

And then there's Tex & LaTeX, which are still the best solution to that
particular problem as far as I can tell.  To what extent were they
original versus copies?  I really don't know.

And in a way isn't that kinda like the standardisation process?  I.e.,
others blaze the trail and make mistakes, and then it's Done Right later
on, in the one case by a standards committee, and in the other by the
free software community...

> But I think what happened was that ANSI CL was good enough that its
> users felt empowered to get on about writing programs.

That's probably it--it's a useful tool and can get the job done.  Most
of the improvements I can think of are incremental (e.g. a quick-to-type
string concatenation function...) and are hardly interesting.  And
having the reader either preserve case or downcase, rather than
upcase--which is hardly worth mentioning.
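[Both of those itches can be scratched in portable ANSI CL today; a minimal sketch, with the understanding that the name STRCAT is invented for the example and is not in the standard:]

```lisp
;; A quick-to-type string concatenation function (STRCAT is an
;; invented name; the standard spells this CONCATENATE).
(defun strcat (&rest strings)
  (apply #'concatenate 'string strings))

(strcat "foo" "bar" "baz")      ; => "foobarbaz"

;; The reader's case behavior is already adjustable via the
;; readtable; here a copy is modified so symbols keep their
;; case as typed, without disturbing the global readtable.
(let ((*readtable* (copy-readtable)))
  (setf (readtable-case *readtable*) :preserve)
  (read-from-string "FooBar"))  ; => the symbol |FooBar|
```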

What would you add or change?

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
More likely, the ticket database is an incoherent work of speculative
fiction that doesn't contain records for many changes.
                                      --clover_kicker
From: Kent M Pitman
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ubr5sqy1j.fsf@nhplace.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:

> Kent M Pitman <······@nhplace.com> writes:
> >
> > The "innovation" here largely preceded the standardization.  That is,
> > there was a period in the late 1970's and early 1980's in which nearly
> > all of this was thought up, even though it started to be rolled out in
> > "standards" form preliminarily in CLTL (1984) and finally in ANSI CL
> (1994).  There were some legitimately new and innovative technical
> > advances in CLOS per se, but a lot of it was a pick and choose process
> > from earlier, disparate object systems.
> 
> That's not necessarily a bad thing, since that allows the standardisers
> to pick and choose the best of the lot, right?

No, you're right.  It's a good thing, in fact.  It's how standards are
INTENDED to work.  They sometimes don't, especially in the computer
arena.  Someone from what was then the National Bureau of Standards
(NBS) [later NIST], made this precise warning in an early CL meeting,
and we periodically referred back to it: he warned against the
temptation to do design under the guise of standardization, pointing
out that only those things that had current practice and were merely
seeking standard form were properly ripe for standardization.

> > (I personally think historically Free Software doesn't innovate as
> > much as it copies, but I don't know that that's intrinsic to its
> > nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.
> > It made Open Office, like MS/Office.  It has yet to make something
> > that competes with Apple for offering something major and novel that
> > business didn't already think up...  McCarthy is offering a hint that
> > they could do that...)
> 
> Wasn't emacs free software from the get-go?

Not in the GPL sense.  The GPL evolved over time.

> Granted, there were other extensible editors (e.g. TECO, on which it 
> was first coded), but oughtn't that still count for something?

The other extensible editors to which you refer were also Emacs. Its
first implementation was in Teco.  It was not charged for, but it was not
part of a political scheme to lock out or penalize commercial use.

> And then there's Tex & LaTeX, which are still the best solution to that
> particular problem as far as I can tell.  To what extent were they
> original versus copies?  I really don't know.

Also not GPL'd originally, I'm pretty sure.  Probably still isn't.  I haven't
checked the license recently.

But these were all the output of government-paid, public research funds.
And they have no business being GPL'd ever.

If I had more time and money, I would file suit against any publicly
funded agency in the United States that is using public funds to
produce GPL'd output, since the money for that work comes from the
people of the US, and the people deserve to have that work back in the
Public Domain or some sort of copyright that doesn't create an
impediment to commercial use.  A lot of those tax funds come from 
business, and business has as much right to the fruits of those efforts
as does any non-business.

> And in a way isn't that kinda like the standardisation process?  I.e.,
> others blaze the trail and make mistakes, and then it's Done Right later
> on, in the one case by a standards committee, and in the other by the
> free software community...

I'm not sure I see your point here.

However I'll take the point to observe that standards committees serve
the business community, and are hard-pressed to accept anything as a standard
that is encumbered by patent or copyright, to INCLUDE the GPL.

> > But I think what happened was that ANSI CL was good enough that its
> > users felt empowered to get on about writing programs.
> 
> That's probably it--it's a useful tool and can get the job done.  Most
> of the improvements I can think of are incremental (e.g. a quick-to-type
> string concatenation function...) and are hardly interesting.  And
> having the reader either preserve case or downcase, rather than
> upcase--which is hardly worth mentioning.
> 
> What would you add or change?

Heh.  Topic for another time perhaps, but precious little.  There are
a couple of additions I'd like to see to the core language, but they
are minor and I largely do ok without them.  (The only place I'd
consider affecting the core language is where it defines something I
can't interface to or can't overcome from user code that really gets
in the way of real work.  There are a number of things that are maybe
a little trouble to interface to, but almost none get in the way of
real work.)  I think CL is "good enough" and I'd prefer it stay static
and just be extended.

That's not to say I think no one could do better.  I just think that
incremental change won't get you there.  I evaluate other dialects on 
their own terms as they arise (which isn't quickly).
From: Christopher C. Stacy
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <ur7eo8jyo.fsf@news.dtpq.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:
> Wasn't emacs free software from the get-go?
> Granted, there were other extensible editors
> (e.g. TECO, on which it was first coded),
> but oughtn't that still count for something?

I suppose that depends on what you mean by "Emacs"
and what you mean by "free software".
I would be happy to relate various facts of history in this
area to you, but I would first like some clarification.

> And then there's Tex & LaTeX, which are still the
> best solution to that particular problem as far as
> I can tell.  To what extent were they original
> versus copies?  I really don't know.

The current versions of Emacs and TeX and so on are not
significant innovations; they're just knock-offs (or
depending on exactly what piece of software you're
trying to refer to, mere distributions) of the older
programs that were developed between 1970-1984.
In other words, they all exactly predate the advent 
of the "free software movement".

I think the general answer is that you can credit the
US military's cold war effort with all the innovation.

By and large, all the money (that paid for the faculty and
staff and offices and computers, and even quite specifically
for these kinds of projects) came from the military, whose
purpose was to foster technical innovation so that we could
have better weapons and systems to defeat the Soviet Union.
Even in the case of the occasional person who was not on
that payroll, but who was volunteering (perhaps a student),
that same military money was enabling their contribution.

(Note however, that some of the people actually doing the
work did not like the fact that they were working for the
military, and often generally pretended that the filthy
dirty warmonger money came into the labs out of thin air.
Sometimes there was also military money that was funnelled
through NSF, and sometimes there were corporate or private
grants, but even all added together it was vastly dwarfed
by the DOD money.)

The results were usually shared with the other contractors,
most of whom were also university research labs; in some
cases the software was also shared with anyone else who would
be interested.

Also, notice that essentially none of these innovations
were adopted by industry at the time, except for a literal
handful of defense contractors, very tightly connected
spinoffs from the universities, who were mostly or wholly
doing advanced R&D for the same military agencies.

Bear in mind that there were not very many computers in 
the world, back then.  All the economics, and the kinds of
players and agendas in the world of computers, were
rather different from the situation today.

One part that's really the same is that computer researchers
at universities wanted to share (some of) their work with
their colleagues, for academic purposes.  (Of course, they
were also usually being forced to do so anyway.)

It was a different world.
Is that "free software"?

> And in a way isn't that kinda like the standardisation
> process?  I.e., others blaze the trail and make mistakes,
> and then it's Done Right later on, in the one case by a
> standards committee, 

Done "right" according to some metrics that industry wants

> and in the other by the free software community...

I suppose so, although I'm not sure what this "community" is, 
or what metrics of "rightness" you are considering.

Communities don't write software, hackers write software.
I am under the impression that essentially all the work 
for almost all of the present era's "free software" that 
one might hold up as "better" has been done by a very few
individuals.  And except for a few philanthropists, those
people were being financially compensated (eg. by corporations)
to do the work.  In some cases, parents act as the unwitting
philanthropists for these individuals.
From: Gisle Sælensminde
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <0nll4x0z5y.fsf@kaktus.ii.uib.no>
Kent M Pitman <······@nhplace.com> writes:

>  
> (I personally think historically Free Software doesn't innovate as
> much as it copies, but I don't know that that's intrinsic to its
> nature...  It made Linux, like Unix.  It made Gimp, like Photoshop.
> It made Open Office, like MS/Office.  It has yet to make something
> that competes with Apple for offering something major and novel that
> business didn't already think up...  McCarthy is offering a hint that
> they could do that...)

I hear you keep saying this, but the first web tools, on both the server
and client sides, were free/open. On the client side, the commercial race
between Microsoft and Netscape changed this, but on the server side the
open source tools are still in the lead, both for serving and for scripting.
When it comes to the web, it's the closed-source vendors that have
copied the open source tools. For graphical programs you are at least
partially right, but the software for the web, and many (most?) of the
internet protocols used today, was first implemented in free or open
source software.

-- 
Gisle Sælensminde, Phd student, Scientific programmer
Computational biology unit, University of Bergen, Norway
Email: ·····@cbu.uib.no | Complicated is easy, simple is hard.
From: lin8080
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <42BF38A5.A3C342C1@freenet.de>
Gisle Sælensminde schrieb:
> Kent M Pitman <······@nhplace.com> writes:


> > (I personally think historically Free Software doesn't innovate as
> > much as it copies, but I don't know that that's intrinsic to its
> > nature...  

> ... but since the software both for the web and many (most?)
> of the internet protocols used today were first implemented in free or open
> source software.

Free software developers should be paid as marketing testers. All
lucrative fields are soon robbed by commercial vendors (who then lament
something about illegal copies).

stefan

set a smile on it, ja?
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <REM-2005aug13-003@Yahoo.Com>
> From: Kent M Pitman <······@nhplace.com>
> ... Darwinian strengths ... you use a survival characteristic and
> then you don't need it again for a long time.  It needs to be in the
> arsenal but it may not have day-to-day deployment.

Hey, I've noticed your deep understanding of Lisp and related general
programming issues (my favorite is still your essay about intention in
data types/semantics), but now I see you have some deep understanding
of Darwinian evolution too, some insight not everyone has. Perhaps you
could take some time to browse talk.origins and sci.bio.evolution
sometime and spot places where your insight would be especially useful?

> the Betamax/VHS issue, there was more value in the ability to make a
> decision and move forward than there was in perpetually haggling over
> which to take.  The market didn't mind losing what quality it would
> gain from Betamax because it got that same "value" back in other ways
> by just killing that tiny debate in time to make VCRs earlier.

Hmm, I never looked at it that way before. I have always been pissed
that the wrong company&format won that battle. Turning to a topic
closer to on-topic: I've always considered the PDP-11 instruction set
to be the very best, and was pissed that the LSI-11 died out in favor
of Intel's 8086 crap. But applying your kind of analysis, because Intel
won the battle very early, it had more time to speed up the chip, and
compiler writers could concentrate on overcoming the cruddy instruction
set to implement high-level languages, and eventually Intel with that
head start was able to switch to RISC instruction sets in recent CPUs,
so in the end it was better to let the wrong guy win than to have the
two battle endlessly. How do you feel about (my reading of) your logic
applied to that case?

> ... programming ... measured by business ... a sense of "good enough"
> continues to rule.

Can we also apply that to "fast enough"? IMO, Lisp is quite fast enough
nowadays, and switching to C just to make a program 10% faster, at the
cost of making the program inflexible, is a bad idea.

> It seems misplaced for McCarthy to blame others and not his own grad
> students or members of the free software movement for letting things
> down.

I think this is the single point you need to communicate to McCarthy
privately and see how he responds. Will he ask his graduate students to
take up your challenge? Will he contact the free software movements,
with his great historical reputation/prestige, and convince some of
them to take up your challenge? I hope so.

> I personally think historically Free Software doesn't innovate as
> much as it copies, but I don't know that that's intrinsic ...

The original Lisp was a free innovation, as was the original Emacs,
right? So you're saying there were a few such free major innovations
long ago, but lately they have hardly been happening at all?  Do you
consider MediaWiki and Blogs to be free innovation not duplicating
anything previously existing? (Just asking for your assessment of them,
compared to my not-very-informed opinion, not making a backhanded
rhetorical comment, just posing them as possible counterexamples to
your claim and asking you for evaluation of them as such.)

> I think what happened was that ANSI CL was good enough that its users
> felt empowered to get on about writing programs.

I'm one of those users who indeed considers ANSI CL to be good enough
in most areas, and implementation-specific patches to make it also good
enough in a few other areas, to the point where there's very little
where it really suffers as not being good enough:
- Need SysLisp or similar idea for generating exact machine code/data.
- Need lots more standard packages for specific application domains,
   and for general networking (XML/DOM/SAX, SOAP, RPC/RMI).
- Need ability to load other-language libraries that were compiled in
   their native form not intended for Lisp, and call their subroutines
   directly from Lisp's foreign-function-call mechanisms of various
   types, or even decompose a native-code coreimage to find all the entry
   points and call them directly, as contrasted with running foreign
   whole-programs as sub-processes such as CMUCL's SYSTEM:RUN-PROGRAM.
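[The third item on that list has since been addressed outside the standard by de facto libraries. A sketch of the idea using CFFI, on the assumptions that the CFFI library is loaded and a POSIX libc is available; the Lisp-side name %GETPID is invented for the example:]

```lisp
;; Declare the C function getpid from libc and call it directly
;; from Lisp -- no sub-process, no glue code compiled for Lisp.
;; (Assumes CFFI is loaded; %GETPID is an arbitrary local name.)
(cffi:defcfun ("getpid" %getpid) :int)

(%getpid)   ; => the current process id, as a fixnum
```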

Anything else CL really needs in a globally standard way, as contrasted
with each programmer or shop adding his/her or manager's own favorite
add-on packages?

> Even Gnu Emacs lisp offers opportunities. Blaming CL for the failures
> of all of these efforts to exceed Common Lisp

Um, I'm curious: If you count the number of copies of GNU Emacs in the
world actually in use by at least one person each, and the number of
copies of all fully functional versions of CL in the world likewise,
what's the balance? Most Unix/Linux have Gnu Emacs, but not as many
have CL, and most of those CLs are not used by anyone whereas Gnu Emacs
tends to be used by somebody here or there. I'm guessing that GNU Emacs
may exist more places than all of the CLs put together, but you would
know better. (And how many AutoCad installations are there, while we're
on that xLispDialect "exceeds" yLispDialect topic?) (Maybe that's not
what you meant by "exceeds", but it's a valid comparison point you must
agree.)

> Here's my bottom line beef:  Rather than pitch this as a sadness, why
> not pitch it as an opportunity?  There are lots of people eager to
> dive in and make a mark on the world.  If McCarthy can articulate
> what needs to be done, those people would, I think, willingly do it.
> If he can't articulate what needs to be done well enough for them to
> do that, then maybe THAT is the place that resources need to be
> applied first.

Could you put this question to him, privately, also? If he can't
articulate what he really wants done, then maybe the two of you can
hash it out together, if both of you have the time and energy to do
that?

> At least he can be satisfied, and should be pleased, that he's
> provoked discussion on the matter.  That seems to be at least part of
> his intent.

And he's put a burr under your saddle, and a couple other especially
bright saddles here, provoking you to come up with some good relevant
insight/opinions/comments/interpretations to fire up these discussions.
(Sorry I fell behind in responding to this topic.)

Oh, one more item: Is it too late to clean up that mess about
strings being mutable vs. literal constants you aren't supposed to
change internally, likewise other literal or quoted constants?
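[For anyone who hasn't hit that mess: modifying a literal constant is undefined behavior under ANSI CL. A minimal illustration of the pitfall and the well-defined workaround:]

```lisp
;; Undefined behavior: "hello" is a literal constant; some
;; implementations signal an error on the SETF, others silently
;; mutate the literal (possibly shared between calls).
(defun bad ()
  (let ((s "hello"))
    (setf (char s 0) #\H)    ; modifying a literal: undefined!
    s))

;; Well-defined: copy the literal first, then mutate the copy.
(defun good ()
  (let ((s (copy-seq "hello")))
    (setf (char s 0) #\H)
    s))                      ; => "Hello"
```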
From: Matthias Buelow
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3m7tdaF14spujU1@news.dfncis.de>
Robert Maas, see http://tinyurl.com/uh3t <·······@yahoo.com> wrote:

>closer to on-topic: I've always considered the PDP-11 instruction set
>to be the very best, and was pissed that the LSI-11 died out in favor
>of Intel's 8086 crap. But applying your kind of analysis, because Intel

Since when was the PDP-11 a PC? I think that's confusion of markets.
A comparison with m68k might be valid, though.

>I'm one of those users who indeed considers ANSI CL to be good enough
>in most areas, and implemention-specific-patches to make it also good

IMHO _Common_ Lisp is a dead, bloated whale stranded on the beach
with a few treehugger-style activists vainly trying to push it
back into the water. I hope someone will come up with a better,
fresher, new Lisp (completely designed from scratch) some day because
the basic concepts (but not the teratoma-like appendages that it has
grown over the decades) of Lisp are too valuable to be lost forever.

mkb.
From: Tim X
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <87iry99gs2.fsf@tiger.rapttech.com.au>
Matthias Buelow <···@incubus.de> writes:

> Robert Maas, see http://tinyurl.com/uh3t <·······@yahoo.com> wrote:
> 
> >closer to on-topic: I've always considered the PDP-11 instruction set
> >to be the very best, and was pissed that the LSI-11 died out in favor
> >of Intel's 8086 crap. But applying your kind of analysis, because Intel
> 
> Since when was the PDP-11 a PC? I think that's confusion of markets.
> A comparison with m68k might be valid, though.
> 

From memory, I think the 8086 Intel chip was used in some
mini/mainframe systems as well as PCs - in particular, I remember
reading about a hypercube and other high performance systems which
used large arrays of this chip family. This chip was also used in a
few research projects. Over 10 years ago I was involved in research
relating to signal processing using systolic arrays and one of the
systems we used was based on arrays of the Intel 8086.

I have to agree with Robert on the assembly language for the 8086
being cumbersome and difficult to work with compared to that of other
chips around at the time. 

Tim

-- 
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you 
really need to send mail, you should be able to work it out!
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: LSI-11, tit-forest-tat (was: ILC2005: McCarthy denounces Common Lisp, ...)
Date: 
Message-ID: <REM-2005aug19-011@Yahoo.Com>
> From: Matthias Buelow <····@incubus.de>
> >closer to on-topic: I've always considered the PDP-11 instruction set
> >to be the very best, and was pissed that the LSI-11 died out in favor
> >of Intel's 8086 crap. But applying your kind of analysis, because Intel

> Since when was the PDP-11 a PC?

I never said any such thing, for whatever value of "PC" you mean
(PC is a registered trademark of IBM Corporation, for their original
product line of 8086-based small-business computers, the PC being a
misnomer in that case. PC is also a more general term to include both
the IBM-PC and various clones of it that have been made by other
companies. Finally, PC means any computer, such as a Macintosh or
Dell/Linux laptop that is used by individual people for their personal
work.)

The PDP-11 was a closet-sized mainframe, smaller than a PDP-10 or
IBM-360, but nevertheless quite a bit larger than a modern desktop, and
somewhat larger than a modern "tower".

The LSI-11 was a true micro-processor, that could be installed on a
single PC (printed circuit) board, that implemented a complete
desktop-sized computer, using essentially the same instruction set as
the original PDP-11. It's the LSI-11 that I compared to the 8086, the
LSI-11 having a really nice instruction set for handcrafted
assembly-language programming, while the 8086 was a crock extension
from the 8080.

> A comparison with m68k might be valid, though.

Not "though", rather "also"! The 68000 series was an extension of the
older 6800 microprocessor, but the 68000 did it right, having an
instruction set not as elaborate as the LSI-11 but nearly as good in
design given the lower cost.

> IMHO _Common_ Lisp is a dead, bloated whale stranded on the beach

Your opinion noted and countered. Common Lisp is the best overall
language for a large majority of user-level applications and system
utilities. There are only a few additional features that it would need
to be the best language for such a huge majority of applications that
nearly all other languages, including C and Java, could be abandoned at
that point. (See another place where I recently posted my wish-list
including SysLisp and XML/SOAP etc.)

> with a few treehugger-style activists vainly trying to push it
> back into the water.

CL is "good enough" for virtually all my software needs. Cutting down a
forest just because it isn't perfect, and leaving a barren landscape,
in the hope that someday a new language might replace it, is plain
dumb. Better to keep the forest, with so very many trees, too many to
count, until and unless we eventually find a better use for that land.
No, I don't hug trees, but I'd rather have the trees than the erosion,
and I'd rather have the acorns and leaves than the dust. I'd only ask
that one small area at the top of the hill be cleared to have a clear
view of the sky at night, and clear some space around buildings as a
firebreak just in case you get too eager about getting rid of my
forest.
(Tit for tat with your treehugger crap. Don't take either seriously.)
From: Matthias Buelow
Subject: Re: LSI-11, tit-forest-tat
Date: 
Message-ID: <3mn8riF1731emU2@news.dfncis.de>
Robert Maas, see http://tinyurl.com/uh3t <·······@yahoo.com> wrote:

>> Since when was the PDP-11 a PC?
>companies. Finally, PC means any computer, such as a Macintosh or
>Dell/Linux laptop that is used by individual people for their personal
>work.)

That's my interpretation. Actually, my implication was wrong... there
_was_ a PDP11 "personal computer", the DEC Pro/350, although it
apparently wasn't quite a commercial success. I originally thought
the DECmates were also PDP11-based but from what I gathered from
the web, they're not.

>(Tit for tat with your treehugger crap. Don't take either seriously.)

;-)

mkb.
From: Kent M Pitman
Subject: Re: LSI-11, tit-forest-tat
Date: 
Message-ID: <umzndmoc5.fsf@nhplace.com>
Matthias Buelow <···@incubus.de> writes:

> ... there _was_ a PDP11 "personal computer", the DEC Pro/350, 
> although it apparently wasn't quite a commercial success... 

And the early pdp11's were very small, hardly usable by more than one
person.  I started to write a post recently on this thread, but maybe
I didn't send it.  A pdp11/05 I used at the MIT Dept was specially
hacked by some local wizzes (I think maybe Earl Killian and/or Charlie
Frankston) to run the 3 or 4 users that we had in our office.  We ran
a custom editor with an emacs-like command set (troll-r, I think maybe
it was called?) and did little more on it than type in card images of
FORTRAN programs to submit to a CDC 7600 supercomputer halfway across
the country... but my impression was that such "heavy" use was unusual
and that the more common use for an 11/05 was single-user, that is,
as a PC.

Btw, related to this, I recall early on in my time at Symbolics (I was
there 1985-1992, so I missed the first 5 or so years) seeing market
analysts reports segmenting the market into "workstations", "personal
computers", and "lisp machines".  I recall pointing out that it seemed
a bad thing that we were not listed under one of those other two
categories, and being assured by a marketing person that this was fine
and proper.  In retrospect, the failure to manage that particular
issue seems one of the key failures that led to the non-acceptance of
Lisp Machines.  Not like fixing it was just a matter of moving it to
another column, but attempting to fix it for real would have meant
confronting the problem of being accepted by the people those other
machines were marketed to, and would have led to some good things.  As
it was, Symbolics had so much money rolling in for a time that it
didn't think it had to care.  Excess money can hide a lot of
sins. When the money subsides, they become instantly apparent.
(Google and other big players take note.)
From: Matthias Buelow
Subject: Re: LSI-11, tit-forest-tat
Date: 
Message-ID: <3mnondF16oe5uU1@news.dfncis.de>
Kent M Pitman <······@nhplace.com> wrote:

>FORTRAN programs to submit to a CDC 7600 supercomputer halfway across
>the country... but my impression was that such "heavy" use was unusual

Hmm... http://www.scd.ucar.edu/computers/gallery/cdc/7600.html
even the thumbnail image looks sexy..

mkb.
From: Pascal Costanza
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <3i8jcaFkc62qU1@individual.net>
Kent M Pitman wrote:

> I wish McCarthy would spend more time articulating what he wants to
> see happen and how it might be done, than talking about how he doesn't
> like what does happen. 

McCarthy's talk was entirely about novel things he would like to see 
happen in the future. He mentioned Common Lisp standardization only 
during the question-and-answer round, and as far as I can tell, he 
didn't really have a good opportunity to make his point clear, mainly 
due to a lack of time.

It keeps puzzling me why people care about what McCarthy thinks about 
standardization.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Edi Weitz
Subject: Re: ILC2005: McCarthy denounces Common Lisp, "Lisp", XML, and Rahul
Date: 
Message-ID: <u3br43hw7.fsf@agharta.de>
On Sun, 26 Jun 2005 23:00:58 +0200, Pascal Costanza <··@p-cos.net> wrote:

> It keeps puzzling me why people care about what McCarthy thinks
> about standardization.

Good point.

As far as I can see, the only interesting thing with respect to Lisp
that McCarthy has done in recent years was that he once used Kenny
Tilton's laptop.

:)

Cheers,
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")