From: Jamie Border
Subject: Beyond CL?
Date: 
Message-ID: <db6c15$gfi$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
thinking[2]..

Couldn't we have a CL++ (sorry) with some nice modern features (libraries as 
good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and 
still keep the S-expression syntax (or lack thereof)?

Aside: I can't see how moving away from s-exp works - won't macros be 
pathetically crippled?

Ramble ramble ramble, but my real point is: why aren't there more Paul 
Grahams doing the same thing?  Is the problem so intrinsically hard?  Does 
nobody care?

Jamie

[1] http://www.paulgraham.com/arc.html
[2] bad idea 

From: Eric Lavigne
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121370659.013912.140630@g14g2000cwa.googlegroups.com>
>Couldn't we have a CL++ (sorry) with some nice modern
>features (libraries as good as Python, good profiling (yes,
>I've seen Allegro), blah blah blah, and still keep the
>S-expression syntax (or lack thereof)?

Yes, there are people who are trying to make CL better. Here is a list
of such projects: http://common-lisp.net/projects.shtml

It basically comes down to time, though. Each of these projects takes a
lot of time. Libraries as good as Python? Well, if you have a favorite
Python library, then you can start translating now ^^ If you want all
of Python's libraries, then time becomes a big problem. Python has more
programmers than we do, so it's hard to keep up. An alternative
solution would be to write a python2lisp translator, but this is no
small task.

>I can't see how moving away from s-exp works - won't macros
>be pathetically crippled?

Paul Graham answered this at http://www.paulgraham.com/arcll1.html
which is linked by the reference page you gave:
http://www.paulgraham.com/arc.html
The basic idea is that macros still operate on s-expressions. Arc
syntax is translated into s-expressions before macros get a chance to
see them.

So the answer (or an answer), I think, to the long pondered question of
syntax for Lisp is: yes, have syntax, but only as abbreviation. Arc
will have syntax, but it will translate in a clearly defined (and in
fact, redefinable) way into underlying s-expressions. Nearly all the
syntax will be optional, and moreover optional at the level of
individual operators.
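
As a rough illustration of the principle (this is not Arc's actual
reader, just a Common Lisp sketch of "syntax as abbreviation"): a
bracket form can be read straight into an ordinary s-expression, so
macros and the compiler only ever see plain lists.

  (set-macro-character #\[
    (lambda (stream char)
      (declare (ignore char))
      `(lambda (_) ,(read-delimited-list #\] stream t))))
  (set-macro-character #\] (get-macro-character #\)))

  ;; [+ _ 1] now reads as (LAMBDA (_) (+ _ 1)), so
  ;; (mapcar [+ _ 1] '(1 2 3)) => (2 3 4)

And since the translation is just a readtable entry, it is redefinable
in exactly the sense described above.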

>why aren't there more Paul Grahams doing the same thing?
>Is the problem so intrinsically hard?  Does
>nobody care?

There are quite a lot of languages out there, so many that it would
probably take a long time just to read all their names. There are a few
unique things about the particular case of Arc, though:
1) Its inventor is famous, especially in this newsgroup.
2) It is promised to be the best language ever. Who wouldn't get
excited about that ^^

Writing a new language is fairly easy. Making it good enough to be
worth learning is hard. Convincing others that it is good enough to be
worth learning (so that they will give it a try) is also hard. Adding a
new library to an existing language is far easier on both counts.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7t6o$a95$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>
"Eric Lavigne" <············@gmail.com> wrote

Me>>I can't see how moving away from s-exp works - won't macros
Me>>be pathetically crippled?
>
> Paul Graham answered this at http://www.paulgraham.com/arcll1.html
> which is linked by the reference page you gave:
> http://www.paulgraham.com/arc.html
> The basic idea is that macros still operate on s-expressions. Arc
> syntax is translated into s-expressions before macros get a chance to
> see them.

Yes.  I read this.  But right now Lisp code is s-exp based, so I can read 
the result of the macro expansion.  If the macro is written carefully, I can 
see the code I would have had to type manually.  If you introduce syntax and 
macros don't produce output that uses it, this breaks, and I see that as 
lossage.  Yes, I know that his idea is for syntax to be mostly optional.

>
> So the answer (or an answer), I think, to the long pondered question of
> syntax for Lisp is: yes, have syntax, but only as abbreviation. Arc
> will have syntax, but it will translate in a clearly defined (and in
> fact, redefinable) way into underlying s-expressions. Nearly all the
> syntax will be optional, and moreover optional at the level of
> individual operators.

Hmm.

>
Me>>why aren't there more Paul Grahams doing the same thing?
Me>>Is the problem so intrinsically hard?  Does
Me>>nobody care?
>
> There are quite a lot of languages out there, so many that it would
> probably take a long time just to read all their names. There are a few
> unique things about the particular case of Arc, though:
> 1) It's inventor is famous, especially in this newsgroup.

I can see this.  But is there not also room for something that runs CL code 
but has less weirdness?

> 2) It is promised to be the best language ever. Who wouldn't get
> excited about that ^^

*dribble*

>
> Writing a new language is fairly easy. Making it good enough to be
> worth learning is hard. Convincing others that it is good enough to be
> worth learning (so that they will give it a try) is also hard. Adding a
> new library to an existing language is far easier on both counts.

Kind of what I was thinking.  If you can still use _all_ of CL, but pile on 
some nice stuff to attract new users, then surely this could be good?

Jamie 
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <slyhnptv.fsf@ccs.neu.edu>
"Jamie Border" <·····@jborder.com> writes:

> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
> thinking[2]..
>
> Couldn't we have a CL++ (sorry) with some nice modern features (libraries as 
> good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and 
> still keep the S-expression syntax (or lack thereof)?
>
> Aside: I can't see how moving away from s-exp works - won't macros be 
> pathetically crippled?

Yes.

> Ramble ramble ramble, but my real point is: why aren't there more Paul 
> Grahams doing the same thing?  

We're not independently wealthy.

> Is the problem so intrinsically hard?  

No, but it is time consuming.  A nice full featured CL++ would be a
few man-years in the making.

> Does nobody care?

Food and shelter are always nice, and there is little funding for
language design.
From: paulgraham
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121987571.517520.12340@g43g2000cwa.googlegroups.com>
Joe Marshall wrote:
> "Jamie Border" <·····@jborder.com> writes:
>
> > I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been
> > thinking[2]..
> >
> > Couldn't we have a CL++ (sorry) with some nice modern features (libraries as
> > good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and
> > still keep the S-expression syntax (or lack thereof)?
> >
> > Aside: I can't see how moving away from s-exp works - won't macros be
> > pathetically crippled?
>
> Yes.
>
> > Ramble ramble ramble, but my real point is: why aren't there more Paul
> > Grahams doing the same thing?
>
> We're not independently wealthy.
>

You don't have to be independently wealthy to make a new language.
Larry Wall and Guido van Rossum and Matz weren't.

The advantage they had over us in the Lisp world was that they
started from a lower point.  Larry Wall, for example, started out
trying to make a better awk.  That's not hard.  Awk is missing a
lot.  Whereas we in the Lisp world are bumping up against the
asymptote.  Among other things, we can't avail ourselves of
one of the richest sources of features for new languages: taking
stuff from Lisp.  We have to invent genuinely new things.  Things,
moreover, that people like John McCarthy, Gerry Sussman, and
Guy Steele didn't think of.  That's not so easy.

> > Is the problem so intrinsically hard?
>
> No, but it is time consuming.  A nice full featured CL++ would be a
> few man-years in the making.
>
> > Does nobody care?
>
> Food and shelter are always nice, and there is little funding for
> language design.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122004536.754247.295730@g43g2000cwa.googlegroups.com>
paulgraham wrote:
> We have to invent genuinely new things.  Things,
> moreover, that people like John McCarthy, Gerry Sussman, and
> Guy Steele didn't think of.  That's not so easy.

I agree. Luckily, we have an advantage: we are standing on their
shoulders. Humans of 10,000 years ago had the same brains, and thus the
same intelligence, as we have now. Cultural evolution alone has brought us
to this point, and it will, I believe, carry us forward.

One promising idea is to move towards the greater ability to poke
around in the internals of the language. How about this: Lisp is known
as the programmable programming language; what we lack is a
programmable *compiler*. Imagine being able to define one's own code
generation optimizations when you have identified a bottleneck in your
program. Eventually a bank of many users' optimizations could be
collected together, resulting in something that might asymptotically
approach the fabled "sufficiently smart compiler".

However, such things can only be done when the language itself is
defined completely transparently from a small number of basic axioms.
As you say, source code often makes the best spec. It also gives
serious hackers the ability to get down and dirty into the internals of
the language.

Perhaps the compiler should be defined in the spec of the language;
that is, there exists a well-defined translation of source code to
primitive low-level constructs. (I don't mean anything like a
translation to a particular architecture, such as x86.) This should
allow the programmable compiler to become a realistic possibility.

One way to generate new ideas is to think of all the strangest things
we could possibly want to do, and make them possible. In one of your
essays, you said you wanted to iterate through the fields of a struct,
but Common Lisp wouldn't let you. That should be possible. What else
should be?

o Perhaps it should be possible to define (2 (pr "hi")) to print "hi"
twice. (Not that anyone would want to do that - or would they? - but
such a thing should, in principle, be possible. Note that this cannot
be done with a code walker since the number in the first position could
actually be a variable. Now that I think of it, a more realistic
suggestion would be for (2 (fn () (pr "hi"))) to print "hi" twice.
Nevertheless.)

o Perhaps it should always be possible to interrogate a closure, and
retrieve its code and its environment. (Yes, there exists
FUNCTION-LAMBDA-EXPRESSION. The point is, it's not guaranteed to work.)

o Perhaps it should be possible to define extremely high-level
optimizations as well as the low-level ones I mentioned. As an example,
consider automatically memoizing all functions determined to be purely
functional, or automatically compiling partially specialized versions
of functions (e.g. in a raytracer, a ray-sphere-intersect method
specialized for a certain sphere) when it is determined that this would
improve execution speed. Yes, we have methods to do things like this in
CL, but we don't have clean ways of making the magic happen on its own.
In CL, you have to *think* about doing either of these things.
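
As a footnote on the second point, the interrogation that exists today,
and its caveat (a sketch, nothing new):

  (function-lambda-expression #'(lambda (x) (* x x)))
  ;; may return the source lambda expression, or NIL, depending on the
  ;; implementation and on how the function was compiled, which is
  ;; exactly the "not guaranteed to work" problem.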
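
As for the memoization point, here is the by-hand version that CL makes
you *think* about (a sketch using only standard functions; the names
are made up):

  (defun memoize (fn)
    (let ((cache (make-hash-table :test #'equal)))
      (lambda (&rest args)
        (multiple-value-bind (value foundp) (gethash args cache)
          (if foundp
              value
              (setf (gethash args cache) (apply fn args)))))))

  (defun slow-fib (n)
    (if (< n 2) n (+ (slow-fib (- n 1)) (slow-fib (- n 2)))))

  ;; Nothing decides to do this for us; we wrap the function by hand.
  ;; (Whether the recursive calls also go through the cache depends on
  ;; how the implementation compiled SLOW-FIB.)
  (setf (symbol-function 'slow-fib) (memoize #'slow-fib))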

This approach seems promising. It seems to me that Lisp-space is an
infinite domain in which we can see but a small distance away from
where we are standing. Even heading in the wrong direction is better
than stubbornly staying in the same place, although it seems many are
comfortable there and resent being poked by those who have seen a small
glimpse of the treasures that lie around them.

"I do not know what I may appear to the world; but to myself I seem to
have been only like a boy playing on the seashore, and diverting myself
in now and then finding a smoother pebble or a prettier shell than
ordinary, whilst the great ocean of truth lay all undiscovered before
me." - Isaac Newton

Tron3k
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86r7drjxi2.fsf@drjekyll.mkbuelow.net>
"Tron3k" <······@gmail.com> writes:

>where we are standing. Even heading in the wrong direction is better
>than stubbornly staying in the same place, although it seems many are
>comfortable there and resent being poked by those who have seen a small
>glimpse of the treasures that lie around them.

Go ahead and do it, and come back when you have results.

mkb.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122006098.752155.233190@f14g2000cwb.googlegroups.com>
Matthias Buelow wrote:
> "Tron3k" <······@gmail.com> writes:
>
> >where we are standing. Even heading in the wrong direction is better
> >than stubbornly staying in the same place, although it seems many are
> >comfortable there and resent being poked by those who have seen a small
> >glimpse of the treasures that lie around them.
>
> Go ahead and do it, and come back when you have results.
> 

Oh - did I poke you? Sorry.
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbprdu$vuu$1@ulric.tng.de>
Tron3k schrieb:

> o Perhaps it should be possible to define (2 (pr "hi")) to print "hi"
> twice. (Not that anyone would want to do that - or would they? - but
> such a thing should, in principle, be possible. Note that this cannot
> be done with a code walker since the number in the first position could
> actually be a variable. Now that I think of it, a more realistic
> suggestion would be for (2 (fn () (pr "hi"))) to print "hi" twice.
> Nevertheless.)

Why not overload *?
(* "hi" 2)
"hihi"


André
-- 
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122005829.915961.47120@g14g2000cwa.googlegroups.com>
André Thieme wrote:
> Tron3k schrieb:
>
> > o Perhaps it should be possible to define (2 (pr "hi")) to print "hi"
> > twice. (Not that anyone would want to do that - or would they? - but
> > such a thing should, in principle, be possible. Note that this cannot
> > be done with a code walker since the number in the first position could
> > actually be a variable. Now that I think of it, a more realistic
> > suggestion would be for (2 (fn () (pr "hi"))) to print "hi" twice.
> > Nevertheless.)
>
> Why not overload *?
> (* "hi" 2)
> "hihi"

The main idea is not the particular example I used, but the capability
to modify the internals of the language to such a degree. Being
able to say (2 (fn () (pr "hi"))) is a litmus test for the ability to
modify at the deepest level the way things are expressed in your
language.
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbqtts$tm3$1@ulric.tng.de>
Tron3k schrieb:
> André Thieme wrote:
> 
>>Tron3k schrieb:
>>
>>
>>>o Perhaps it should be possible to define (2 (pr "hi")) to print "hi"
>>>twice. (Not that anyone would want to do that - or would they? - but
>>>such a thing should, in principle, be possible. Note that this cannot
>>>be done with a code walker since the number in the first position could
>>>actually be a variable. Now that I think of it, a more realistic
>>>suggestion would be for (2 (fn () (pr "hi"))) to print "hi" twice.
>>>Nevertheless.)
>>
>>Why not overload *?
>>(* "hi" 2)
>>"hihi"
> 
> 
> The main idea is not the particular example I used, but the capability
> to be able to modify internals of the language to such a degree. Being
> able to say (2 (fn () (pr "hi"))) is a litmus test for the ability to
> modify at the deepest level the way things are expressed in your
> language.

If your language can do that, it will be a great language for little 
hacks - for scripts up to 700 lines of code.
For bigger projects beyond 5k LOC the programmers will not be able to 
control the language anymore. Some people see these problems already 
with CL. Too much freedom will create too many different styles that are 
not compatible with each other.
In CL you already have the problem that you cannot say
(defmethod length (...) ...) because LENGTH is not object oriented
(it is not a generic function).
If in your language even more is dynamic and flexible then no one can 
write programs in it.


André
-- 
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122044721.425848.195110@f14g2000cwb.googlegroups.com>
André Thieme wrote:
> Tron3k schrieb:
> > André Thieme wrote:
> >
> >>Tron3k schrieb:
> >>
> >>
> >>>o Perhaps it should be possible to define (2 (pr "hi")) to print "hi"
> >>>twice. (Not that anyone would want to do that - or would they? - but
> >>>such a thing should, in principle, be possible. Note that this cannot
> >>>be done with a code walker since the number in the first position could
> >>>actually be a variable. Now that I think of it, a more realistic
> >>>suggestion would be for (2 (fn () (pr "hi"))) to print "hi" twice.
> >>>Nevertheless.)
> >>
> >>Why not overload *?
> >>(* "hi" 2)
> >>"hihi"
> >
> >
> > The main idea is not the particular example I used, but the capability
> > to be able to modify internals of the language to such a degree. Being
> > able to say (2 (fn () (pr "hi"))) is a litmus test for the ability to
> > modify at the deepest level the way things are expressed in your
> > language.
>
> If your language can do that it will be a great language for little
> hacks. For scripts up to 700 lines of code.
> For bigger projects beyond 5k LOC the programmers will not be able to
> control the language anymore. Some people see these problems already
> with CL. Too much freedom will create too many different styles that are
> not compatible with each other.
> In CL you already have the problem not to be able to say
> (defmethod length (...) ...) because it is not object oriented.
> If in your language even more is dynamic and flexible then noone can
> write programs in it.

First of all, I'd like to thank you for offering a reasoned and cogent
criticism of my idea.

I don't really have a counter-argument at this time, and possibly there
is no counter. It is true that flexibility leads much more often to
people making mistakes than to people actually using it for a useful
purpose. Maybe humans cannot handle infinite power, and there must be
at least *some* limit to what you can do to a language. All I can plead
is that I'm going to see how it works, and at least I might learn
something.

Also, if it turns out to be a scripting language, which it probably
will, I won't be too depressed. :-)

I think it's a really interesting point that you can't say (defmethod
length ...) in CL. (Well, you can, but it requires a workaround.) The
reason, I guess, is that the object system is grafted on as opposed to
being a fundamental part of the language. You are right that this is a
problem due to Lisp being too flexible: that you *can* graft things on
like that, instead of having to recreate the entire language, is
sometimes a problem.
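
A minimal sketch of that workaround, for the record (it shadows
CL:LENGTH in one's own package; the package and class names here are
made up):

  (defpackage :my-dialect
    (:use :cl)
    (:shadow #:length))
  (in-package :my-dialect)

  (defgeneric length (thing)
    (:method ((thing sequence)) (cl:length thing)))

  (defclass ring-buffer ()
    ((count :initarg :count :reader buffer-count)))

  (defmethod length ((buf ring-buffer))
    (buffer-count buf))

It works, but it has to be repeated per project, which is the "grafted
on" feeling in a nutshell.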

Tron3k
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87vf337a07.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:
> One promising idea is to move towards the greater ability to poke
> around in the internals of the language. How about this: Lisp is known
> as the programmable programming language; what we lack is a
> programmable *compiler*. Imagine being able to define one's own code
> generation optimizations when you have identified a bottleneck in your
> program. Eventually a bank of many users' optimizations could be
> collected together, resulting in something that might asymptotically
> approach the fabled "sufficiently smart compiler".

What do you think compiler macros are used for?
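
For instance, a quick sketch of a user-defined call-site optimization
(the function names are made up):

  (defun my-expt (base power)
    (expt base power))

  ;; Hook the compilation of calls to MY-EXPT: when POWER is a small
  ;; constant integer, open-code the multiplications; otherwise leave
  ;; the call form untouched.
  (define-compiler-macro my-expt (&whole form base power)
    (if (and (integerp power) (<= 1 power 4))
        (let ((b (gensym "BASE")))
          `(let ((,b ,base))
             (* ,@(make-list power :initial-element b))))
        form))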

You're funny Tron3k.  All the "ideas" you bring here are already
implemented in Common Lisp.  What's new, Doc?


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Small brave carnivores
Kill pine cones and mosquitoes
Fear vacuum cleaner
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122006029.555243.205080@g44g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:
> "Tron3k" <······@gmail.com> writes:
> > One promising idea is to move towards the greater ability to poke
> > around in the internals of the language. How about this: Lisp is known
> > as the programmable programming language; what we lack is a
> > programmable *compiler*. Imagine being able to define one's own code
> > generation optimizations when you have identified a bottleneck in your
> > program. Eventually a bank of many users' optimizations could be
> > collected together, resulting in something that might asymptotically
> > approach the fabled "sufficiently smart compiler".
>
> What do you think compiler macros are used for?
>
> You're funny Tron3k.  All the "ideas" you bring here are already
> implemented in Common Lisp.  What's new, Doc?

I am aware of compiler macros. I am talking about being able to modify
the process of transforming the syntax tree of a Lisp program to
machine language.
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <sly7s9y2.fsf@comcast.net>
"Tron3k" <······@gmail.com> writes:

> Pascal Bourguignon wrote:
>> "Tron3k" <······@gmail.com> writes:
>> > One promising idea is to move towards the greater ability to poke
>> > around in the internals of the language. How about this: Lisp is known
>> > as the programmable programming language; what we lack is a
>> > programmable *compiler*. Imagine being able to define one's own code
>> > generation optimizations when you have identified a bottleneck in your
>> > program. Eventually a bank of many users' optimizations could be
>> > collected together, resulting in something that might asymptotically
>> > approach the fabled "sufficiently smart compiler".
>>
>> What do you think compiler macros are used for?
>>
>> You're funny Tron3k.  All the "ideas" you bring here are already
>> implemented in Common Lisp.  What's new, Doc?
>
> I am aware of compiler macros. I am talking about being able to modify
> the process of transforming the syntax tree of a Lisp program to
> machine language.

Oh, like Fradet and Le Metayer.
  ACM Transactions on Programming Languages and Systems, 
  Vol 13, No. 1, January 1991, Pages 21-51

-- 
~jrm
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122043988.465739.250610@g47g2000cwa.googlegroups.com>
Joe Marshall wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > Pascal Bourguignon wrote:
> >> "Tron3k" <······@gmail.com> writes:
> >> > One promising idea is to move towards the greater ability to poke
> >> > around in the internals of the language. How about this: Lisp is known
> >> > as the programmable programming language; what we lack is a
> >> > programmable *compiler*. Imagine being able to define one's own code
> >> > generation optimizations when you have identified a bottleneck in your
> >> > program. Eventually a bank of many users' optimizations could be
> >> > collected together, resulting in something that might asymptotically
> >> > approach the fabled "sufficiently smart compiler".
> >>
> >> What do you think compiler macros are used for?
> >>
> >> You're funny Tron3k.  All the "ideas" you bring here are already
> >> implemented in Common Lisp.  What's new, Doc?
> >
> > I am aware of compiler macros. I am talking about being able to modify
> > the process of transforming the syntax tree of a Lisp program to
> > machine language.
>
> Oh, like Fradet and Le Metayer.
>   ACM Transactions on Programming Languages and Systems,
>   Vol 13, No. 1, January 1991, Pages 21-51

Hello. Is this the article you are talking about?

http://delivery.acm.org/10.1145/110000/102805/p21-fradet.pdf?key1=102805&key2=8573402211&coll=GUIDE&dl=ACM&CFID=50651501&CFTOKEN=3276221

If so, thank you *very* much! That is a really cool article which I
will study. I'd really like to play around with these ideas by actually
implementing them.
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <fyu6lrui.fsf@ccs.neu.edu>
"Tron3k" <······@gmail.com> writes:

> Joe Marshall wrote:
>> "Tron3k" <······@gmail.com> writes:
>>
>> > Pascal Bourguignon wrote:
>> >> "Tron3k" <······@gmail.com> writes:
>> >> > One promising idea is to move towards the greater ability to poke
>> >> > around in the internals of the language. How about this: Lisp is known
>> >> > as the programmable programming language; what we lack is a
>> >> > programmable *compiler*. Imagine being able to define one's own code
>> >> > generation optimizations when you have identified a bottleneck in your
>> >> > program. Eventually a bank of many users' optimizations could be
>> >> > collected together, resulting in something that might asymptotically
>> >> > approach the fabled "sufficiently smart compiler".
>> >>
>> >> What do you think compiler macros are used for?
>> >>
>> >> You're funny Tron3k.  All the "ideas" you bring here are already
>> >> implemented in Common Lisp.  What's new, Doc?
>> >
>> > I am aware of compiler macros. I am talking about being able to modify
>> > the process of transforming the syntax tree of a Lisp program to
>> > machine language.
>>
>> Oh, like Fradet and Le Metayer.
>>   ACM Transactions on Programming Languages and Systems,
>>   Vol 13, No. 1, January 1991, Pages 21-51
>
> Hello. Is this the article you are talking about?
>
> http://delivery.acm.org/10.1145/110000/102805/p21-fradet.pdf?key1=102805&key2=8573402211&coll=GUIDE&dl=ACM&CFID=50651501&CFTOKEN=3276221
>
> If so, thank you *very* much! That is a really cool article which I
> will study. I'd really like to play around with these ideas by actually
> implementing them.

That'd be the one.

You may also wish to look at dynamic optimizers such as Dynamo.
From: Paolo Amoroso
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackfm633.fsf@plato.moon.paoloamoroso.it>
"Tron3k" <······@gmail.com> writes:

> One way to generate new ideas is to think of all the strangest things
> we could possibly want to do, and make them possible. In one of your
[...]
> This approach seems promising. It seems to me that Lisp-space is an
> infinite domain in which we can see but a small distance away from
> where we are standing. Even heading in the wrong direction is better
> than stubbornly staying in the same place, although it seems many are
> comfortable there and resent being poked by those who have seen a small
> glimpse of the treasures that lie around them.
>
> "I do not know what I may appear to the world; but to myself I seem to
> have been only like a boy playing on the seashore, and diverting myself
> in now and then finding a smoother pebble or a prettier shell than
> ordinary, whilst the great ocean of truth lay all undiscovered before
> me." - Isaac Newton

Letting imagination loose in science and human creativity is
fascinating and rewarding.  But, in the field of industrial software
development, it may cost more to head in any direction, not
necessarily the wrong one, than to stubbornly stay in the same place.

This article from--shudder--a Microsoft employee is a glimpse at those
costs, and it may help explain why many feel "comfortable" in
resisting changes:

  How many Microsoft employees does it take to change a lightbulb?
  http://blogs.msdn.com/ericlippert/archive/2003/10/28/53298.aspx

Given Common Lisp's history, this language might not be the best
playground for wild experimentation.


Paolo
-- 
Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
Recommended Common Lisp libraries/tools:
- ASDF/ASDF-INSTALL: system building/installation
- CL-PPCRE: regular expressions
- UFFI: Foreign Function Interface
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <wtnjsa6m.fsf@comcast.net>
"paulgraham" <··@bugbear.com> writes:

> You don't have to be independently wealthy to make a new language.
> Larry Wall and Guido van Rossum and Matz weren't.
>
> The advantage they had over us in the Lisp world was that they
> started from a lower point.  Larry Wall, for example, started out
> trying to make a better awk.  That's not hard.  Awk is missing a
> lot.  Whereas we in the Lisp world are bumpimg up against the
> asymptote.  Among other things, we can't avail ourselves of the
> one of the richest sources of features for new languages: taking
> stuff from Lisp.  We have to invent genuinely new things.  Things,
> moreover, that people like John McCarthy, Gerry Sussman, and
> Guy Steele didn't think of.  That's not so easy.


  ``If I have not seen as far as others, it is because giants were
    standing on my shoulders.''
             --- Hal Abelson

-- 
~jrm
From: ··········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122037854.873248.270070@g49g2000cwa.googlegroups.com>
But there are no new ideas in Python or Ruby, and that didn't stop GvR
and Matz from making them.  These languages are just new packaging of
old ideas in ways that some people find appealing.  This is essentially
what Larry Wall means when he calls Perl "postmodern."  You could do
the same in Lisp.  For example, suppose your goal was to appeal to the
masses of programmers of more popular scripting languages.  You might
try including familiar constructs like "for" (don't tell me you can do
this with a macro in CL because this is for newbies and they'll start
off using plain vanilla Lisp), a more familiar object system, some more
syntax, etc (and I'm not saying this new Lisp is a good idea, just
giving an example).

By the definition of the target audience for this new Lisp, people who
use Common Lisp would be the very last people on Earth to have any
interest in it, because they know they can do the job in Common Lisp -
which is probably why nobody is doing this.  But there's no reason why
you have to come up with new ideas to invent a new Lisp, unless your
objective is to convert Common Lispers (a hopeless task).
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <oe8vkm4s.fsf@ccs.neu.edu>
··········@gmail.com writes:

> But there are no new ideas in Python or Ruby, and that didn't stop GvR
> and Matz from making them.  These languages are just new packaging of
> old ideas in ways that some people find appealing.  This is essentially
> what Larry Wall means when he calls Perl "postmodern."  You could do
> the same in Lisp.  For example, suppose your goal was to appeal to the
> masses of programmers of more popular scripting languages.  You might
> try including familiar constructs like "for" (don't tell me you can do
> this with a macro in CL because this is for newbies and they'll start
> off using plain vanilla Lisp), a more familiar object system, some more
> syntax, etc (and I'm not saying this new Lisp is a good idea, just
> giving an example).
>
> By the definition for the target audience for this new Lisp, people who
> use Common Lisp would be the very last people on Earth to have any
> interest in it, because they know they can do the job in Common Lisp -
> which is probably why nobody is doing this.  But there's no reason why
> you have to come up with new ideas to invent a new Lisp, unless your
> objective is to convert Common Lispers (a hopeless task).

This is one reason that there are so many Scheme implementations.

Nonetheless, there are few *industrial strength* Scheme
implementations.  Most are just toys and will probably remain so.
From: Paul F. Dietz
Subject: Re: Beyond CL?
Date: 
Message-ID: <8-KdnWVRwJuUYX3fRVn-uw@dls.net>
Joe Marshall wrote:

> Nonetheless, there are few *industrial strength* Scheme
> implementations.  Most are just toys and will probably remain so.

I think many of the 'let's change Lisp' people don't have
a good appreciation of the importance of maturity in
programming language implementations.  Making a compiler
or language system robust, efficient, and correct takes
a lot of work, and it's not something that you want
to discard without a strong justification.

That said, I'm all for tools that would help reduce
the cost of achieving and maintaining this state.

	Paul
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <ufyu6n8e5.fsf@nhplace.com>
"Paul F. Dietz" <·····@dls.net> writes:

> I think many of the 'let's change Lisp' people don't have
> a good appreciation of the importance of maturity in
> programming language implementations.

I think many of the "let's change things" people don't have a good
appreciation of the importance of maturity.
From: Tayssir John Gabbour
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121383008.148367.307870@f14g2000cwb.googlegroups.com>
Jamie Border wrote:
> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been
> thinking[2]..
>
> Couldn't we have a CL++ (sorry) with some nice modern features (libraries as
> good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and
> still keep the S-expression syntax (or lack thereof)?
>
> Aside: I can't see how moving away from s-exp works - won't macros be
> pathetically crippled?
>
> Ramble ramble ramble, but my real point is: why aren't there more Paul
> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does
> nobody care?

It's an economics problem. So let's think of what Lisp already
required.

* Teams of people who could lavish time on it. Lisp was not the work of
a single hand, as we can see from examining documents; and I think any
serious advance generally requires groups of people collaborating in
some fashion, despite what mediocre history books like to claim.

* Resources without expectation of profit. Lisp tore through both
government and investor cash.


Tayssir
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7u21$peh$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Tayssir John Gabbour" <···········@yahoo.com> wrote:
> Jamie Border wrote:
>> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've 
>> been
>> thinking[2]..
>>
>> Couldn't we have a CL++ (sorry) with some nice modern features (libraries 
>> as
>> good as Python, good profiling (yes, I've seen Allegro), blah blah blah, 
>> and
>> still keep the S-expression syntax (or lack thereof)?
>>
>> Aside: I can't see how moving away from s-exp works - won't macros be
>> pathetically crippled?
>>
>> Ramble ramble ramble, but my real point is: why aren't there more Paul
>> Grahams doing the same thing?  Is the problem so intrinsically hard? 
>> Does
>> nobody care?
>
> It's an economics problem. So let's think of what Lisp already
> required.
>
> * Teams of people who could lavish time on it. Lisp did not have a
> single hand, as we can see from examining documents; and I think any
> serious advance generally requires groups of people collaborating in
> some fashion, despite what mediocre history books like to claim.
>
> * Resources without expectation of profit. Lisp tore through both
> government and investor cash.
>
Increased profit?  What if CL got _most_ of the smart programmers, not just 
a few?

Jamie
>
> Tayssir
> 
From: ··········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121399441.632431.275010@g43g2000cwa.googlegroups.com>
I'm not quite sure I understand what you're suggesting - the features
you suggest could be built as libraries on top of CL, eliminating the
need for a new Lisp dialect.  But there have been and are people
implementing new Lisp dialects, most of which seem to generate little
interest.  If you waste enough time visiting Lisp-related websites,
you'll find them.

Paul Graham is something of a celebrity, so everybody knows about his
vaporware, but he's not the only one.  Goo, for example, is quite
similar to Arc, but it's not very well known at all (yet it may be the
most well known recent new Lisp dialect except for Arc).
From: Raffael Cavallaro
Subject: Re: Beyond CL?
Date: 
Message-ID: <2005071500045316807%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-07-14 23:50:41 -0400, ··········@gmail.com said:

> Goo, for example, is quite
> similar to Arc, but it's not very well known at all (yet it may be the
> most well known recent new Lisp dialect except for Arc).

Speaking of which, Jonathan Bachrach recently wrote that he'll be 
getting back to goo soon after working on other projects, and noted that

"Any specific requests would be appreciated."

the list is at:

<···················@lists.csail.mit.edu>
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jntn3FqtcscU1@individual.net>
Jamie Border wrote:
> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
> thinking[2]..
> 
> Couldn't we have a CL++ (sorry) with some nice modern features (libraries as 
> good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and 
> still keep the S-expression syntax (or lack thereof)?

People are already working on this. It's called programming. ;)

> Aside: I can't see how moving away from s-exp works - won't macros be 
> pathetically crippled?

Not necessarily. As long as you have a regular syntax, adding a macro 
system should remain relatively straightforward. The more irregularities 
you add, the harder it gets.

Macros are not the only reason for keeping the syntax simple.

> Ramble ramble ramble, but my real point is: why aren't there more Paul 
> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does 
> nobody care?

There are a number of alternative Lisp dialects being worked on, if 
that's what you mean. An important question is whether such attempted 
fragmentation of the community really buys you anything. It's really 
hard to beat the feature set of Common Lisp.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7sgi$bro$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
"Pascal Costanza" <··@p-cos.net> wrote:
>> Ramble ramble ramble, but my real point is: why aren't there more Paul 
>> Grahams doing the same thing?  Is the problem so intrinsically hard? 
>> Does nobody care?
>
> There are a number of alternative Lisp dialects being worked on, if that's 
> what you mean. An important question is whether such attempted 
> fragmentation of the community really buys you anything. It's really hard 
> to beat the feature set of Common Lisp.

Ah.  I failed to express myself properly.  What I meant was this:

The CL standardisation process took disparate Lisps and made them easy to 
modify to make them standard-compliant.

Couldn't we (well, probably not me, I'm still green) do a similar thing; 
taking the best features from Allegro, CMUCL etc, and standardise them?

That way we could have portable networking code etc. with no loss of CL 
wonderfuness.  Surely we (you guys) could shut the anti-Lisp crowd up a bit 
by giving them optional shorter names (mvb not multiple-value-bind, etc.). 
You don't have to want it yourself - make it a package (use-package 
incomprehensibly-short-names).

Surely if Torvalds can manage a whole bunch of hackers and build something 
as big as the Linux kernel, there could be a collaborative effort that might 
succeed.

My thought is that this _would_ buy something: less fragmentation.  Giving 
people standard libraries and (maybe) optionally shadowing some of the 
"COMMON-LISP" package wouldn't break a single line of existing code and 
would help portable code.

This would surely not remove the commercial advantage of people like Franz - 
if they don't have to do stupid stuff like maintain their own standards for 
things like networking and FFI, surely they could be doing more exciting 
stuff.

Oh, and it would stop the "Lisp is dead" rants.  Perhaps.

Jamie

>
>
> Pascal
>
> -- 
> 2nd European Lisp and Scheme Workshop
> July 26 - Glasgow, Scotland - co-located with ECOOP 2005
> http://lisp-ecoop05.bknr.net/ 
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jpd3rFr7eg1U1@individual.net>
Jamie Border wrote:
> Couldn't we (well, probably not me, I'm still green) do a similar thing; 
> taking the best features from Allegro, CMUCL etc, and standardise them?

Obviously, standardizing stuff like threads, pathnames, and other 
features would be great.  I guess the CLRFIs' point is just that.

> That way we could have portable networking code etc. with no loss of CL 
> wonderfuness.  Surely we (you guys) could shut the anti-Lisp crowd up a bit 
> by giving them optional shorter names (mvb not multiple-value-bind, etc.). 
> You don't have to want it yourself - make it a package (use-package 
> incomprehensibly-short-names).

defmacro

No, seriously, I used to think that mvb (and call/cc in Scheme) 
are horrid names, but when you have something like slime (i.e. a 
decent IDE) the length of names doesn't matter at all.  What 
matters is that names are different enough so typing three or four 
characters lets you expand the thing into the long symbol.
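
(That's all "mvb" would ever need to be, by the way: a two-line
abbreviation anyone can define for themselves,

  (defmacro mvb (vars values-form &body body)
    `(multiple-value-bind ,vars ,values-form ,@body))

with no new standard required.)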

> Surely if Torvalds can manage a whole bunch of hackers and build something 
> as big as the Linux kernel, there could be a collaborative effort that might 
> succeed.

Well, one is a hacked-together system that continually adds 
features, like a Lisp implementation.  The other is a standard, 
which takes a lot more work and especially coordination, as LOTS 
of people have to be asked to make it good.

> My thought is that this _would_ buy something: less fragmentation.  Giving 
> people standard libraries and (maybe) optionally shadowing some of the 
> "COMMON-LISP" package wouldn't break a single line of existing code and 
> would help portable code.

Why not?  There could be a new package that users can use instead 
of COMMON-LISP -- if they want.  OTOH the new package would 
probably just include CL, not really shadow anything.

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db818h$kal$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>
"Ulrich Hobelmann" <···········@web.de> wrote in message 
···················@individual.net...
> Jamie Border wrote:
>> Couldn't we (well, probably not me, I'm still green) do a similar thing; 
>> taking the best features from Allegro, CMUCL etc, and standardise them?
>
> Obviously, standardizing stuff like threads, pathnames, and other features 
> would be great.  I guess the CLRFIs' point is just that.
>
>> That way we could have portable networking code etc. with no loss of CL 
>> wonderfuness.  Surely we (you guys) could shut the anti-Lisp crowd up a 
>> bit by giving them optional shorter names (mvb not multiple-value-bind, 
>> etc.). You don't have to want it yourself - make it a package 
>> (use-package incomprehensibly-short-names).
>
> defmacro
>
> No, seriously, I used to think that mvb (and call/cc in Scheme) are horrid 
> names, but when you have something like slime (i.e. a decent IDE) the 
> length of names doesn't matter at all.  What

Yes, but screen space is as valuable to me as clarity, so it would be nice 
to shorten a few things in a way that wouldn't make my code "unique" (read 
"too hard to read").

> matters is that names are different enough so typing three or four 
> characters lets you expand the thing into the long symbol.
>
>> Surely if Torvalds can manage a whole bunch of hackers and build 
>> something as big as the Linux kernel, there could be a collaborative 
>> effort that might succeed.
>
> Well, one is a hacked-together system that continually adds features, like 
> a Lisp implementation.  The other is a standard, which takes a lot more 
> work and especially coordination, as LOTS of people have to be asked to 
> make it good.

But nobody seems to tell CLRFI.

>
>> My thought is that this _would_ buy something: less fragmentation. 
>> Giving people standard libraries and (maybe) optionally shadowing some of 
>> the "COMMON-LISP" package wouldn't break a single line of existing code 
>> and would help portable code.
>
> Why not?  There could be a new package that users can use instead of 
> COMMON-LISP -- if they want.  OTOH the new package would probably just 
> include CL, not really shadow anything.

Hmm.  I still look green, don't I?  Actually, I like the verbosity; my 
co-workers don't, and I'm changing jobs anyway, so maybe the problem goes 
away for me.

I don't think it's a bad idea, though, to ask people new to Lisp about 
things they'd change if they knew how (portable networking, fast, portable 
string-handling).

Jamie

>
> -- 
> XML is a prime example of retarded innovation.
> -- Erik Meijer and Peter Drayton, Microsoft Corporation 
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jplm9FqstdbU1@individual.net>
Jamie Border wrote:
> Hmm.  I still look green, don't I?  Actually, I like the verbosity, my 
> co-workers don't, and I'm changing jobs anyway, so maybe the problem goes 
> away for me.

I think it's mostly self-development and education.  I used to 
like terse languages, because I used vi and refused to use any 
tools.  Now I've started using Lisp and Objective C, and though my 
code is much more verbose, it's actually *more* readable, and I 
type *less* than before, because most names are completed quickly 
by SLIME or XCode.

And looking into the Java world (or even open-source C, like gtk) 
most people use verbose method names anyway.  The difference is 
that in Lisp you have verbosity, but you avoid code duplication by 
use of macros.  So the code that's there is both very readable and 
very concise, while C code may be terse, but still cluttered.

> I don't think it's a bad idea, though, to ask people new to Lisp about 
> things they'd change if they knew how (portable networking, fast, portable 
> string-handling).

Not at all.

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jphecFqn40vU1@individual.net>
Jamie Border wrote:
> "Pascal Costanza" <··@p-cos.net> wrote:
> 
>>>Ramble ramble ramble, but my real point is: why aren't there more Paul 
>>>Grahams doing the same thing?  Is the problem so intrinsically hard? 
>>>Does nobody care?
>>
>>There are a number of alternative Lisp dialects being worked on, if that's 
>>what you mean. An important question is whether such attempted 
>>fragmentation of the community really buys you anything. It's really hard 
>>to beat the feature set of Common Lisp.
> 
> Ah.  I failed to express myself properly.  What I meant was this:
> 
> The CL standardisation process took disparate Lisps and made them easy to 
> modify to make them standard-compliant.
> 
> Couldn't we (well, probably not me, I'm still green) do a similar thing; 
> taking the best features from Allegro, CMUCL etc, and standardise them?

These things already happen. See for example 
http://www.cliki.net/ACL-COMPAT or http://uffi.b9.com/

It's important to understand the following: In many other languages, you 
need some kind of standardization efforts because integrating stuff 
often requires changes to the respective languages. This is not always 
obvious, but there is a reason why most languages continually grow, and 
often in incompatible ways. If there is a strong gap between 
language-level and library-level features, and as soon as you add 
sophisticated APIs to languages that started out too simplistic people 
start to feel a need to make the language as such more flexible. I think 
this is inevitable.

In Lisp, this kind of pressure to make the language more flexible 
doesn't exist. The boundary between language and libraries is so blurred 
that you can very easily add features that pretend to be new language 
constructs. This effectively means that you can go quite a long way by 
just implementing libraries before the need for "standardization" ever 
arises.

People are just doing that, and this works quite well.



Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Robert Uhl
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3k6js6ot3.fsf@4dv.net>
Pascal Costanza <··@p-cos.net> writes:
>
> The boundary between language and libraries is so blurred that you can
> very easily add features that pretend to be new language
> constructs. This effectively means that you can go quite a long way by
> just implementing libraries before the need for "standardization" ever
> arises.

But standardisation does offer certain unique advantages, chief among
them the guarantee that a conforming implementation will offer certain
features.  E.g. if the standard included split-string, we wouldn't all
need to grab the code and add it to our utility libraries (and possibly
extend it in incompatible ways...).  If the standard more fully defined
what the components of a pathname indicate, we wouldn't need to each
play about with portable pathname libraries.  If the standard defined
once and for all a canonical MOP, we could just rely on the thing.
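
(For the sake of argument, here is roughly the utility everyone keeps
re-adding, in one of its many subtly incompatible variants:

  (defun split-string (string &key (separator #\Space))
    (loop with len = (length string)
          for start = 0 then (1+ end)
          for end = (or (position separator string :start start) len)
          collect (subseq string start end)
          until (>= end len)))

Should it drop empty substrings?  Take a set of separators?  A test
function?  Every utility library answers those questions differently,
which is the incompatibility above.)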

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
I really have to hand it to Fox this time, they've hit an all new low.
And I just love new lows.  They remind me that progress is growth, and
growth is a two way street.                              --Rick Felice
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jq5vaFqtkulU1@individual.net>
Robert Uhl wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>The boundary between language and libraries is so blurred that you can
>>very easily add features that pretend to be new language
>>constructs. This effectively means that you can go quite a long way by
>>just implementing libraries before the need for "standardization" ever
>>arises.
> 
> But standardisation does offer certain unique advantages, chief among
> them the guarantee that a conforming implementation will offer certain
> features.  E.g. if the standard included split-string, we wouldn't all
> need to grab the code and add it to our utility libraries (and possibly
> extend it in incompatible ways...).  If the standard more fully defined
> what the components of a pathname indicate, we wouldn't need to each
> play about with portable pathname libraries.  If the standard defined
> once and for all a canonical MOP, we could just rely on the thing.

Sure, but this requires that some dedicated groups of people work out 
all the details of the respective specifications. That's the hard part.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Edi Weitz
Subject: Re: Beyond CL?
Date: 
Message-ID: <ur7e0hwf6.fsf@agharta.de>
On Fri, 15 Jul 2005 18:18:50 +0200, Pascal Costanza <··@p-cos.net> wrote:

> Sure, but this requires that some dedicated groups of people work
> out all the details of the respective specifications. That's the
> hard part.

And then all existing implementations should better implement these
new specifications.  That's the expensive part.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db87aq$j8j$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Pascal Costanza" <··@p-cos.net> wrote:
> Jamie Border wrote:
>> "Pascal Costanza" <··@p-cos.net> wrote:
>>
>>>>Ramble ramble ramble, but my real point is: why aren't there more Paul 
>>>>Grahams doing the same thing?  Is the problem so intrinsically hard? 
>>>>Does nobody care?
>>>
>>>There are a number of alternative Lisp dialects being worked on, if 
>>>that's what you mean. An important question is whether such attempted 
>>>fragmentation of the community really buys you anything. It's really hard 
>>>to beat the feature set of Common Lisp.
>>
>> Ah.  I failed to express myself properly.  What I meant was this:
>>
>> The CL standardisation process took disparate Lisps and made them easy to 
>> modify to make them standard-compliant.
>>
>> Couldn't we (well, probably not me, I'm still green) do a similar thing; 
>> taking the best features from Allegro, CMUCL etc, and standardise them?
>
> These things already happen. See for example 
> http://www.cliki.net/ACL-COMPAT or http://uffi.b9.com/
>
> It's important to understand the following: In many other languages, you 
> need some kind of standardization efforts because integrating stuff often 
> requires changes to the respective languages. This is not always obvious, 
> but there is a reason why most languages continually grow, and often in 
> incompatible ways. If there is a strong gap between language-level and 
> library-level features, and as soon as you add sophisticated APIs to 
> languages that started out too simplistic people start to feel a need to 
> make the language as such more flexible. I think this is inevitable.
>
> In Lisp, this kind of pressure to make the language more flexible doesn't 
> exist. The boundary between language and libraries is so blurred that you 
> can very easily add features that pretend to be new language constructs. 
> This effectively means that you can go quite a long way by just 
> implementing libraries before the need for "standardization" ever arises.
>
> People are just doing that, and this works quite well.

Yes.  I suppose any particular feature/package/whatever that is well-written 
and does the job becomes the "standard" until it gets bettered by something 
new.

Not having much exposure to Lisp _culture_, I am just starting to see how 
this would make most people run screaming.  But there is a difference 
between forcing ANSI-style standards on people and offering a consistent 
(which is what I think I should have said) way of doing things they find 
interesting.

Maybe all it needs is for somebody to assemble a reasonably large collection 
of good stuff (I'm thinking UFFI, Iterate, a lot of CLOCC) and position it 
as "a bunch of stuff aimed at Lisp Newbies to help them do useful stuff 
faster".

That way, people like me wouldn't be pissing off half the guys in c.l.l by 
asking (to them) stupid questions.

Maybe fewer people would be turned off if they saw that they can write their 
(my) pathetic little programs without such a steep learning curve.

Maybe it wouldn't make a blind bit of difference.  I don't know any more. 
But at least I'm not afraid to say "I don't know".

Jamie
>
> Pascal
>
> -- 
> 2nd European Lisp and Scheme Workshop
> July 26 - Glasgow, Scotland - co-located with ECOOP 2005
> http://lisp-ecoop05.bknr.net/ 
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jpnm8Fra8q9U1@individual.net>
Jamie Border wrote:

> Yes.  I suppose any particular feature/package/whatever that is well-written 
> and does the job becomes the "standard" until it gets bettered by something 
> new.

Note that this is what happens anyway in almost any language, no matter 
how "strong" their standardization mechanisms are.

> Not having much exposure to Lisp _culture_ I am just starting to see how 
> this would make most  people run screaming.  But there is a difference 
> between forciing ANSI-style standards on people and offering consistent 
> (which is what I think I should have said) way of doing things they find 
> interesting.
> 
> Maybe all it needs is for somebody to assemble a reasonably large collection 
> of good stuff (I'm thinking UFFI, Iterate, a lot of CLOCC) and position it 
> as "a bunch of stuff aimed at Lisp Newbies to help them do useful stuff 
> faster".

CL implementations typically come packaged with a number of useful 
extensions, so this already happens. Just pick the implementation that 
you think suits you most (i.e., has the additional libraries / features 
that you want to use). The danger of vendor lock-in is very low.

It would surely be great if there were some more coherence across 
several CL implementations wrt such extensions. But this is really, 
really hard to achieve. Note that, for example, ANSI Common Lisp itself 
wasn't achieved in a day. It actually took some 15 years to be 
finalized, although almost all of the features were already available in 
some form or other. It's important to realize that this is what 
you're asking for.

Standards should reflect practice that a considerable large amount of 
people agree is good. Standardization itself is not a goal, just a means 
to an end. What happens in a lot of other places is that things get 
standardized without any, or with very little evidence that they 
actually work in practice. Taking into account that the Lisp community 
is relatively small and has relatively little money, I think it's much 
preferable to actually get important things done now and worry about 
streamlining them later when there is enough impetus.

Note that there are already some efforts to do something along the lines 
that you propose, like the CLRFI process. But again, these things are 
not as simple as they seem and they definitely take their time.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: O-MY-GLIFE
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121455773.560752.87250@g14g2000cwa.googlegroups.com>
Jamie Border wrote:

> That way we could have portable networking code etc. with no loss of CL
> wonderfuness.
  ^^^^^^^^^^^^

Nyce typo. Congrats.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db98ji$bej$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>
"O-MY-GLIFE" <·······@seznam.cz> wrote:
> Jamie Border wrote:
>
>> That way we could have portable networking code etc. with no loss of CL
>> wonderfuness.
>  ^^^^^^^^^^^^
>
> Nyce typo. Congrats.
    ^^^^

Sophisticated sense of humour.  Congrats.

Jamie
From: David Steuber
Subject: Re: Beyond CL?
Date: 
Message-ID: <87wtnogjq1.fsf@david-steuber.com>
"Jamie Border" <·····@jborder.com> writes:

> Surely if Torvalds can manage a whole bunch of hackers and build something 
> as big as the Linux kernel, there could be a collaborative effort that might 
> succeed.

Have a look at the number of free lisp implementations under active
development.  For a small community, there seems to be a wide variation 
in opinion on the right way to implement Common Lisp and vendor-specific 
extensions.

This extends to the commercial lisps as well.  They have the
advantage of being paid to develop lisp so long as they have enough
customers.

Common Lisp has more per-user variety than Linux does.

-- 
My .sig file sucks.  Can anyone recommend a better one?
From: R. Mattes
Subject: Re: Beyond CL?
Date: 
Message-ID: <pan.2005.07.14.21.37.37.864742@mh-freiburg.de>
On Thu, 14 Jul 2005 18:48:05 +0000, Jamie Border wrote:

> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
> thinking[2]..
> 
> Couldn't we have a CL++ (sorry)

Wouldn't that be (incf CL) - or short INCL ?-)

> with some nice modern features (libraries as 
> good as Python, 

Funny, the python libs i had to work with were of rather mediocre quality.
There is _lots_ of python code, agreed, but quantity isn't quality (and
the same is true for CL libs as well ..).

> good profiling (yes, I've seen Allegro), blah blah blah, and still keep
> the S-expression syntax (or lack thereof)?
> 
> Aside: I can't see how moving away from s-exp works - won't macros be
> pathetically crippled?
> 
> Ramble ramble ramble, but my real point is: why aren't there more Paul
> Grahams doing the same thing?  Is the problem so intrinsically hard?
> Does nobody care?

I think people do care - see common-lisp.net and the loads of Debian CL
packages. But it does take time -- and we all need to earn money, don't we?

 Cheers RalfD

> Jamie
> 
> [1] http://www.paulgraham.com/arc.html [2] bad idea
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7t92$ahj$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>
"R. Mattes" <··@mh-freiburg.de> wrote:

Me>> with some nice modern features (libraries as
Me>> good as Python,
>
> Funny, the python libs i had to work with were of rather mediocre quality.

OK, read "good" as "extensive".

> There is _lots_ of python code, agreed, but quantity isn't quality (and
> the same is true for CL libs as well ..).
>
>> good profiling (yes, I've seen Allegro), blah blah blah, and still keep
>> the S-expression syntax (or lack thereof)?
>>
>> Aside: I can't see how moving away from s-exp works - won't macros be
>> pathetically crippled?
>>
>> Ramble ramble ramble, but my real point is: why aren't there more Paul
>> Grahams doing the same thing?  Is the problem so intrinsically hard?
>> Does nobody care?
>
> I think people do care - see common-lisp.net and the loads of Debian CL
> packages. But it does take time -- and we all need to earn money, don't 
> we?
>
> Cheers RalfD
>
>> Jamie
>>
>> [1] http://www.paulgraham.com/arc.html
>> [2] bad idea
From: Robert Uhl
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3fyug6ojp.fsf@4dv.net>
"R. Mattes" <··@mh-freiburg.de> writes:
>
> Funny, the python libs i had to work with were of rather mediocre
> quality.  There is _lots_ of python code, agreed, but quantity isn't
> quality (and the same is true for CL libs as well ..).

The python standard library is, if not perfect, pretty damned useful.
E.g. pickle: the ability to take _any_ data structure, write it to a
file, and read it in later.  That's pretty sweet, and as far as I know
there is no CL equivalent.  Or string, which offers more features than
CL's strings.  Or re, the regexp package.  Or unittest.  And _all_ of
these are _guaranteed_ to exist on every single Python installation.
Which is pretty nice.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
Brought to you by 'Ouchies', the sharp, prickly toy you bathe with...
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jq6efFr62e9U1@individual.net>
Robert Uhl wrote:
> "R. Mattes" <··@mh-freiburg.de> writes:
> 
>>Funny, the python libs i had to work with were of rather mediocre
>>quality.  There is _lots_ of python code, agreed, but quantity isn't
>>quality (and the same is true for CL libs as well ..).
> 
> The python standard library is, if not perfect, pretty damned useful.
> E.g. pickle: the ability to take _any_ data structure, write it to a
> file, and read it in later.  That's pretty sweet, and as far as I know
> there is no CL equivalent.

print and read, see http://www.pentaside.org/paper/persistence-lemmens.txt

> Or string, which offers more features than
> CL's strings.  Or re, the regexp package.

http://www.weitz.de/cl-ppcre/

Don't know what you mean wrt strings.

> Or unittest.

http://www.cliki.net/Test%20Framework

> And _all_ of
> these are _guaranteed_ to exist on every single Python installation.
> Which is pretty nice.

...but not across different versions of Python. And not across different 
dialects of Python. Which is pretty bad.

I think portability across implementations and across time is much 
better in Common Lisp.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Robert Uhl
Subject: Re: Beyond CL?
Date: 
Message-ID: <m37jfo2l9b.fsf@4dv.net>
Pascal Costanza <··@p-cos.net> writes:
>
>> The python standard library is, if not perfect, pretty damned useful.
>> E.g. pickle: the ability to take _any_ data structure, write it to a
>> file, and read it in later.  That's pretty sweet, and as far as I
>> know there is no CL equivalent.
>
> print and read, see http://www.pentaside.org/paper/persistence-lemmens.txt

_Not_ the same thing, you've gotta admit: pickle Just Works, while print
and read don't.  I'll grant that you need to do a few evil things with
pickle and external (e.g. C-based) data types.

>> And _all_ of these are _guaranteed_ to exist on every single Python
>> installation.  Which is pretty nice.
>
> ...but not across different versions of Python. And not across
> different dialects of Python. Which is pretty bad.
>
> I think portability across implementations and across time is much
> better in Common Lisp.

There's CLtL1 and CLtL2 and ANSI CL--those are different versions too.
Granted, it's nice having a steady target, but it's also nice to gain
new functionality.  I don't know enough about the changes in CL to
compare to those in python.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
Skill without imagination is craftsmanship and gives us many useful
objects such as wickerwork picnic baskets.  Imagination without skill
gives us modern art.                                   --Tom Stoppard
From: Juliusz Chroboczek
Subject: Serialising Lisp objects [was: Beyond CL?]
Date: 
Message-ID: <7i8y01efe6.fsf_-_@lanthane.pps.jussieu.fr>
>>> E.g. pickle: the ability to take _any_ data structure, write it to a
>>> file, and read it in later.  That's pretty sweet, and as far as I
>>> know there is no CL equivalent.

>> print and read, see http://www.pentaside.org/paper/persistence-lemmens.txt

> _Not_ the same thing, you've gotta admit: pickle Just Works, while print
> and read don't.  I'll grant that you need to do a few evil things with
> pickle and external (e.g. C-based) data types.

PRINT and READ just work, as long as you know what you're doing.
(This means defining print methods that do something reasonable when
*PRINT-READABLY* is true, and of course wrapping your print calls in
WITH-STANDARD-IO-SYNTAX.)
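
For instance, a minimal sketch along these lines (the function names and
the pathname are just made up for the example) is enough for plain lists,
numbers, strings and symbols:

(defun save-object (object pathname)
  (with-open-file (out pathname :direction :output :if-exists :supersede)
    ;; WITH-STANDARD-IO-SYNTAX binds the printer/reader variables to
    ;; known values, including *PRINT-READABLY* = T, so anything that
    ;; cannot be printed back in signals an error instead of lying.
    (with-standard-io-syntax
      (print object out)))
  pathname)

(defun load-object (pathname)
  (with-open-file (in pathname :direction :input)
    (with-standard-io-syntax
      (read in))))

;; (save-object '("fred" 2/3 #\a (1 2 3)) #p"/tmp/demo.sexp")
;; (load-object #p"/tmp/demo.sexp")  =>  ("fred" 2/3 #\a (1 2 3))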

The main problem with PRINT/READ is performance.  When you need better
performance, you should use the fasloader (and MAKE-LOAD-FORM) instead.
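
For the MAKE-LOAD-FORM route, a sketch like this (POINT is an invented
class, purely for illustration) is what lets COMPILE-FILE dump literal
instances into a fasl and rebuild them at load time:

(defclass point ()
  ((x :initarg :x :reader point-x)
   (y :initarg :y :reader point-y)))

(defmethod make-load-form ((p point) &optional environment)
  ;; Standard helper: returns forms that re-create P slot by slot
  ;; when the compiled file is loaded.
  (make-load-form-saving-slots p :environment environment))

;; A literal instance, e.g. #.(make-instance 'point :x 1 :y 2) in a
;; file given to COMPILE-FILE, is now dumped via the method above.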

Unfortunately, the only standard interface to the fasloader is LOAD,
which means that you need to do some unpleasant hacking in order to
portably fasdump random objects.

Most implementations of Common Lisp do provide some sort of sane
interface to the fasloader, though.  It is a pity that no de facto
standard exists for that.

(If anyone could write a cross-platform wrapper to the fasdumper
interface, you'd be doing me a favour.)

                                        Juliusz
From: ···········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121554193.816464.232690@f14g2000cwb.googlegroups.com>
Robert Uhl wrote:
> "R. Mattes" <··@mh-freiburg.de> writes:
> >
> > Funny, the python libs i had to work with were of rather mediocre
> > quality.  There is _lots_ of python code, agreed, but quantity isn't
> > quality (and the same is true for CL libs as well ..).
>
> The python standard library is, if not perfect, pretty damned useful.
> E.g. pickle: the ability to take _any_ data structure, write it to a
> file, and read it in later.

You might want to look at the asdf-installable cl-store:
http://common-lisp.net/project/cl-store/
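
The usual pattern is something like this (untested sketch; STORE and
RESTORE are the entry points the project documents):

(asdf:oos 'asdf:load-op :cl-store)

(defparameter *data*
  (list "strings" #(1 2 3) (make-hash-table :test #'equal)))

;; Serialize to a file ...
(cl-store:store *data* #p"/tmp/data.store")

;; ... and rebuild an equivalent copy later, possibly in another image.
(defparameter *data-again* (cl-store:restore #p"/tmp/data.store"))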

Cheers.
hy
From: Immanuel Litzroth
Subject: Re: Beyond CL?
Date: 
Message-ID: <3bqbc5ex.fsf@immanuelinux.site>
Robert Uhl <·········@NOSPAMgmail.com> writes:

> "R. Mattes" <··@mh-freiburg.de> writes:
>>
>> Funny, the python libs i had to work with were of rather mediocre
>> quality.  There is _lots_ of python code, agreed, but quantity isn't
>> quality (and the same is true for CL libs as well ..).
>
> The python standard library is, if not perfect, pretty damned useful.
> E.g. pickle: the ability to take _any_ data structure, write it to a
> file, and read it in later.  That's pretty sweet, and as far as I know
> there is no CL equivalent.  Or string, which offers more features than
> CL's strings.  Or re, the regexp package.  Or unittest.  And _all_ of
> these are _guaranteed_ to exist on every single Python installation.
> Which is pretty nice.
>
Well, after having read the Python documentation I am amazed at your
"the ability to take _any_ data structure".

After pickling the function f, redefining f, and then loading the pickle,
its behaviour follows the new definition! That's what I call persistence.
Immanuel
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jpd9gFr7eg1U2@individual.net>
R. Mattes wrote:
> Wouldn't that be (incf CL) - or short INCL ?-)

It's not about adding, it's about consing.

'(LISP)

(ok, maybe consing nil isn't too much of an improvement ;) )

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackofgxa.fsf@thalassa.informatimago.com>
Ulrich Hobelmann <···········@web.de> writes:

> R. Mattes wrote:
>> Wouldn't that be (incf CL) - or short INCL ?-)
>
> It's not about adding, it's about consing.
>
> '(LISP)
>
> (ok, maybe consing nil isn't too much of an improvement ;) )

(cons 'common (cons 'lisp (cons '++ nil)))

-- 
__Pascal_Bourguignon__               _  Software patents are endangering
()  ASCII ribbon against html email (o_ the computer industry all around
/\  1962:DO20I=1.100                //\ the world http://lpf.ai.mit.edu/
    2001:my($f)=`fortune`;          V_/   http://petition.eurolinux.org/
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <873bqhgh8x.fsf@thalassa.informatimago.com>
"Jamie Border" <·····@jborder.com> writes:

> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
> thinking[2]..
>
> Couldn't we have a CL++ (sorry) with some nice modern features (libraries as 
> good as Python, good profiling (yes, I've seen Allegro), blah blah blah, and 
> still keep the S-expression syntax (or lack thereof)?
>
> Aside: I can't see how moving away from s-exp works - won't macros be 
> pathetically crippled?
>
> Ramble ramble ramble, but my real point is: why aren't there more Paul 
> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does 
> nobody care?

Nobody cares.  If you want more libraries, just write them in Common Lisp.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

This is a signature virus.  Add me to your signature and help me to live
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7trs$fb5$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
"Pascal Bourguignon" <···@informatimago.com> wrote:

Me>> Ramble ramble ramble, but my real point is: why aren't there more Paul
Me>> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does
Me>> nobody care?
>
> Nobody cares.  If you want more libraries, just write them in Common Lisp.

Yes.  That way we will all reinvent similar wheels.  And mine will be crap 
because I'm so new to Lisp :-(

>
> -- 
> __Pascal Bourguignon__                     http://www.informatimago.com/
>
> This is a signature virus.  Add me to your signature and help me to live 
From: Tim X
Subject: Re: Beyond CL?
Date: 
Message-ID: <873bqgpee5.fsf@tiger.rapttech.com.au>
"Jamie Border" <·····@jborder.com> writes:

> "Pascal Bourguignon" <···@informatimago.com> wrote:
> 
> Me>> Ramble ramble ramble, but my real point is: why aren't there more Paul
> Me>> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does
> Me>> nobody care?
> >
> > Nobody cares.  If you want more libraries, just write them in Common Lisp.
> 
> Yes.  That way we will all reinvent similar wheels.  And mine will be crap 
> because I'm so new to Lisp :-(

Ah yes, but it will be much better quality and more flexible crap than
a wheel re-invented in C or Java!

-- 
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you 
really need to send mail, you should be able to work it out!
From: ····@lycos.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121423119.002439.6790@o13g2000cwo.googlegroups.com>
If you write some library, there are plenty of places to post it. If it's
useful, there won't be a problem finding somebody eager to help make it
better. Every time we need something nontrivial, we Google first before
writing it ourselves.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db828b$q5q$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
"Tim X" <····@spamto.devnul.com> wrote:
> "Jamie Border" <·····@jborder.com> writes:
>
>> "Pascal Bourguignon" <···@informatimago.com> wrote:
>>
>> Me>> Ramble ramble ramble, but my real point is: why aren't there more Paul
>> Me>> Grahams doing the same thing?  Is the problem so intrinsically hard?  Does
>> Me>> nobody care?
>> >
>> > Nobody cares.  If you want more libraries, just write them in Common Lisp.
>>
>> Yes.  That way we will all reinvent similar wheels.  And mine will be 
>> crap
>> because I'm so new to Lisp :-(
>
> Ah yes, but it will be much better quality and more flexible crap than
> a wheel re-invented in C or Java!
>
That's what keeps me going.  All the junk I write seems to be a little bit 
better, and I'm enjoying it more.

> -- 
> Tim Cross
> The e-mail address on this message is FALSE (obviously!). My real e-mail 
> is
> to a company in Australia called rapttech and my login is tcross - if you
> really need to send mail, you should be able to work it out! 
From: ····@lycos.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121419900.629151.147070@g49g2000cwa.googlegroups.com>
Question:
What's the term for the green programmer who, instead of exercising and
practicing his coding skills, tries to draw attention to himself by posting
various "hot" threads in forums and newsgroups?

If there's none, give your suggestions, 'cause the above really are floating
around.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db81s0$p72$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
<····@lycos.com> wrote in message
> Question:
> What's the term for the green programmer who, instead of exercising and
> practicing his coding skills, tries to draw attention to himself by posting
> various "hot" threads in forums and newsgroups?

Green? No.  New to Lisp, yes.

I am writing as much CL as I can.  I've replaced some dumass utilities that 
I use at work with cells-gtk based stuff, and I'm trying to understand the 
CPS code-walker in UnCommon Web.

I'm trying, but I'm also so new that I don't care if I'm asking old 
questions.

If I wanted to draw attention to myself I would start talking about game 
development in Lisp.

Jamie

>
> If there's none, give your suggestions, 'cause the above really are floating
> around.
> 
From: Tim X
Subject: Re: Beyond CL?
Date: 
Message-ID: <87y888nzh3.fsf@tiger.rapttech.com.au>
"Jamie Border" <·····@jborder.com> writes:

> 
> If I wanted to draw attention to myself I would start talking about game 
> development in Lisp.
> 

Surely you mean you would start talking about how lisp is no good for
game programming because it doesn't have the right GL library in a
compact easy to use format that doesn't depend on some other library
and is portable to every platform, even those which haven't been
invented yet.

-- 
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you 
really need to send mail, you should be able to work it out!
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db8330$8og$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Tim X" <····@spamto.devnul.com> wrote:
> "Jamie Border" <·····@jborder.com> writes:
>
>>
>> If I wanted to draw attention to myself I would start talking about game
>> development in Lisp.
>>
>
> Surely you mean you would start talking about how lisp is no good for
> game programming because it doesn't have the right GL library in a
> compact easy to use format that doesn't depend on some other library
> and is portable to every platform, even those which haven't been
> invented yet.

Tim, you've made my day.

>
> -- 
> Tim Cross
> The e-mail address on this message is FALSE (obviously!). My real e-mail 
> is
> to a company in Australia called rapttech and my login is tcross - if you
> really need to send mail, you should be able to work it out! 
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <3iGBe.696$Ow4.351053@twister.nyc.rr.com>
Jamie Border wrote:
> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and I've been 
> thinking[2]..
> 
> Couldn't we have a CL++

Congratulations, you have earned a membership in the Complete Fucking 
Morons of Programming Club. These are people who look at the best 
programming language available and worry about why it is not better. A 
necessary requirement is not being smart enough to understand how much 
better the better language is, such that in their wildest dreams they 
could not be held back by said language.

The good news is that the denizens of comp.lang.lisp will leap frothing 
at this pavlovian bell and discuss the profundities of your post for 
weeks to come.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: ··········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121405544.653449.122480@g47g2000cwa.googlegroups.com>
If speculating about the possibility of Lisps superior to CL is enough
to get you into the Complete Fucking Morons of Programming Club, is it
Henry Baker or John McCarthy that teaches you the secret handshake?
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <0lKBe.1087$Na6.397324@twister.nyc.rr.com>
··········@gmail.com wrote:
> If speculating about the possibility of Lisps superior to CL is enough
> to get you into the Complete Fucking Morons of Programming Club, is it
> Henry Baker or John McCarthy that teaches you the secret handshake?

That gets you into the Bullshit Debating Tactics Club.


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Brian
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121434325.346109.83090@o13g2000cwo.googlegroups.com>
Actually, he's quite right.  I think that your post had no value
whatsoever.  Sadly, I see a lot of inflammatory posts with no
constructive value at all in this group...
From: Russell McManus
Subject: Re: Beyond CL?
Date: 
Message-ID: <87d5pkw3fi.fsf@cl-user.org>
"Brian" <········@gmail.com> writes:

> Actually, he's quite right.  I think that your post had no value
> whatsoever.  Sadly, I see a lot of inflammatory posts with no
> constructive value at all in this group...

welcome to usenet
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db8go9$f16$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Brian" <········@gmail.com> wrote:
> Actually, he's quite right.  I think that your post had no value
> whatsoever.  Sadly, I see a lot of inflammatory posts with no
> constructive value at all in this group...

No, they have amusement value.  I don't think that it does any harm to get 
called a fool by somebody as obviously talented as Kenny.

Jamie
From: William D Clinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121565825.890290.6020@g49g2000cwa.googlegroups.com>
Kenny Tilton wrote:
> > If speculating about the possibility of Lisps superior to CL is enough
> > to get you into the Complete Fucking Morons of Programming Club, is it
> > Henry Baker or John McCarthy that teaches you the secret handshake?
>
> That gets you into the Bullshit Debating Tactics Club.

Where Kenny is a certified instructor for the secret handshake?

Will
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121574175.330432.203200@g49g2000cwa.googlegroups.com>
I find the attitude that it is wrong to suggest improvements for Common
Lisp short-sighted and pathetic.
From: Paul F. Dietz
Subject: Re: Beyond CL?
Date: 
Message-ID: <XuudnTA1jKhVz0ffRVn-gA@dls.net>
Tron3k wrote:

> I find the attitude that it is wrong to suggest improvements for Common
> Lisp short-sighted and pathetic.

Improvements are just fine.  But they actually have to *be*
improvements, and be worth the cost of adding them.

	Paul
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3vf38t57u.fsf@rigel.goldenthreadtech.com>
"Paul F. Dietz" <·····@dls.net> writes:

> Tron3k wrote:
> 
> > I find the attitude that it is wrong to suggest improvements for Common
> > Lisp short-sighted and pathetic.
> 
> Improvements are just fine.  But they actually have to *be*
> improvements, and be worth the cost of adding them.

Even more apropos in this case: worth the cost of even _looking_ at them.

/Jon

> 
> 	Paul

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Tayssir John Gabbour
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121621190.635634.192770@g47g2000cwa.googlegroups.com>
Tron3k wrote:
> I find the attitude that it is wrong to suggest improvements for Common
> Lisp short-sighted and pathetic.

I've listened to some political people speak about their "leftist"
experience, and they have an interesting problem of detachment from
others.

What happens is, most people don't like to hear the problems of the
status quo if they don't really feel they can change it. Sane people
make the best out of their situations; they adjust to what the market
provides. You might not care for a prison commissary now, but after a
while in prison you'd adjust to it.

Same with bad TV: we adjust to it, and even dislike good TV shows if
that means 6 nights out of the TV week are ruined because we can only
enjoy 1. In fact, people don't even have to "like" bad TV if it at
least gives them something to talk about with others.

This even applies to Common Lispers -- when we criticize other
languages, you'll literally hear people saying, "Well, having that much
power is terrible!" The pushback is frequently emotional, and in
practice the only way I know to take conversations out of that zone is
by being honest and rational about Lisp's faults. And offer them a
hopefully enjoyable learning opportunity like Peter Seibel's book, so
they don't have to crawl through boring BS which overhypes lists and
recursion.

(Unless that's actually interesting to them, in which case there's the
SICP vids.)

As I see from your comments, you perhaps have a greater feeling of
power to create your own dialect than many Common Lispers. This may
explain the curious sense you're getting that we're hypocritical. And
we are. But there are worse hypocrisies.

Anyway, that explanation sounds persuasive to me, though I don't know
whether it's entirely true.
http://www.zmag.org/audio/albnll.ram


Tayssir
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121625925.175476.166650@g47g2000cwa.googlegroups.com>
Tayssir John Gabbour wrote:
> Tron3k wrote:
> > I find the attitude that it is wrong to suggest improvements for Common
> > Lisp short-sighted and pathetic.
>
> I've listened to some political people speak about their "leftist"
> experience, and they have an interesting problem of detachment from
> others.
>
> What happens is, most people don't like to hear the problems of the
> status quo if they don't really feel they can change it. Sane people
> make the best out of their situations; they adjust to what the market
> provides. You might not care for a prison commissary now, but after a
> while in prison you'd adjust to it.
>
> Same with bad TV: we adjust to it, and even dislike good TV shows if
> that means 6 nights out of the TV week are ruined because we can only
> enjoy 1. In fact, people don't even have to "like" bad TV if it at
> least gives them something to talk about with others.
>
> This even applies to Common Lispers -- when we criticize other
> languages, you'll literally hear people saying, "Well, having that much
> power is terrible!" The pushback is frequently emotional, and in
> practice the only way I know to take conversations out of that zone is
> by being honest and rational about Lisp's faults. And offer them a
> hopefully enjoyable learning opportunity like Peter Seibel's book, so
> they don't have to crawl through boring BS which overhypes lists and
> recursion.
>
> (Unless that's actually interesting to them, in which case there's the
> SICP vids.)
>
> As I see from your comments, you perhaps have a greater feeling of
> power to create your own dialect than many Common Lispers. This may
> explain the curious sense you're getting that we're hypocritical. And
> we are. But there are worse hypocrisies.
>
> Anyway, that explanation sounds persuasive to me, though I don't know
> whether it's entirely true.
> http://www.zmag.org/audio/albnll.ram
>
>
> Tayssir

I think you made a really excellent analysis of this phenomenon! It
helped me to understand what you guys are feeling.

I guess I can understand how annoying it is for some upstart like me
(or Paul Graham, for that matter) to come in here and start telling you
guys you should do things my way.

It's just that I'm working on my own Lisp which is like Arc except it
has a special, awesome improvement I made. ;-) It's top secret so far.
But programming in it is soooo wicked! So I can't help but keep trying
to come up with new improvements on Common Lisp.

Tron3k
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <87psthyt8b.fsf@sidious.geddis.org>
"Tron3k" <······@gmail.com> wrote on 16 Jul 2005 21:2:
> I find the attitude that it is wrong to suggest improvements for Common
> Lisp short-sighted and pathetic.

You misunderstand the reaction you've seen, as well as the quality of your
own suggestions.

Plenty of CL improvements have been discussed here with great interest.
SPLIT-SEQUENCE was a recent one.  Gray streams.  Franz's first-class
environments.  Regular expressions.  FFI.

All the people in those discussions knew the historical context, knew why
certain issues had been decided before, and actually worked to design real
improvement.

You, on the other hand, appear to know very little of the huge amount of
language design work that has already gone into Common Lisp, and you express
your naive preferences as though they are the Word of God on programming
language design.  When in reality, the bulk of your ideas had already been
considered by smarter people than you long ago, and rejected for very good
reasons that you're not even aware of.

Try asking more questions, and asserting fewer statements about what is
"broken" in CL and needs to be "fixed".  Perhaps you think EQUALP ought to
be generic and extensible, so you could customize it for your own data
structures.  Instead of saying "EQUALP in CL is broken, and a better CL++
would make it generic and extensible", try asking "why isn't EQUALP extensible
in CL?  What were the designers thinking in making its definition fixed?"
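
(To be concrete: "generic and extensible" would mean a user-level protocol
roughly like the sketch below, where EQUIV and MONEY are invented names to
illustrate the shape of the idea, not a proposal.)

(defgeneric equiv (a b)
  (:documentation "Extensible structural equality.")
  (:method (a b) (equalp a b)))        ; default: fall back to EQUALP

(defclass money ()
  ((amount   :initarg :amount   :reader amount)
   (currency :initarg :currency :reader currency)))

(defmethod equiv ((a money) (b money))
  ;; Users specialize EQUIV for their own classes.
  (and (= (amount a) (amount b))
       (string= (currency a) (currency b))))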

You may be surprised about what you learn about programming language design.
Common Lisp is very, very, very well designed.  The few things that you might
get widespread agreement on being theoretically improvable in some new Lisp
dialect tend to be conscious choices made for backwards compatibility, an
argument which holds even stronger today.

There are a few topics out there which, in hindsight, the designers might have
done slightly differently now that they have 20 years of experience to examine.
But those topics are few and far between, and you haven't come close to finding
any of them.  Lisp-1 vs. Lisp-n is not one.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
They all laughed at Albert Einstein.  They all laughed at Columbus.
Unfortunately, they also all laughed at Bozo the Clown.
	-- William H. Jefferys
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121646125.456707.132250@z14g2000cwz.googlegroups.com>
Don Geddis wrote:
> You may be surprised about what you learn about programming language design.
> Common Lisp is very, very, very well designed.

I wonder about this. By unabashedly stealing from Paul Graham and
adding some of my new ideas, I've been able to create a better (even if
only slightly) variant of Lisp in the space of 2 days.

Note also: Common Lisp LOOP and FORMAT each have much better
replacements floating around the web.

If I personally consider my Lisp better than Common Lisp, then what do
I care what others say about it? I still win.

Tron3k
From: Arthur Lemmens
Subject: Re: Beyond CL?
Date: 
Message-ID: <opst3ll2zgk6vmsw@news.xs4all.nl>
Tron3k <······@gmail.com> wrote:

> I've been able to create a better (even if only slightly) variant
> of Lisp in the space of 2 days.

Haha.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k1anpFrgl93U1@individual.net>
Tron3k wrote:
> Don Geddis wrote:
> 
>>You may be surprised about what you learn about programming language design.
>>Common Lisp is very, very, very well designed.
> 
> I wonder about this. By unabashedly stealing from Paul Graham and
> adding some of my new ideas, I've been able to create a better (even if
> only slightly) variant of Lisp in the space of 2 days.

Did this include writing a number of considerably large programs to see 
how things fit together in the larger picture, or have you put this 
aside for the next two days?


Pascal

P.S.: Maybe I should listen to Kenny earlier next time... :}

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121695547.699775.176680@g47g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
> > Don Geddis wrote:
> >
> >>You may be surprised about what you learn about programming language design.
> >>Common Lisp is very, very, very well designed.
> >
> > I wonder about this. By unabashedly stealing from Paul Graham and
> > adding some of my new ideas, I've been able to create a better (even if
> > only slightly) variant of Lisp in the space of 2 days.
>
> Did this include writing a number of considerably large programs to see
> how things fit together in the larger picture, or have you put this
> aside for the next two days?
>

What I've done is translate quite a few of the major functions in my
MMORPG game to this new language, and I have been pleasantly surprised
by the neatness and readability of it.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k1vkhFscjqeU1@individual.net>
Tron3k wrote:
> Pascal Costanza wrote:
>> Tron3k wrote:
>>> Don Geddis wrote:
>>>
>>>> You may be surprised about what you learn about programming language design.
>>>> Common Lisp is very, very, very well designed.
>>> I wonder about this. By unabashedly stealing from Paul Graham and
>>> adding some of my new ideas, I've been able to create a better (even if
>>> only slightly) variant of Lisp in the space of 2 days.
>> Did this include writing a number of considerably large programs to see
>> how things fit together in the larger picture, or have you put this
>> aside for the next two days?
>>
> 
> What I've done is translate quite a few of the major functions in my
> MMORPG game to this new language, and I have been pleasantly surprised
> by the neatness and readability of it.

Does that mean, you already have a new-lisp interpreter?  in C, in 
Lisp?

And is that MMORPG public?

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121701549.640140.191580@f14g2000cwb.googlegroups.com>
Ulrich Hobelmann wrote:
> Tron3k wrote:
> > Pascal Costanza wrote:
> >> Tron3k wrote:
> >>> Don Geddis wrote:
> >>>
> >>>> You may be surprised about what you learn about programming language design.
> >>>> Common Lisp is very, very, very well designed.
> >>> I wonder about this. By unabashedly stealing from Paul Graham and
> >>> adding some of my new ideas, I've been able to create a better (even if
> >>> only slightly) variant of Lisp in the space of 2 days.
> >> Did this include writing a number of considerably large programs to see
> >> how things fit together in the larger picture, or have you put this
> >> aside for the next two days?
> >>
> >
> > What I've done is translate quite a few of the major functions in my
> > MMORPG game to this new language, and I have been pleasantly surprised
> > by the neatness and readability of it.
>
> Does that mean, you already have a new-lisp interpreter?  in C, in
> Lisp?

Well, what I said there was I just translated the functions into the
new syntax which exists in my head. But yes, right now I have written a
very simple READ and EVAL in Common Lisp.

> And is that MMORPG public?

Not yet! I think in a couple of months I will open it up to
comp.lang.lisp for you guys to play in. Maybe that will get me some
credibility, LOL.
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <87zmsitlk7.fsf@sidious.geddis.org>
> Don Geddis wrote:
>> You may be surprised about what you learn about programming language design.
>> Common Lisp is very, very, very well designed.

"Tron3k" <······@gmail.com> wrote on 17 Jul 2005 17:2:
> I wonder about this. By unabashedly stealing from Paul Graham and
> adding some of my new ideas, I've been able to create a better (even if
> only slightly) variant of Lisp in the space of 2 days.

Nobody believes you about this.  From everything you've told us so far,
everyone around here believes that your "new" Lisp is worse, not better,
than Common Lisp.

> Note also: Common Lisp LOOP and FORMAT each have much better
> replacements floating around the web.

You keep asserting these things like they are facts, that CL is full of
"broken" ideas, and the "better" replacements are obvious.

Try to be a little less arrogant.  Open your mind to the idea that it might
be controversial which version of LOOP or FORMAT is superior.

> If I personally consider my Lisp better than Common Lisp, then what do
> I care what others say about it? I still win.

But you've been asserting that your new language is better for _everyone_,
not just better for you.  That all our lives would be improved if only we
programmed in your new dialect instead of our old, outdated Common Lisp.
That you're inventing the Lisp from "20 years in the future", whose programmers
would "laugh" at the silly design of Common Lisp in 2005.

Clearly, all of this is false.

In fact, it now looks likely that you may never have written a large
application in Common Lisp before.  Because it turns out that almost every
experienced CL programmer first builds up a brand-new mini-language in CL,
and then programs their complex target application in that domain-specific
language.  For their application, their brand new mini-language is MUCH better
than generic Common Lisp.  But none would be so arrogant as to claim that
their mini-language is superior to CL _in_general_.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Human beings, who are almost unique in having the ability to learn from the
experience of others, are also remarkable for their apparent disinclination to
do so.  -- Douglas Adams, _Last Chance to See_
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121819420.861721.327630@g49g2000cwa.googlegroups.com>
Don Geddis wrote:
> > Don Geddis wrote:
> >> You may be surprised about what you learn about programming language design.
> >> Common Lisp is very, very, very well designed.
>
> "Tron3k" <······@gmail.com> wrote on 17 Jul 2005 17:2:
> > I wonder about this. By unabashedly stealing from Paul Graham and
> > adding some of my new ideas, I've been able to create a better (even if
> > only slightly) variant of Lisp in the space of 2 days.
>
> Nobody believes you about this.  From everything you've told us so far,
> everyone around here believes that your "new" Lisp is worse, not better,
> than Common Lisp.

Nobody knows what my new Lisp *is*. So why does "everyone" leap to the
opportunity of attacking a strawman? That is a sign of insecurity.

> > Note also: Common Lisp LOOP and FORMAT each have much better
> > replacements floating around the web.
>
> You keep asserting these things like they are facts, that CL is full of
> "broken" ideas, and the "better" replacements are obvious.

I have a better LOOP and FORMAT. My FORMAT is of a different form, but
more extensible than CL format. My LOOP is a completely new look at
iteration.

> Try to be a little less arrogant.  Open your mind to the idea that it might
> be controversial which version of LOOP or FORMAT is superior.

I guess Paul Graham is closed-minded too. But then I don't feel bad if
I'm in such good company.

> > If I personally consider my Lisp better than Common Lisp, then what do
> > I care what others say about it? I still win.
>
> But you've been asserting that your new language is better for _everyone_,
> not just better for you.  That all our lives would be improved if only we
> programmed in your new dialect instead of our old, outdated Common Lisp.
> That you're inventing the Lisp from "20 years in the future", whose programmers
> would "laugh" at the silly design of Common Lisp in 2005.
>
> Clearly, all of this is false.

You know what? Maybe it *is* false. But MAYBE, through playing around
with new Lisp ideas, by not allowing my mind to think for one instant
that Lisp == Common Lisp, I will come up with the Lisp of the future.
It's a long shot. But it's more likely I'll do it than someone with the
pathetic attitude you have. Keep repeating to yourself that the
designers of Common Lisp were far smarter than you. That's what I keep
hearing around here. Yes, they were smart, but they were also designing
by committee, and hence the crazier-sounding ideas had no chance.

> In fact, it now looks likely that you may never have written a large
> application in Common Lisp before.  Because it turns out that almost every
> experienced CL programmer first builds up a brand-new mini-language in CL,
> and then programs their complex target application in that domain-specific
> language.  For their application, their brand new mini-language is MUCH better
> than generic Common Lisp.  But none would be so arrogant as to claim that
> their mini-language is superior to CL _in_general_.
>

Well, for my MMORPG I have a mini-language for interacting with OpenGL,
consisting of many macros. Who said anything about mini-languages
anyway?
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3ackhz7d1.fsf@rigel.goldenthreadtech.com>
"Tron3k" <······@gmail.com> writes:

> Don Geddis wrote:
> > > Don Geddis wrote:
> > >> You may be surprised about what you learn about programming language
> > >> design. Common Lisp is very, very, very well designed.
> >
> > "Tron3k" <······@gmail.com> wrote on 17 Jul 2005 17:2:
> > > I wonder about this. By unabashedly stealing from Paul Graham and
> > > adding some of my new ideas, I've been able to create a better (even if
> > > only slightly) variant of Lisp in the space of 2 days.
> >
> > Nobody believes you about this.  From everything you've told us so far,
> > everyone around here believes that your "new" Lisp is worse, not better,
> > than Common Lisp.
> 
> Nobody knows what my new Lisp *is*. So why does "everyone" leap to the
> opportunity of attacking a strawman? That is a sign of insecurity.

Or maybe they simply like replying and playing along with trolls.  Could be.


> > Clearly, all of this is false.
> 
> You know what? Maybe it *is* false. But MAYBE, through playing around

Maybe.  Here's a suggestion.  Just shut up and do it.  When you have
the thing done (at least 80% or so complete) then present it.  Maybe
at a conference.  Don't waste your time spewing the sort of baloney
you've been spewing here in c.l.l.  Unless, of course, the spewing of
the baloney is really what you enjoy and not the producing of anything
new and interesting.  If that's the case, I suppose you should just
keep on spewing.  If not, just shut up and do it.  People will be much
more open, respectful, interested and impressed.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121909478.584108.295910@o13g2000cwo.googlegroups.com>
jayessay wrote:
> Maybe.  Here's a suggestion.  Just shut up and do it.  When you have
> the thing done (at least 80% or so complete) then present it.  Maybe
> at a conference.

Well I've got the reader going! The evaluator is for tomorrow.

> Don't waste your time spewing the sort of baloney
> you've been spewing here in c.l.l.

I am here to try to work out my ideas. It is, believe it or not,
actually helping. Also, the massive resistance to me here is an
inspiration for me to keep on working.

I do agree that I shouldn't waste too much time here. I probably won't
post as much as I did in my initial burst in the coming weeks.

Also, your rudeness is uncalled for. Let's keep comp.lang.lisp out of
the USENET gutter, shall we?
From: Jamie Border
Subject: [OT] CFMPC (was: Beyond CL?)
Date: 
Message-ID: <dbnqj3$nu4$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
Tron3k> I do agree that I shouldn't waste too much time here. I probably won't
Tron3k> post as much as I did in my initial burst in the coming weeks.

Tron3k, you are cordially invited to join the Complete Fucking Morons of 
Programming Club, of which I am now Vice-Chairman :-)

The published work that has prompted this invitation is intentionally 
misquoted below:

YOU: CL is crap.  I can do better.
CLL: Prove it.
YOU: ...

This leads me to believe that a great future awaits you within CFMPC.

> Also, your rudeness is uncalled for. Let's keep comp.lang.lisp out of
> the USENET gutter, shall we?

I don't think it's in the gutter.  I think it's more likely that the 
guys/gals here, who for the most part are seasoned professionals, ceased to 
be easily offended some time ago.

Afterthought:  I hope you realise that this "invitation" is not sincere...

I want CFMPC all for myself ;-)

Jamie
--

"Tron3k" <······@gmail.com> wrote:
>
>
>
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3sly8xn23.fsf@rigel.goldenthreadtech.com>
"Tron3k" <······@gmail.com> writes:

> Also, your rudeness is uncalled for. Let's keep comp.lang.lisp out of
> the USENET gutter, shall we?

What rudeness?  That is sound and sincere advice.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <871x5tt3k4.fsf@sidious.geddis.org>
"Tron3k" <······@gmail.com> wrote on 19 Jul 2005 17:3:
> Nobody knows what my new Lisp *is*. So why does "everyone" leap to the
> opportunity of attacking a strawman?

We've attacked only what you've revealed of it so far.  Yes, you may have
some amazing secret magic.  But given what you've shown to date, the odds are
against it.

> I guess Paul Graham is closed-minded too. But then I don't feel bad if
> I'm in such good company.

Leaving aside for the moment the question of whether Paul Graham (who you
seem to quote at every opportunity) is the lone genius whose orders we ought
to follow without question, your similarity to him in any way reminds me of
this:
        But the fact that some geniuses were laughed at does not imply that
        all who are laughed at are geniuses.  They laughed at Columbus, they
        laughed at Fulton, they laughed at the Wright brothers.  But they
        also laughed at Bozo the Clown.  [Carl Sagan]

> But MAYBE, through playing around with new Lisp ideas, by not allowing my
> mind to think for one instant that Lisp == Common Lisp, I will come up with
> the Lisp of the future.

Unlikely, given that you don't seem to understand what went in to Common Lisp.
Instead, you appear to be rediscovering things that the programming language
community learned decades ago.  Good for you, but not very interesting for the
state of the art.

> Keep repeating to yourself that the designers of Common Lisp were far
> smarter than you.

No, no, you misunderstood.  The CL designers were far smarter than _you_.
Not far smarter than me.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Republicans for Voldemort!  -- goats.com
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uackgb67g.fsf@nhplace.com>
Don Geddis <···@geddis.org> writes:

> > Keep repeating to yourself that the designers of Common Lisp were far
> > smarter than you.
> 
> No, no, you misunderstood.  The CL designers were far smarter than _you_.
> Not far smarter than me.

Sounds like a good debate topic for closing day at some lisp conference...
"Proponents of various lisp dialects will debate about which dialect
had the smartest designer(s) ... and whether it helped."
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <mzoht45g.fsf@comcast.net>
"Tron3k" <······@gmail.com> writes:

> Nobody knows what my new Lisp *is*. So why does "everyone" leap to the
> opportunity of attacking a strawman? That is a sign of insecurity.

The fact of the matter is that we're scared.

Yes, scared.

Scared that you may have discovered the `Holy Grail' of programming
languages that was right under our noses all along.

> I have a better LOOP and FORMAT. My FORMAT is of a different form, but
> more extensible than CL format. My LOOP is a completely new look at
> iteration.

We're stuck in a rut.  We're entrenched in LOOP, DO, ITERATE, SERIES,
GENERATORS, coroutines, tail-recursion, non-deterministic programming,
and God knows what else.  I can't wait for your new look.

> You know what? Maybe it *is* false. But MAYBE, through playing around
> with new Lisp ideas, by not allowing my mind to think for one instant
> that Lisp == Common Lisp, I will come up with the Lisp of the future.

Maybe, just MAYBE, you've hit upon the *one* (or several!) idea(s)
that has been holding us back.  The ideal LISP that has eluded the
likes of Abelson, Adams, Allen, Baker, Bawden, Blair, Bobrow, Brooks,
Chailloux, Clinger, Danvy, Deutsch, Devin, Diffie, Eastlake, Evans,
Fahlman, Fateman, Felliesen, Fenichel, Gabriel, Greenberg, Greenblatt,
Greussay, Guzman, Hart, Hartley, Hawkinson, Haynes, Hearn, Hillis,
Holloway, Ito, Kelsey, Knight, Krantz, Levin, Martin, Masinter,
McCarthy, Minsky, Moon, Moore, Moses, Murphy, Nelson, Pitman, Quam,
Queinnec, Rees, Shivers, Steele, Sussman, Teitelman, Wand, Weinreb,
White, Winograd, Winston, Yochelson, Zawinski and others too numerous
to mention.

> It's a long shot. But it's more likely I'll do it than someone with the
> pathetic attitude you have. Keep repeating to yourself that the
> designers of Common Lisp were far smarter than you.

Will do!

> That's what I keep hearing around here. Yes, they were smart, but
> they were also designing by committee, and hence the
> crazier-sounding ideas had no chance.

You haven't shared an office with Greenblatt, Greenspun, or Stallman,
have you?

-- 
~jrm
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121909126.022746.71880@o13g2000cwo.googlegroups.com>
Joe Marshall wrote:
> Maybe, just MAYBE, you've hit upon the *one* (or several!) idea(s)
> that has been holding us back.  The ideal LISP that has eluded the
> likes of Abelson, Adams, Allen, Baker, Bawden, Blair, Bobrow, Brooks,
> Chailloux, Clinger, Danvy, Deutsch, Devin, Diffie, Eastlake, Evans,
> Fahlman, Fateman, Felliesen, Fenichel, Gabriel, Greenberg, Greenblatt,
> Greussay, Guzman, Hart, Hartley, Hawkinson, Haynes, Hearn, Hillis,
> Holloway, Ito, Kelsey, Knight, Krantz, Levin, Martin, Masinter,
> McCarthy, Minsky, Moon, Moore, Moses, Murphy, Nelson, Pitman, Quam,
> Queinnec, Rees, Shivers, Steele, Sussman, Teitelman, Wand, Weinreb,
> White, Winograd, Winston, Yochelson, Zawinski and others too numerous
> to mention.

Now think about all the people who came before Einstein. ;-)
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <1x5tszur.fsf@comcast.net>
"Tron3k" <······@gmail.com> writes:

> Joe Marshall wrote:
>> Maybe, just MAYBE, you've hit upon the *one* (or several!) idea(s)
>> that has been holding us back.  The ideal LISP that has eluded the
>> likes of Abelson, Adams, Allen, Baker, Bawden, Blair, Bobrow, Brooks,
>> Chailloux, Clinger, Danvy, Deutsch, Devin, Diffie, Eastlake, Evans,
>> Fahlman, Fateman, Felliesen, Fenichel, Gabriel, Greenberg, Greenblatt,
>> Greussay, Guzman, Hart, Hartley, Hawkinson, Haynes, Hearn, Hillis,
>> Holloway, Ito, Kelsey, Knight, Krantz, Levin, Martin, Masinter,
>> McCarthy, Minsky, Moon, Moore, Moses, Murphy, Nelson, Pitman, Quam,
>> Queinnec, Rees, Shivers, Steele, Sussman, Teitelman, Wand, Weinreb,
>> White, Winograd, Winston, Yochelson, Zawinski and others too numerous
>> to mention.
>
> Now think about all the people who came before Einstein. ;-)


  I said, ``You know they refused Jesus, too.''
  He said, ``You're not Him.''
                          --- Bob Dylan's 115th Dream


-- 
~jrm
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86r7dt2f5a.fsf@drjekyll.mkbuelow.net>
Joe Marshall <·············@comcast.net> writes:

>We're stuck in a rut.  We're entrenched in ...
>... non-deterministic programming,

Care to elaborate on that?

mkb.
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <ackht2y5.fsf@comcast.net>
Matthias Buelow <···@incubus.de> writes:

> Joe Marshall <·············@comcast.net> writes:
>
>>We're stuck in a rut.  We're entrenched in ...
>>... non-deterministic programming,
>
> Care to elaborate on that?

I had the Screamer package in mind, but there are other examples.

Some problems that are implemented as iteration are actually iterative
searches through a space of possible solutions.  Instead of expressly
iterating, you write the program as if the computer could somehow
*pick* the solution out of a set of possible alternatives.  (In
actuality, the computer tries them one at a time and when it fails,
goes on to the next.)  

To give an example, consider the `Send more money' problem: find an
assignment of the digits 0-9 to the letters D, E, M, N, O, R, S, Y,
such that
   SEND + MORE = MONEY

Chris Double's solution is this:

(let* ((s (an-integer-betweenv 1 9))
       (e (an-integer-betweenv 0 9))
       (n (an-integer-betweenv 0 9))
       (d (an-integer-betweenv 0 9))
       (m (an-integer-betweenv 1 9))
       (o (an-integer-betweenv 0 9))
       (r (an-integer-betweenv 0 9))
       (y (an-integer-betweenv 0 9)))
   (assert! (/=v s e n d m o r y))
   (assert! 
     (=v                 
         (+v (*v 1000 s) (*v 100 e) (*v 10 n) d
             (*v 1000 m) (*v 100 o) (*v 10 r) e)
         (+v (*v 10000 m) (*v 1000 o) (*v 100 n) (*v 10 e) y)))
    (one-value 
      (solution (list s e n d m o r y) (static-ordering #'linear-force))))

The basic idea is that we set up non-deterministic values --- the
binding of S to the non-deterministic value (an-integer-betweenv 1 9)
--- and some constraints --- (/=v s e n d m o r y) --- and ask for a
solution.

The iteration is implicit in the way the search is conducted.

For some problems, this is an incredible win, for others, it is *very*
slow.  (Sigh)

-- 
~jrm
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121910442.474922.215430@g43g2000cwa.googlegroups.com>
Joe Marshall wrote:
> Matthias Buelow <···@incubus.de> writes:
>
> > Joe Marshall <·············@comcast.net> writes:
> >
> >>We're stuck in a rut.  We're entrenched in ...
> >>... non-deterministic programming,
> >
> > Care to elaborate on that?
>
> I had the Screamer package in mind, but there are other examples.
>
> Some problems that are implemented as iteration are actually iterative
> searches through a space of possible solutions.  Instead of expressly
> iterating, you write the program as if the computer could somehow
> *pick* the solution out of a set of possible alternatives.  (In
> actuality, the computer tries them one at a time and when it fails,
> goes on to the next.)
>
> To give an example, consider the `Send more money' problem: find an
> assignment of the digits 0-9 to the letters D, E, M, N, O, R, S, Y,
> such that
>    SEND + MORE = MONEY
>
> Chris Double's solution is this:
>
> (let* ((s (an-integer-betweenv 1 9))
>        (e (an-integer-betweenv 0 9))
>        (n (an-integer-betweenv 0 9))
>        (d (an-integer-betweenv 0 9))
>        (m (an-integer-betweenv 1 9))
>        (o (an-integer-betweenv 0 9))
>        (r (an-integer-betweenv 0 9))
>        (y (an-integer-betweenv 0 9)))
>    (assert! (/=v s e n d m o r y))
>    (assert!
>      (=v
>          (+v (*v 1000 s) (*v 100 e) (*v 10 n) d
>              (*v 1000 m) (*v 100 o) (*v 10 r) e)
>          (+v (*v 10000 m) (*v 1000 o) (*v 100 n) (*v 10 e) y)))
>     (one-value
>       (solution (list s e n d m o r y) (static-ordering #'linear-force))))
>
> The basic idea is that we set up non-deterministic values --- the
> binding of S to the non-deterministic value (an-integer-betweenv 1 9)
> --- and some constraints --- (/=v s e n d m o r y) --- and ask for a
> solution.
>
> The iteration is implicit in the way the search is conducted.
>
> For some problems, this is an incredible win, for others, it is *very*
> slow.  (Sigh)

This stuff is *really* cool! Chapter 22 of On Lisp completely blew me
away the first time I read it. (I tried to explain the idea to one of
my Python friends, but he didn't get it. "Why would you ever want to do
that?" he said. *sigh*)
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbk13l$2sb$1@ulric.tng.de>
Tron3k schrieb:

> I wonder about this. By unabashedly stealing from Paul Graham and
> adding some of my new ideas,

 > I've been able to create a better (even if
> only slightly) variant of Lisp in the space of 2 days.

I am impressed. Not only did you master CL, no, you improved it within a 
few hours (less than 48). You are probably the first to make a 
programming language better by renaming the functions of an existing 
language. I have a suggestion: write, in your "new" language, a program 
that takes source code in your language and translates it into CL. That 
way you could use mature compilers for your language.

See, making something better than Common Lisp is not the easiest of all 
tasks. Even Paul Graham had to discover that when writing Arc, with the 
consequence that we did not hear anything new about Arc since some.. 
months? Or even years?


> If I personally consider my Lisp better than Common Lisp, then what do
> I care what others say about it? I still win.

Yes, you are the winner. It is your right to consider whatever you want to be better.


André
-- 
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121815434.882307.59050@z14g2000cwz.googlegroups.com>
André Thieme wrote:
> Tron3k schrieb:
>
> > I wonder about this. By unabashedly stealing from Paul Graham and
> > adding some of my new ideas, I've been able to create a better (even if
> > only slightly) variant of Lisp in the space of 2 days.
>
> I am impressed. Not only did you master CL, no, you improved it within a
> few hours (less than 48). You are probably the first making a
> programming language better by renaming functions of an existing
> language.

Haha! If only you knew what I'm really doing.

> I have a suggestion: write in your "new" language a program
> that takes source codes of your language and translates them into CL. So
> you could use mature compilers for your language.

That is what I'll probably do. No reason to write my own compiler.

> See, making something better than Common Lisp is not the easiest of all
> tasks. Even Paul Graham had to discover that when writing Arc, with the
> consequence that we did not hear anything new about Arc since some..
> months? Or even years?

That's what makes me very excited. He must be building Lisp again from
the ground up, including types and everything else. Everything from the
smallest possible set of principles: That takes a while.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k6h25Fsrq5hU2@individual.net>
Tron3k wrote:
>> I am impressed. Not only did you master CL, no, you improved it within a
>> few hours (less than 48). You are probably the first making a
>> programming language better by renaming functions of an existing
>> language.
> 
> Haha! If only you knew what I'm really doing.

Renaming the parentheses, too?

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jvn6dFrsgcfU1@individual.net>
Tron3k wrote:
> I find the attitude that it is wrong to suggest improvements for Common
> Lisp short-sighted and pathetic.

Check out the HyperSpec. There are a number of issues listed that were 
discussed during the standardization of Common Lisp - see 
http://www.lispworks.com/documentation/HyperSpec/Front/X3J13Iss.htm

Apparently, each issue had to follow a certain structure, containing 
details about things like: problem description, rationale, current 
practice, cost to implementors, cost to users, cost of non-adoption.

A _good_ suggestion for improvement should cover such aspects in order 
to have some convincing power. Stuff like "switch from Lisp-n to Lisp-1" 
would have to objectively mention all the drawbacks of such a change. 
See for example the excellent discussion at 
http://www.nhplace.com/kent/Papers/Technical-Issues.html

Another example is the switch from converting identifiers to all upper 
case by default (as in ANSI Common Lisp) vs. keeping case (as in Allegro 
Common Lisp "modern" mode and as proposed for R6RS Scheme). There was 
recently a discussion at comp.lang.scheme about this issue, and in a 
chat at ILC'05, Steve Haflich explained Franz's reasons for the modern 
mode. Essentially, keeping case avoids all kinds of problems with 
characters that don't have unambiguous mappings between lower case and 
upper case, and since the whole world has decided to adopt Unicode, a 
switch to case sensitivity would solve a whole bunch of problems at 
once. I think that could be an actual improvement, but the more 
important point here is that the reasons are stronger than just "I would 
prefer it like that".

So essentially, if your arguments for or against a change are not 
convincing enough, it all just boils down to a matter of taste. And 
then, under those circumstances, it is really important to realize that 
there is no need at all to force others to agree to your personal 
aesthetical preferences. Contrary to other languages, Lisp is flexible 
enough such that you can just build your own world in which things are 
just exactly like you want them to be. So why bother trying to inflict 
costs upon others that they don't want to bear?

For example, some time ago I have implemented a with-funcalls macro that 
allows you to embed a Lisp-1 programming style within Common Lisp. See 
http://groups-beta.google.com/group/comp.lang.lisp/msg/8fce6ead716e6501

So if you prefer that programming style, there's no one stopping you from 
just using it.
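
Roughly, the idea is something like this (a simplified sketch, not the exact
code at that URL; the ADD example is made up):

;; Each listed name becomes a local macro that rewrites (name . args)
;; into (funcall name . args), so the name's value is called, Lisp-1 style.
(defmacro with-funcalls ((&rest names) &body body)
  `(macrolet ,(mapcar (lambda (name)
                        `(,name (&rest args)
                           (list* 'funcall ',name args)))
                      names)
     ,@body))

;; Usage:
(with-funcalls (add)
  (let ((add #'+))
    (add 1 2 3)))    ; => 6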

It seems to me that people have just been trained too much in not being 
able to change the language and therefore expecting features to be added 
by vendors (because they're the only ones that can actually do it). 
These restrictions simply don't exist in Lisp. So go ahead, just do what 
you want! ;)


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121626959.931226.224530@g43g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
> > I find the attitude that it is wrong to suggest improvements for Common
> > Lisp short-sighted and pathetic.
>
> Check out the HyperSpec. There are a number of issues listed that were
> discussed during the standardization of Common Lisp - see
> http://www.lispworks.com/documentation/HyperSpec/Front/X3J13Iss.htm
>
> Apparently, each issue had to follow a certain structure, containing
> details about things like: problem description, rationale, current
> practice, cost to implementors, cost to users, cost of non-adoption.
>
> A _good_ suggestion for improvement should cover such aspects in order
> to have some convincing power. Stuff like "switch from Lisp-n to Lisp-1"
> would have to objectively mention all the drawbacks of such a change.
> See for example the excellent discussion at
> http://www.nhplace.com/kent/Papers/Technical-Issues.html

Ah, I see your point here. In other words, my ideas are pretty
well-known and obvious, but they don't consider the practical problems.
I'll grant you that. Thankfully I am free of worrying about backwards
compatibility since I'm making my own Lisp. :-)

> Another example is the switch from converting identifiers to all upper
> case by default (as in ANSI Common Lisp) vs. keeping case (as in Allegro
> Common Lisp "modern" mode and as proposed for R6RS Scheme). There was
> recently a discussion at comp.lang.scheme about this issue, and in a
> chat at ILC'05, Steve Haflich explained Franz's reasons for the modern
> mode. Essentially, keeping case avoids all kinds of problems with
> characters that don't have unambiguous mappings between lower case and
> upper case, and since the whole world has decided to adopt Unicode, a
> switch to case sensitivity would solve a whole bunch of problems at
> once. I think that could be an actual improvement, but the more
> important point here is that the reasons are stronger than just "I would
> prefer it like that".

I decided a while ago my Lisp will be case-sensitive. It will be
created with Win32 game development in mind, and I don't want people
calling Win32 or OpenGL functions in all small case, that's just hard
to read.

> So essentially, if your arguments for or against a change are not
> convincing enough, it all just boils down to a matter of taste. And
> then, under those circumstances, it is really important to realize that
> there is no need at all to force others to agree to your personal
> aesthetical preferences. Contrary to other languages, Lisp is flexible
> enough such that you can just build your own world in which things are
> just exactly like you want them to be. So why bother trying to inflict
> costs upon others that they don't want to bear?

Let's say I've given up on trying to change the Common Lisp standard,
heh. So I won't be forcing anything on anybody.

I'm wondering about 'taste' ... are some 'tastes' universal? For
example, I think everyone could agree on which one is nicer-looking:

(defun dec-list (lst)
   (mapcar #'1- lst))

(def (dec-list lst)
    (map 1- lst))

But then again, not being able to use 'list' as a variable name could
certainly leave a bad 'taste' in some people's mouths. (I guess I'm
different in that I purposely try not to give my variable names the
same names as functions even in Common Lisp, because it gets a bit
confusing.)

> For example, some time ago I have implemented a with-funcalls macro that
> allows you to embed a Lisp-1 programming style within Common Lisp. See
> http://groups-beta.google.com/group/comp.lang.lisp/msg/8fce6ead716e6501
>
> So if you prefer that programming style, there's no one stopping you from
> just using it.

That's very cool! I think it's a given that in general, just about
anything is possible in Common Lisp. In the absolute limit, you could
write your own reader and your own code-walker to transform any
language into Common Lisp.

> It seems to me that people have just been trained too much in not being
> able to change the language and therefore expecting features to be added
> by vendors (because they're the only ones that can actually do it).
> These restrictions simply don't exist in Lisp. So go ahead, just do what
> you want! ;)

Heh, one of the features in my new language is complicated to implement
in Common Lisp. I might need to write a whole new reader in order to do
it. Yes, I can do whatever I want, but it's kind of like saying you can
do whatever you want in C if you write a compiler that takes your new
language to C. ;)
From: M Jared Finder
Subject: Re: Beyond CL?
Date: 
Message-ID: <8KCdncFTzvfsXEffRVn-3A@speakeasy.net>
Tron3k wrote:
> Pascal Costanza wrote:
> 
> Let's say I've given up on trying to change the Common Lisp standard,
> heh. So I won't be forcing anything on anybody.
> 
> I'm wondering about 'taste' ... are some 'tastes' universal? For
> example, I think everyone could agree on which one is nicer-looking:
> 
> (defun dec-list (lst)
>    (mapcar #'1- lst))
> 
> (def (dec-list lst)
>     (map 1- lst))
> 
> But then again, not being able to use 'list' as a variable name could
> certainly leave a bad 'taste' in some people's mouths. (I guess I'm
> different in that I purposely try not to give my variable names the
> same names as functions even in Common Lisp, because it gets a bit
> confusing.)

I prefer the Common Lisp form to the other form because it also applies 
to defining non-value, non-function objects such as structures, classes, 
conditions, loop clauses, generic functions.  It also allows you to 
support different types of variable definitions like with defvar vs 
defparameter vs defconstant.

   -- MJF
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121641382.459844.320280@g14g2000cwa.googlegroups.com>
M Jared Finder wrote:
> Tron3k wrote:
> > Pascal Costanza wrote:
> >
> > Let's say I've given up on trying to change the Common Lisp standard,
> > heh. So I won't be forcing anything on anybody.
> >
> > I'm wondering about 'taste' ... are some 'tastes' universal? For
> > example, I think everyone could agree on which one is nicer-looking:
> >
> > (defun dec-list (lst)
> >    (mapcar #'1- lst))
> >
> > (def (dec-list lst)
> >     (map 1- lst))
> >
> > But then again, not being able to use 'list' as a variable name could
> > certainly leave a bad 'taste' in some people's mouths. (I guess I'm
> > different in that I purposely try not to give my variable names the
> > same names as functions even in Common Lisp, because it gets a bit
> > confusing.)
>
> I prefer the Common Lisp form to the other form because it also applies
> to defining non-value, non-function objects such as structures, classes,
> conditions, loop clauses, generic functions.  It also allows you to
> support different types of variable definitions like with defvar vs
> defparameter vs defconstant.
>
>    -- MJF

Hmm, I don't understand why the DEF form couldn't just apply to
defining everything. This is a Lisp-1 remember, not a Lisp-n. If you
made a class called 'bagel', you could just type bagel at the command
line and get something like:
#<STANDARD CLASS bagel>

Ok, defvar vs. defparameter vs. defconstant. I think defconstant would
be kind of an "optimization" sort of thing; you make some sort of
DECLAIM that this variable will never change and the compiler will
inline it. Interestingly, you could apply this to optimizing functions
as well: when you're shipping your program, you tell the compiler that
all function definitions are constant, and nobody's going to change
them. Then it can compile it as a direct jump or inline it.

defvar vs defparameter? I guess my DEF is really equivalent to
assignment. If you want something like DEFVAR that doesn't modify it if
it is already bound, you could implement it fairly easily.
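
In plain CL that's roughly this much work (DEF-ONCE is just a made-up name
here, and unlike DEFVAR this sketch only sets a global value, it does not
proclaim the name special):

(defmacro def-once (name value)
  ;; Assign only when the name has no global value yet, mimicking
  ;; DEFVAR's "don't clobber an existing binding" behaviour.
  `(unless (boundp ',name)
     (setf (symbol-value ',name) ,value)))

;; (def-once *cache* (make-hash-table))  ; sets it the first time
;; (def-once *cache* nil)                ; leaves the existing table alone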
From: M Jared Finder
Subject: Re: Beyond CL?
Date: 
Message-ID: <tdGdnfVD5IsObkffRVn-vA@speakeasy.net>
Tron3k wrote:
> M Jared Finder wrote:
> 
>>Tron3k wrote:
>>
>>>Pascal Costanza wrote:
>>>
>>>Let's say I've given up on trying to change the Common Lisp standard,
>>>heh. So I won't be forcing anything on anybody.
>>>
>>>I'm wondering about 'taste' ... are some 'tastes' universal? For
>>>example, I think everyone could agree on which one is nicer-looking:
>>>
>>>(defun dec-list (lst)
>>>   (mapcar #'1- lst))
>>>
>>>(def (dec-list lst)
>>>    (map 1- lst))
>>>
>>>But then again, not being able to use 'list' as a variable name could
>>>certainly leave a bad 'taste' in some people's mouths. (I guess I'm
>>>different in that I purposely try not to give my variable names the
>>>same names as functions even in Common Lisp, because it gets a bit
>>>confusing.)
>>
>>I prefer the Common Lisp form to the other form because it also applies
>>to defining non-value, non-function objects such as structures, classes,
>>conditions, loop clauses, generic functions.  It also allows you to
>>support different types of variable definitions like with defvar vs
>>defparameter vs defconstant.
> 
> Hmm, I don't understand why the DEF form couldn't just apply to
> defining everything. This is a Lisp-1 remember, not a Lisp-n. If you
> made a class called 'bagel', you could just type bagel at the command
> line and get something like:
> #<STANDARD CLASS bagel>

You kinda missed my point.  In CL, you define a class by evaluating the 
defclass form (defclass bagel (bread hole-mixin) ...)

What form would you evaluate to define a class in your lisp?  I assume 
it will use def, because to create a special shortcut only for functions 
seems pretty useless.  But how are you exposing this functionality to 
the programmer, so they can add their own types of def clauses?  I 
expect that eventually you'll either have a whole bunch of weird syntax 
for defining different variables vs defining classes vs defining 
functions, or you'll end up with something like:

(def class bagel (bread hole-mixin) ...)
(def package my-package ...)
(def loop-clause (FOR var IN-REVERSE seq) ...)

and you'd have some define-def-form take the place of defmacro, but only 
for def forms.

Using a special syntax will end up ugly, and making such a specialized 
defmacro seems silly.
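
Concretely, such a define-def-form might look something like this sketch
(all names here are made up for illustration; file compilation would also
want EVAL-WHEN around the registrations):

;; A registry mapping a DEF "kind" to an expander function.
(defvar *def-expanders* (make-hash-table :test #'eq))

(defmacro define-def-form (kind lambda-list &body body)
  ;; Register an expander that builds the real defining form.
  `(setf (gethash ',kind *def-expanders*)
         (lambda ,lambda-list ,@body)))

(defmacro def (kind &rest args)
  (let ((expander (gethash kind *def-expanders*)))
    (unless expander
      (error "No DEF form named ~S" kind))
    (apply expander args)))

;; Two sample kinds:
(define-def-form class (name supers &rest slots)
  `(defclass ,name ,supers ,@slots))

(define-def-form var (name value)
  `(defvar ,name ,value))

;; (def class bagel (bread hole-mixin) ((holes :initform 1)))
;; (def var *bagel-count* 0)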

> Ok, defvar vs. defparameter vs. defconstant. I think defconstant would
> be kind of an "optimization" sort of thing; you make some sort of
> DECLAIM that this variable will never change and the compiler will
> inline it. Interestingly, you could apply this to optimizing functions
> as well: when you're shipping your program, you tell the compiler that
> all function definitions are constant, and nobody's going to change
> them. Then it can compile it as a direct jump or inline it.

I like this idea.  Telling the compiler that you can inline a function 
does seem very similar to telling the compiler that a value will not 
change.  They both can have a significant speed increase, with the 
disadvantage that if you do end up wanting to change the value/function, 
you will most likely have to recompile the world.

   -- MJF
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121648236.642852.7180@g44g2000cwa.googlegroups.com>
M Jared Finder wrote:
> Tron3k wrote:
> > M Jared Finder wrote:
> >
> >>Tron3k wrote:
> >>
> >>>Pascal Costanza wrote:
> >>>
> >>>Let's say I've given up on trying to change the Common Lisp standard,
> >>>heh. So I won't be forcing anything on anybody.
> >>>
> >>>I'm wondering about 'taste' ... are some 'tastes' universal? For
> >>>example, I think everyone could agree on which one is nicer-looking:
> >>>
> >>>(defun dec-list (lst)
> >>>   (mapcar #'1- lst))
> >>>
> >>>(def (dec-list lst)
> >>>    (map 1- lst))
> >>>
> >>>But then again, not being able to use 'list' as a variable name could
> >>>certainly leave a bad 'taste' in some people's mouths. (I guess I'm
> >>>different in that I purposely try not to give my variable names the
> >>>same names as functions even in Common Lisp, because it gets a bit
> >>>confusing.)
> >>
> >>I prefer the Common Lisp form to the other form because it also applies
> >>to defining non-value, non-function objects such as structures, classes,
> >>conditions, loop clauses, generic functions.  It also allows you to
> >>support different types of variable definitions like with defvar vs
> >>defparameter vs defconstant.
> >
> > Hmm, I don't understand why the DEF form couldn't just apply to
> > defining everything. This is a Lisp-1 remember, not a Lisp-n. If you
> > made a class called 'bagel', you could just type bagel at the command
> > line and get something like:
> > #<STANDARD CLASS bagel>
>
> You kinda missed my point.  In CL, you define a class by evaluating the
> defclass form (defclass bagel (bread hole-mixin) ...))
>
> What form would you evaluate to define a class in your lisp?  I assume
> it will use def, because to create a special shortcut only for functions
> seems pretty useless.  But how are you exposing this functionality to
> the programmer, so they can add their own types of def clauses?  I
> expect that eventually you'll either have a whole bunch of weird syntax
> for defining different variables vs defining classes vs defining
> functions, or you'll end up with something like:
>
> (def class bagel (bread hole-mixin) ...)
> (def package my-package ...)
> (def loop-clause (FOR var IN-REVERSE seq) ...)
>
> and you'd have some define-def-form take the place of defmacro, but only
> for def forms.
>
> Using a special syntax will end up ugly, and making such a specialized
> defmacro seems silly.

Hmm, I don't think I'm going to do it that way. (I like your bagel
'hole-mixin'. Hehe.)

I might do it like this:
(def bagel
    (class (bread hole-mixin)
         ...))

With regards to defining a package, I'll just auto-create a package
when it is first referred to.

Things like defining a loop clause or whatever can certainly have their
own def-loop-clause forms or whatever? What's wrong with that? Def is
something of a special case of defining by assignment.

> > Ok, defvar vs. defparameter vs. defconstant. I think defconstant would
> > be kind of an "optimization" sort of thing; you make some sort of
> > DECLAIM that this variable will never change and the compiler will
> > inline it. Interestingly, you could apply this to optimizing functions
> > as well: when you're shipping your program, you tell the compiler that
> > all function definitions are constant, and nobody's going to change
> > them. Then it can compile it as a direct jump or inline it.
>
> I like this idea.

Thanks!

> Telling the compiler that you can inline a function
> does seem very similar to telling the compiler that a value will not
> change.  They both can have a significant speed increase, with the
> disadvantage that if you do end up wanting to change the value/function,
> you will most likely have to recompile the world.

Right, or maybe I'll just assume the user means it when he says he
won't modify something. ;)
But yes the compiler might auto-inline something. Then we have to
recompile the world when the user changes it.
From: M Jared Finder
Subject: Re: Beyond CL?
Date: 
Message-ID: <AYudnTVshYWFk0bfRVn-tg@speakeasy.net>
Tron3k wrote:
> M Jared Finder wrote:
>>Tron3k wrote:
>>>
>>>Hmm, I don't understand why the DEF form couldn't just apply to
>>>defining everything. This is a Lisp-1 remember, not a Lisp-n. If you
>>>made a class called 'bagel', you could just type bagel at the command
>>>line and get something like:
>>>#<STANDARD CLASS bagel>
>>
>>You kinda missed my point.  In CL, you define a class by evaluating the
>>defclass form (defclass bagel (bread hole-mixin) ...))
>>
>>What form would you evaluate to define a class in your lisp?  I assume
>>it will use def, because to create a special shortcut only for functions
>>seems pretty useless.  But how are you exposing this functionality to
>>the programmer, so they can add their own types of def clauses?  I
>>expect that eventually you'll either have a whole bunch of weird syntax
>>for defining different variables vs defining classes vs defining
>>functions, or you'll end up with something like:
>>
>>(def class bagel (bread hole-mixin) ...)
>>(def package my-package ...)
>>(def loop-clause (FOR var IN-REVERSE seq) ...)
>>
>>and you'd have some define-def-form take the place of defmacro, but only
>>for def forms.
>>
>>Using a special syntax will end up ugly, and making such a specialized
>>defmacro seems silly.
> 
> Hmm, I don't think I'm going to do it that way. (I like your bagel
> 'hole-mixin'. Hehe.)
> 
> I might do it like this:
> (def bagel
>     (class (bread hole-mixin)
>          ...))
> 
> With regards to defining a package, I'll just auto-create a package
> when it is first referred to.
> 
> Things like defining a loop clause or whatever can certainly have their
> own def-loop-clause forms or whatever? What's wrong with that? Def is
> something of a special case of defining by assignment.

I'm talking about extendibility, because that is one of *the* strengths 
I see in CL.  Here you are creating a special case shortcut in def that 
only works for functions and is not exposed to the programmer in any 
way.  It doesn't make code significantly faster to not expose this 
functionality, so I don't see what the advantage is.

Much of the code I write is defining functions and variables, but 
another similarly sized chunk of that code is defining non-function, 
non-variable objects, like classes, methods, or loop clauses.

   -- MJF
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121656165.253765.10660@g47g2000cwa.googlegroups.com>
M Jared Finder wrote:
> Tron3k wrote:
> > M Jared Finder wrote:
> >>Tron3k wrote:
> >>>
> >>>Hmm, I don't understand why the DEF form couldn't just apply to
> >>>defining everything. This is a Lisp-1 remember, not a Lisp-n. If you
> >>>made a class called 'bagel', you could just type bagel at the command
> >>>line and get something like:
> >>>#<STANDARD CLASS bagel>
> >>
> >>You kinda missed my point.  In CL, you define a class by evaluating the
> >>defclass form (defclass bagel (bread hole-mixin) ...))
> >>
> >>What form would you evaluate to define a class in your lisp?  I assume
> >>it will use def, because to create a special shortcut only for functions
> >>seems pretty useless.  But how are you exposing this functionality to
> >>the programmer, so they can add their own types of def clauses?  I
> >>expect that eventually you'll either have a whole bunch of weird syntax
> >>for defining different variables vs defining classes vs defining
> >>functions, or you'll end up with something like:
> >>
> >>(def class bagel (bread hole-mixin) ...)
> >>(def package my-package ...)
> >>(def loop-clause (FOR var IN-REVERSE seq) ...)
> >>
> >>and you'd have some define-def-form take the place of defmacro, but only
> >>for def forms.
> >>
> >>Using a special syntax will end up ugly, and making such a specialized
> >>defmacro seems silly.
> >
> > Hmm, I don't think I'm going to do it that way. (I like your bagel
> > 'hole-mixin'. Hehe.)
> >
> > I might do it like this:
> > (def bagel
> >     (class (bread hole-mixin)
> >          ...))
> >
> > With regards to defining a package, I'll just auto-create a package
> > when it is first referred to.
> >
> > Things like defining a loop clause or whatever can certainly have their
> > own def-loop-clause forms or whatever? What's wrong with that? Def is
> > something of a special case of defining by assignment.
>
> I'm talking about extendibility, because that is one of *the* strengths
> I see in CL.  Here you are creating a special case shortcut in def that
> only works for functions and is not exposed to the programmer in any
> way.  It doesn't make code significantly faster to not expose this
> functionality, so I don't see what the advantage is.
>
> Much of the code I write is defining functions and variables, but
> another similarly sized chunk of that code is defining non-function,
> non-variable objects, like classes, methods, or loop clauses.
>
>    -- MJF

I'm a bit confused as to what "special-case shortcut" you're talking
about.
Is it that I'm allowing the user to specify a function as:
(def (plus-one x) (1+ x))
instead of:
(def plus-one
    [x|(1+ x)])
???
[Side note: You could of course do (def plus-one 1+). Hehe.]

I don't see what the problem with that is; it's just like Scheme.
From: M Jared Finder
Subject: Re: Beyond CL?
Date: 
Message-ID: <6bednQBdLJOEskbfRVn-3Q@speakeasy.net>
Tron3k wrote:
> M Jared Finder wrote:
> 
>>I'm talking about extendibility, because that is one of *the* strengths
>>I see in CL.  Here you are creating a special case shortcut in def that
>>only works for functions and is not exposed to the programmer in any
>>way.  It doesn't make code significantly faster to not expose this
>>functionality, so I don't see what the advantage is.
>>
>>Much of the code I write is defining functions and variables, but
>>another similarly sized chunk of that code is defining non-function,
>>non-variable objects, like classes, methods, or loop clauses.
> 
> I'm a bit confused as to what "special-case shortcut" you're talking
> about.
> Is it that I'm allowing the user to specify a function as:
> (def (plus-one x) (1+ x))
> instead of:
> (def plus-one
>     [x|(1+ x)])
> ???
> [Side note: You could of course do (def plus-one 1+). Hehe.]
> 
> I don't see what the problem with that is; it's just like Scheme.

In my first draft of that post, I had the comment "I didn't like it in 
Scheme, and I don't like it now."  The problem I have with such a
shortcut is that it makes the assumption that most of what I define is either
functions or variables.  That is not the case -- an equally huge chunk 
are the other things I keep mentioning.

   -- MJF
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k18u0Frv8goU1@individual.net>
M Jared Finder wrote:

> What form would you evaluate to define a class in your lisp?  I assume 
> it will use def, because to create a special shortcut only for functions 
> seems pretty useless.  But how are you exposing this functionality to 
> the programmer, so they can add their own types of def clauses?  I 
> expect that eventually you'll either have a whole bunch of weird syntax 
> for defining different variables vs defining classes vs defining 
> functions, or you'll end up with something like:
> 
> (def class bagel (bread hole-mixin) ...)
> (def package my-package ...)
> (def loop-clause (FOR var IN-REVERSE seq) ...)
> 
> and you'd have some define-def-form take the place of defmacro, but only 
> for def forms.

This reminds me of Marco Antoniotti's definer library. See 
http://common-lisp.net/project/definer/


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k07i5FrvoseU1@individual.net>
Tron3k wrote:

> If you want something like DEFVAR that doesn't modify it if
> it is already bound, you could implement it fairly easily.

See? Now you're reacting in the same way. ;)

There are always tradeoffs...

Pascal

--
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121645770.272737.77310@g47g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
>
> > If you want something like DEFVAR that doesn't modify it if
> > it is already bound, you could implement it fairly easily.
>
> See? Now you're reacting in the same way. ;)
>
> There are always tradeoffs...

Hehehe. :-)

Well I'll probably provide this defvar-like thing standard anyway. I
just need a good name ...

Why not make DEF take keyword parameters?

(def x 3
   $new nil) ; signifying that you don't want to put in a 'new'
             ; value if there is already one there

(The dollar sign is the keyword marker ... I appropriated the colon for
my own nefarious purposes.)

I love language design, it's really so fun! :-)
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <wtnom5iu.fsf@ccs.neu.edu>
"Tron3k" <······@gmail.com> writes:

> I love language design, it's really so fun! :-)

It is fun, but it isn't easy.  You have to put a lot of thought into
every aspect of the design if you want something good.  I have seen a
*lot* of people make the *exact* same mistakes over and over.
From: ···········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121709425.102110.284980@g43g2000cwa.googlegroups.com>
Perhaps it would be useful to have a list of the frequently made
mistakes, both to aid people who are trying to design and/or
implement a language, and to give people who have just rediscovered
one of the mistakes notice that what they are doing has been tried
before and has failed.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k2it1FrvdvvU1@individual.net>
Joe Marshall wrote:
> "Tron3k" <······@gmail.com> writes:
> 
>> I love language design, it's really so fun! :-)
> 
> It is fun, but it isn't easy.  You have to put a lot of thought into
> every aspect of the design if you want something good.  I have seen a
> *lot* of people make the *exact* same mistakes over and over.

Probably, yes.  Look at C++, C#, Java etc.

But mistakes are how we learn.  A list of problems and their 
solutions would make sense, but sometimes you have to learn it the 
hard way.  There are some hard C programming errors that I guess I 
had to make once, but I never repeat them, because that would take 
far too long.

Books that explain a certain culture (not just how something 
works, but WHY) can help here too, stuff like "The Art of Unix 
Programming."

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <hdepl2yl.fsf@ccs.neu.edu>
Ulrich Hobelmann <···········@web.de> writes:

> Joe Marshall wrote:
>> "Tron3k" <······@gmail.com> writes:
>>
>>> I love language design, it's really so fun! :-)
>> It is fun, but it isn't easy.  You have to put a lot of thought into
>> every aspect of the design if you want something good.  I have seen a
>> *lot* of people make the *exact* same mistakes over and over.
>
> Probably, yes.  Look at C++, C#, Java etc.
>
> But mistakes are how we learn.  

Only if we bother.
From: Peter Seibel
Subject: Re: Beyond CL?
Date: 
Message-ID: <m21x5xgru4.fsf@gigamonkeys.com>
"Tron3k" <······@gmail.com> writes:

> I'm wondering about 'taste' ... are some 'tastes' universal? For
> example, I think everyone could agree on which one is nicer-looking:
>
> (defun dec-list (lst)
>    (mapcar #'1- lst))
>
> (def (dec-list lst)
>     (map 1- lst))

I think everyone will agree that anytime you think everyone could
agree on *anything* you are wrong. (BTW, I prefer the Common Lisp
style version though I'd rename "lst" to "list". But there's no
accounting for taste.)

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jvqdsFs2i6kU1@individual.net>
Tron3k wrote:

>>A _good_ suggestion for improvement should cover such aspects in order
>>to have some convincing power. Stuff like "switch from Lisp-n to Lisp-1"
>>would have to objectively mention all the drawbacks of such a change.
>>See for example the excellent discussion at
>>http://www.nhplace.com/kent/Papers/Technical-Issues.html
> 
> Ah, I see your point here. In other words, my ideas are pretty
> well-known and obvious, but they don't consider the practical problems.
> I'll grant you that. Thankfully I am free of worrying about backwards
> compatibility since I'm making my own Lisp. :-)

Just keep in mind that this could create some adoption problems (which 
may or may not matter to you).

> Let's say I've given up on trying to change the Common Lisp standard,
> heh. So I won't be forcing anything on anybody.

ok ;)

> I'm wondering about 'taste' ... are some 'tastes' universal?

As someone who listens to music most other people in my social context 
find horrible (industrial, experimental, electronic stuff), and enjoys 
other forms of art that are equally off, I strongly disbelieve in the 
notion of universal taste apart from some very small common denominator. 
Just to make the background clear.

> For
> example, I think everyone could agree on which one is nicer-looking:
> 
> (defun dec-list (lst)
>    (mapcar #'1- lst))
> 
> (def (dec-list lst)
>     (map 1- lst))

This seems to be a very hypothetical question because the example 
doesn't seem to do anything useful that I would want to encapsulate in a 
function in the first place. (Yes, I think form and function are related.)

I don't think def instead of defun (or define-function) is a good idea. 
There are other kinds of things that you may want to define, and it's 
good to be able to tell from the car what you're actually doing. For 
example, the first versions of Tiny CLOS (Gregor Kiczales's adaptation of 
CLOS for Scheme) relied on everything being first class, so that you 
could write (define person (make-class ...)) instead of (defclass person 
...). However, most derivations of Tiny CLOS have adopted more 
distinctive macros, like defclass, defgeneric and defmethod. So 
apparently, there is something to be gained from being less generic.

In other words, aesthetical preferences can adapt to practical 
considerations. That's why I don't think that #' is in any way ugly.

> But then again, not being able to use 'list' as a variable name could
> certainly leave a bad 'taste' in some people's mouths. (I guess I'm
> different in that I purposely try not to give my variable names the
> same names as functions even in Common Lisp, because it gets a bit
> confusing.)

If it were only about functions and variables, I couldn't care less. The 
good thing about Lisp-n is that it encourages you to create new 
namespaces for your own concepts (which is quite easy to do by using 
hashtables to map from names to concepts). Schemers seem to have a 
tendency to create naming conventions instead, like using angle brackets 
for class names (i.e., they say (define <person> ...) to indicate that 
they want to create a class). I think that overall, working with 
multiple namespaces is safer because the danger of accidental name 
clashes is considerably reduced (not only in macros, but in general).

For example, I also think that using the same namespace for lexical and 
for special variables in Common Lisp is suboptimal exactly for these 
reasons. ISLISP has a better design in this regard in that it 
effectively provides a separate namespace for special (dynamic) 
variables. Again, because two different concepts have to share the same 
namespace, this has led to a naming convention instead (i.e., the use 
of asterisks to signal that a name is supposed to refer to a special 
binding). This is a workable solution, but not as good as it could be.

>>It seems to me that people have just been trained too much in not being
>>able to change the language and therefore expecting features to be added
>>by vendors (because they're the only ones that can actually do it).
>>These restrictions simply don't exist in Lisp. So go ahead, just do what
>>you want! ;)
> 
> Heh, one of the features in my new language is complicated to implement
> in Common Lisp. I might need to write a whole new reader in order to do
> it. Yes, I can do whatever I want, but it's kind of like saying you can
> do whatever you want in C if you write a compiler that takes your new
> language to C. ;)

Of course, there are limits. Still, even writing a compiler for your new 
language should be easier to do in Common Lisp than it would be in C.

So I am curious. What's so spectacular about your new Lisp?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87fyudb2hf.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:
> Let's say I've given up on trying to change the Common Lisp standard,
> heh. So I won't be forcing anything on anybody.
>
> I'm wondering about 'taste' ... are some 'tastes' universal? For
> example, I think everyone could agree on which one is nicer-looking:
>
> (defun dec-list (lst)
>    (mapcar #'1- lst))
>
> (def (dec-list lst)
>     (map 1- lst))

I don't know about taste, but why would you want to change Common Lisp
when you can perfectly write:

(defpackage "MY-PGM" (:use "MY-SHORTNAME-LISP"))
(in-package "MY-PGM)
(def (dec-list lst)
     (map 1- lst))

instead of:

(DEFPACKAGE "my-pgm" (:USE "common-lisp"))
(IN-PACKAGE "my-pgm)
(defun dec-list (lst)
    (mapcar #'1- lst))

and it's not even hard to implement.
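
A bare-bones sketch of such a MY-SHORTNAME-LISP (just enough for the two
forms above; a real version would also re-export the rest of COMMON-LISP so
that names like 1- stay visible in packages that use it):

(defpackage "MY-SHORTNAME-LISP"
  (:use "COMMON-LISP")
  (:shadow "MAP")
  (:export "DEF" "MAP"))
(in-package "MY-SHORTNAME-LISP")

;; DEF with a Scheme-style header: (def (name . args) body...)
(defmacro def ((name &rest args) &body body)
  `(defun ,name ,args ,@body))

;; MAP as a macro, so a bare function name like 1- works in call position.
(defmacro map (fn list)
  `(mapcar (function ,fn) ,list))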

> Heh, one of the features in my new language is complicated to implement
> in Common Lisp. I might need to write a whole new reader in order to do
> it. Yes, I can do whatever I want, but it's kind of like saying you can
> do whatever you want in C if you write a compiler that takes your new
> language to C. ;)

Why do you think there's a Greenspun's Tenth Rule?


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121640774.096062.180030@z14g2000cwz.googlegroups.com>
Pascal Bourguignon wrote:
> I don't know about taste, but why would you want to change Common Lisp
> when you can perfectly write:
>
> (defpackage "MY-PGM" (:use "MY-SHORTNAME-LISP"))
> (in-package "MY-PGM)
> (def (dec-list lst)
>      (map 1- lst))
>
> instead of:
>
> (DEFPACKAGE "my-pgm" (:USE "common-lisp"))
> (IN-PACKAGE "my-pgm)
> (defun dec-list (lst)
>     (mapcar #'1- lst))
>
> and it's not even hard to implement.
>

Ok, so I'll try to implement my new Lisp on top of Common Lisp if this
is possible. I have one problem though: in my Lisp, if x is (a b c d)
and you eval (x 2), you get c. How can I do this on Common Lisp short
of writing a full-fledged code-walker? If I end up having to write a
code-walker I'd prefer to just make a Lisp from ground-up anyway.
From: ··········@gmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121662671.246335.101830@g14g2000cwa.googlegroups.com>
Are you familiar with Gauche Scheme?  You can define methods on types
so that you can do something like:

gosh> (define-method object-apply ((s <list>) (i <integer>))
               (list-ref s i))
#<generic object-apply (7)>
gosh> (let ((x '(a b c d)))
               (x 2))
c

Or, of course, you could make it return the ith cdr, i concatenations
of the list, etc.

This is kind of neat, but I think that most of the time when you
evaluate (a-list 2) it's a mistake, and I'd rather have it raise an
error than fail silently.  But I've never used it in practice.
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u4qas3dad.fsf@nhplace.com>
··········@gmail.com writes:

> Are you familiar with Gauche Scheme?  You can define methods on types
> so that you can do something like:
> 
> gosh> (define-method object-apply ((s <list>) (i <integer>))
>                (list-ref s i))
> #<generic object-apply (7)>
> gosh> (let ((x '(a b c d)))
>                (x 2))
> c
> 
> Or, of course, you could make it return the ith cdr, i concatenations
> of the list, etc.
> 
> This is kind of neat, but I think that most of the time when you
> evaluate (a-list 2) it's a mistake, and I'd rather have it raise an
> error than fail silently.  But I've never used it in practice.

This is the property I keep calling "sparseness" in language design.

In assembly language, it's common to make 0 be an invalid opcode because
0's happen a lot and if you execute non-code by accident, you'll hopefully
quickly find a 0 and the program will stop.

The analogy in higher level languages is to assure that you don't make
everything be "defined".

People with only a superficial familiarity with TECO used to fear that it
would just run forever if given a wrong program, since every single
character was an operator.  But in practice it would quickly get arity
or other bounds errors.  I'd bet the same is so of APL.

In C++, you often run with NULLs well past where you wish because they 
match up typewise with pointers to objects even though NULL really has
a different "shape" than other pointer data of the same "type".  When you
finally get in the debugger, finding the source of the NULL is sometimes
tricky.

In Common Lisp, changes to do things like make everything generic would be
a step in the direction of running longer after an error had happened, only
finding yourself in the debugger when the good debugging context was gone.

- - - 

The other problem with this is that the BIG temptation is to want to do
just as you say--to define LIST.  But CL's basic theory of design says
that you should never define things that are in public/shared packages since
other modules might be tempted to make incompatible definitions of the
same public/shared symbols.  For user data, of course, it's a different
matter.  But then, it's not hard to do

 (defmethod ref ((v vector) (key fixnum)) (aref v key))

 (defmethod ref ((h hash-table) (key t)) (gethash key h))

and then to define your own REF extensions in your own package.

CL doesn't have any way at all to define new things that can go in the
car of a list, so the issue you actually cite in your example doesn't
come up.  You'd have to write (funcall a-list 2) in order for the example
to work in CL, so at that point doing (ref a-list 2) is just as easy
and requires no special language support.
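
(Continuing the sketch, one more method makes the (ref a-list 2) spelling
work too:)

 (defmethod ref ((l list) (key fixnum)) (nth key l))

 ;; (ref '(a b c d) 2)   => C
 ;; (ref #(a b c d) 2)   => C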

My weird (and properly-named) http://www.nhplace.com/kent/Half-Baked/
addresses the issue of finding syntactic common ground between CL and
Scheme.  I didn't make much success of it, but I think it's tractable.
Maybe some day some bored soul will complete it...
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Beyond CL?
Date: 
Message-ID: <87fyu88aaz.fsf@qrnik.zagroda>
Kent M Pitman <······@nhplace.com> writes:

> The analogy in higher level languages is to assure that you don't
> make everything be "defined".

Like (car nil), (if 0 x y), (append (< 3 4) '(5)),
(gethash 'a (make-hash-table))...

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <877jfpatwr.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:

> Pascal Bourguignon wrote:
>> I don't know about taste, but why would you want to change Common Lisp
>> when you can perfectly write:
>>
>> (defpackage "MY-PGM" (:use "MY-SHORTNAME-LISP"))
>> (in-package "MY-PGM)
>> (def (dec-list lst)
>>      (map 1- lst))
>>
>> instead of:
>>
>> (DEFPACKAGE "my-pgm" (:USE "common-lisp"))
>> (IN-PACKAGE "my-pgm)
>> (defun dec-list (lst)
>>     (mapcar #'1- lst))
>>
>> and it's not even hard to implement.
>>
>
> Ok, so I'll try to implement my new Lisp on top of Common Lisp if this
> is possible. I have one problem though: in my Lisp, if x is (a b c d)
> and you eval (x 2), you get c. How can I do this on Common Lisp short
> of writing a full-fledged code-walker? If I end up having to write a
> code-walker I'd prefer to just make a Lisp from ground-up anyway.

Nothing from the other world:

(defpackage "MY-STRANGE-LISP"
 (:use "COMMON-LISP")
 (:shadow "EVAL")
 (:export "EVAL" #|...|#))
(in-package "MY-STRANGE-LISP"
(defun eval (&rest args)
   #|...|#)

Indeed, you need to implement your own eval, since it is not
COMMON-LISP:EVAL, but since you're posting on comp.lang.lisp, we'll
assume you want to implement it in Common Lisp.



-- 
__Pascal_Bourguignon__               _  Software patents are endangering
()  ASCII ribbon against html email (o_ the computer industry all around
/\  1962:DO20I=1.100                //\ the world http://lpf.ai.mit.edu/
    2001:my($f)=`fortune`;          V_/   http://petition.eurolinux.org/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121649336.761628.197640@f14g2000cwb.googlegroups.com>
Pascal Bourguignon wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > Pascal Bourguignon wrote:
> >> I don't know about taste, but why would you want to change Common Lisp
> >> when you can perfectly write:
> >>
> >> (defpackage "MY-PGM" (:use "MY-SHORTNAME-LISP"))
> >> (in-package "MY-PGM)
> >> (def (dec-list lst)
> >>      (map 1- lst))
> >>
> >> instead of:
> >>
> >> (DEFPACKAGE "my-pgm" (:USE "common-lisp"))
> >> (IN-PACKAGE "my-pgm)
> >> (defun dec-list (lst)
> >>     (mapcar #'1- lst))
> >>
> >> and it's not even hard to implement.
> >>
> >
> > Ok, so I'll try to implement my new Lisp on top of Common Lisp if this
> > is possible. I have one problem though: in my Lisp, if x is (a b c d)
> > and you eval (x 2), you get c. How can I do this on Common Lisp short
> > of writing a full-fledged code-walker? If I end up having to write a
> > code-walker I'd prefer to just make a Lisp from ground-up anyway.
>
> Nothing from the other world:
>
> (defpackage "MY-STRANGE-LISP"
>  (:use "COMMON-LISP")
>  (:shadow "EVAL")
>  (:export "EVAL" #|...|#))
> (in-package "MY-STRANGE-LISP"
> (defun eval (&rest args)
>    #|...|#)
>
> Indeed, you need to implement your own eval, since it is not
> COMMON-LISP:EVAL, but since you're posting on comp.lang.lisp, we'll
> assume you want to implement it in Common Lisp.

As it turns out I need to write my own READer too, so the reading of
forms will be done by my program, the evaluation of forms will be done
by my program, the printing also will be done by my program. But now
I'm not building my language on *top* of Common Lisp anymore, I'm using
Common Lisp as my preferential language to program my interpreter in -
but I could use C++ too (but I won't, hehe).

So yes, that was my original plan from the start.
From: O-MY-GLIFE
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121803324.630206.299020@z14g2000cwz.googlegroups.com>
Tron3k wrote:
> Pascal Bourguignon wrote:

> > Nothing from the other world:
> >
> > (defpackage "MY-STRANGE-LISP"
> >  (:use "COMMON-LISP")
> >  (:shadow "EVAL")
> >  (:export "EVAL" #|...|#))
> > (in-package "MY-STRANGE-LISP"
> > (defun eval (&rest args)
> >    #|...|#)
> >
> > Indeed, you need to implement your own eval, since it is not
> > COMMON-LISP:EVAL, but since you're posting on comp.lang.lisp, we'll
> > assume you want to implement it in Common Lisp.

Recently I had a feeling that I need a WHILE loop. Maybe not a lispy
feeling but what the heck?!

(defmacro while (test &body body)
  `(tagbody
    start
    (unless ,test (go end))
    ,@body
    (go start)
    end))

So if I ever want my MY-STRANGE-LISP, I know that I can compile it
to assignments and gotos wrapped up in tagbodies and let the Lisp
compiler do the dirty job for me. But that assumes that I can live with
the standard read & print.

I'm just pointing to another possibility of doing Lisp expansion...
From: Rob Warnock
Subject: Re: Beyond CL?
Date: 
Message-ID: <a7adnQZO9_qZIHzfRVn-gw@speakeasy.net>
O-MY-GLIFE <·······@seznam.cz> wrote:
+---------------
| Recently I had a feeling that I need a WHILE loop. Maybe not a lispy
| feeling but what the heck?!
| 
| (defmacro while (test &body body)
|   `(tagbody
|     start
|     (unless ,test (go end))
|     ,@body
|     (go start)
|     end))
+---------------

I was about to suggest that you make START & END be gensyms, to allow
proper nesting, then realized there's a much simpler solution:

    (defmacro while (test &body body)
      `(do () ((not ,test)) ,@body))

But in CMUCL, guess what that expands to? Surprise, surprise!  ;-}  ;-}

    > (macroexpand '(while (< x 3) (print x) (incf x)))

    (BLOCK NIL
      (LET ()
	(TAGBODY
	  (GO #:G981)
	 #:G980
	  (PRINT X)
	  (INCF X)
	  (PSETQ)
	 #:G981
	  (UNLESS (NOT (< X 3)) (GO #:G980))
	  (RETURN-FROM NIL (PROGN)))))
    T
    >


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: O-MY-GLIFE
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122144987.520729.234600@f14g2000cwb.googlegroups.com>
Rob Warnock wrote:
> O-MY-GLIFE <·······@seznam.cz> wrote:
> +---------------
> | Recently I had a feeling that I need a WHILE loop. Maybe not a lispy
> | feeling but what the heck?!
> |
> | (defmacro while (test &body body)
> |   `(tagbody
> |     start
> |     (unless ,test (go end))
> |     ,@body
> |     (go start)
> |     end))
> +---------------
>
> I was about to suggest that you make START & END be gensyms, to allow
> proper nesting, then realized there's a much simpler solution:
>
>     (defmacro while (test &body body)
>       `(do () ((not ,test)) ,@body))
>

I was having a "gut feeling" that labels should be local to *tagbody*
(lexically scoped), and indeed they are. As for *do* ... I'm not happy
about it. There is a famous quote that a language is low level when it
forces you to care about details you don't want to care about.

The *do* is about the only construct in CL for which I must count
parentheses.

> But in CMUCL, guess what that expands to? Surprise, surprise!  ;-}  ;-}
>
>     > (macroexpand '(while (< x 3) (print x) (incf x)))
>
>     (BLOCK NIL
>       (LET ()
> 	(TAGBODY
> 	  (GO #:G981)
> 	 #:G980
> 	  (PRINT X)
> 	  (INCF X)
> 	  (PSETQ)
> 	 #:G981
> 	  (UNLESS (NOT (< X 3)) (GO #:G980))
> 	  (RETURN-FROM NIL (PROGN)))))
>     T
>     >
>

Indeed ...
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k08jiFs22hpU1@individual.net>
Tron3k wrote:

> Ok, so I'll try to implement my new Lisp on top of Common Lisp if this
> is possible. I have one problem though: in my Lisp, if x is (a b c d)
> and you eval (x 2), you get c. How can I do this on Common Lisp short
> of writing a full-fledged code-walker? If I end up having to write a
> code-walker I'd prefer to just make a Lisp from ground-up anyway.

(defmacro glet1 ((name value) &body body)
   (let ((index (gensym "INDEX")))
     `(let ((,name ,value))
        (flet ((,name (,index) (elt ,name ,index)))
          ,@body))))

(defmacro glet ((&rest bindings) &body body)
   (reduce (lambda (binding sofar)
             `(glet1 ,binding ,sofar))
           bindings
           :from-end t
           :initial-value `(progn ,@body)))

? (glet ((x '(a b c d)))
     (x 2))
C


Optimizations and better names are left as an exercise to the reader. ;)


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121648566.387490.278970@g14g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
>
> > Ok, so I'll try to implement my new Lisp on top of Common Lisp if this
> > is possible. I have one problem though: in my Lisp, if x is (a b c d)
> > and you eval (x 2), you get c. How can I do this on Common Lisp short
> > of writing a full-fledged code-walker? If I end up having to write a
> > code-walker I'd prefer to just make a Lisp from ground-up anyway.
>
> (defmacro glet1 ((name value) &body body)
>    (let ((index (gensym "INDEX")))
>      `(let ((,name ,value))
>         (flet ((,name (,index) (elt ,name ,index)))
>           ,@body))))
>
> (defmacro glet ((&rest bindings) &body body)
>    (reduce (lambda (binding sofar)
>              `(glet1 ,binding ,sofar))
>            bindings
>            :from-end t
>            :initial-value `(progn ,@body)))
>
> ? (glet ((x '(a b c d)))
>      (x 2))
> C
>
>
> Optimizations and better names are left as an exercise to the reader. ;)
>
>
> Pascal
>
> --
> 2nd European Lisp and Scheme Workshop
> July 26 - Glasgow, Scotland - co-located with ECOOP 2005
> http://lisp-ecoop05.bknr.net/

Clever. Unfortunately it doesn't work. For example:

('(1 2 3) 2) must evaluate to 3.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k18gvFs8ql2U1@individual.net>
Tron3k wrote:

> Clever. Unfortunately it doesn't work. For example:
> 
> ('(1 2 3) 2) must evaluate to 3.

What does this buy you? Why don't you just write 3?

At least one of the two elements should be a variable if this should 
make sense. If the list is a variable, my macro does the job. If the index is a 
variable, why don't you just use nth or elt?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121695414.809062.243580@z14g2000cwz.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
>
> > Clever. Unfortunately it doesn't work. For example:
> >
> > ('(1 2 3) 2) must evaluate to 3.
>
> What does this buy you? Why don't you just write 3?
>
> At least one of the two elements should be a variable if this should
> make sense. If the list is a variable, my macro does the job. If the index is a
> variable, why don't you just use nth or elt?
>

Obviously the point is that the first element of the list must be
evaluated. What if x were ((a b c) d e) and you wanted to do ((car x)
1) which evals to b. Why shouldn't you be able to do that?
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k1s9mFsd2icU2@individual.net>
Tron3k wrote:
> Pascal Costanza wrote:
> 
>>Tron3k wrote:
>>
>>>Clever. Unfortunately it doesn't work. For example:
>>>
>>>('(1 2 3) 2) must evaluate to 3.
>>
>>What does this buy you? Why don't you just write 3?
>>
>>At least one of the two elements should be a variable if this should
>>make sense. If the list is a variable, my macro does the job. If the index is a
>>variable, why don't you just use nth or elt?
> 
> Obviously the point is that the first element of the list must be
> evaluated. What if x were ((a b c) d e) and you wanted to do ((car x)
> 1) which evals to b. Why shouldn't you be able to do that?

What kinds of problems do you intend to solve with this approach?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121697111.165413.24300@g44g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
> > Pascal Costanza wrote:
> >
> >>Tron3k wrote:
> >>
> >>>Clever. Unfortunately it doesn't work. For example:
> >>>
> >>>('(1 2 3) 2) must evaluate to 3.
> >>
> >>What does this buy you? Why don't you just write 3?
> >>
> >>At least one of the two elements should be a variable if this should
> >>make sense. If the list is a variable, my macro does the job. If the index is a
> >>variable, why don't you just use nth or elt?
> >
> > Obviously the point is that the first element of the list must be
> > evaluated. What if x were ((a b c) d e) and you wanted to do ((car x)
> > 1) which evals to b. Why shouldn't you be able to do that?
>
> What kinds of problems do you intend to solve with this approach?

- This feature interacts very nicely with the new syntax. It looks much
better than AREF or NTH (I really dislike AREF, it's friggin' huge!).

- This feature is extensible to many different situations. You can
"call" any object like it is a function, and get different features for
different objects.

- Now I don't need make-instance or constructors because you just "call
the class" to create something from it. You can call an object to get
particular slots or to call methods which are particular to the object.
(We still have multimethods though.) One thing I REALLY like about this
is that you can have a slot like X for a 3D vector object and you can
just say (my-vec-obj x) [uses a special evaluation rule more like a
macro] instead of (vec3-x my-vec-obj).

- All this makes me very happy. :-)
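
To be concrete, here is roughly the behaviour I have in mind, spelled
in today's CL with an explicit generic function (the operator name $
and the class POINT below are only placeholders); my language just
lets you leave the $ out:

 (defgeneric $ (thing &rest args)
   (:documentation "Explicit \"call this object\" operator."))

 ;; "Calling" a sequence indexes it.
 (defmethod $ ((thing list) &rest args)
   (nth (first args) thing))

 (defmethod $ ((thing vector) &rest args)
   (aref thing (first args)))

 ;; "Calling" an instance reads one of its slots.
 (defmethod $ ((thing standard-object) &rest args)
   (slot-value thing (first args)))

 ;; "Calling" a class makes an instance of it.  CLOS method specificity
 ;; is what resolves the class-vs-plain-instance ambiguity here.
 (defmethod $ ((thing class) &rest args)
   (apply #'make-instance thing args))

 ;; ($ '(1 2 3) 2)          => 3
 ;; ($ my-point 'x)         => the X slot of MY-POINT
 ;; ($ (find-class 'point)) => a fresh POINT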
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uoe90i1sp.fsf@nhplace.com>
"Tron3k" <······@gmail.com> writes:

> - This feature interacts very nicely with the new syntax. It looks much
> better than AREF or NTH (I really dislike AREF, it's friggin' huge!).

But it must be the same for all programs.  

This is like "prayer in schools".  Many religious folk think it's a
fine idea.  But when they get down to figuring out the wording for
those prayers, they realize they disagree...

If you want ELT and someone else wants NTH, you may both think the syntax
elegant but you can't load your code into the same image unless EVERYONE
using the feature agrees.

> - This feature is extensible to many different situations. 

And not extensible in situations you think.  Like standard classes.  Because
of the issue I just cited.  You mustn't be defining it for standard classes
because you will break others' programs that might do the same thing.

And what is the gain?  If you actually do a named function call, everyone
gets what they want.

> - Now I don't need make-instance or constructors because you just "call
> the class" to create something from it.

Assuming someone else doesn't define calling the class to mean invoking it.
e.g., some people will define

 (defclass random-frob () ())
 (defmethod eval-me ((thing random-frob)) (random))

and then you'll have a confusing irregularity.  

All these things you propose work well if you personally are God and can make
everyone in the world want to do things just exactly your way.  Language
design is different than program design--it involves making tools that allow
people to play together without feeling the need to strangle one another for
the ways they have intruded on one another's spaces...

> - All this makes me very happy. :-)

Now we get to the core of it.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121700747.753408.233560@g47g2000cwa.googlegroups.com>
Kent M Pitman wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > - This feature interacts very nicely with the new syntax. It looks much
> > better than AREF or NTH (I really dislike AREF, it's friggin' huge!).
>
> But it must be the same for all programs.
>
> This is like "prayer in schools".  Many religious folk think it's a
> fine idea.  But when they get down to figuring out the wording for
> those prayers, they realize they disagree...
>
> If you want ELT and someone else wants NTH, you may both think the syntax
> elegant but you can't load your code into the same image unless EVERYONE
> using the feature agrees.
>
> > - This feature is extensible to many different situations.
>
> And not extensible in situations you think.  Like standard classes.  Because
> of the issue I just cited.  You mustn't be defining it for standard classes
> because you will break others' programs that might do the same thing.
>
> And what is the gain?  If you actually do a named function call, everyone
> gets what they want.

About your objections here: why not make the eval-method choice
specific to the package you're in? Then people who don't like each
other's choices won't work in the same package. In fact you could
customize the syntax for the domain you're working in, too!

> > - Now I don't need make-instance or constructors because you just "call
> > the class" to create something from it.
>
> Assuming someone else doesn't define calling the class to mean invoking it.
> e.g., some people will define
>
>  (defclass random-frob () ())
>  (defmethod eval-me ((thing random-frob)) (random))
>
> and then you'll have a confusing irregularity.
>
> All these things you propose work well if you personally are God and can make
> everyone in the world want to do things just exactly your way.  Language
> design is different than program design--it involves making tools that allow
> people to play together without feeling the need to strangle one another for
> the ways they have intruded on one another's spaces...

So the package idea fixes this too.

> > - All this makes me very happy. :-)
>
> Now we get to the core of it.

You say this like it is some dark and evil thing, rather than precisely
the correct way to design a language. I think you need some remedial
courses in PaulGrahamology. ;-)
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87vf38dv8z.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:
> One thing I REALLY like about this
> is that you can have a slot like X for a 3D vector object and you can
> just say (my-vec-obj x) [uses a special evaluation rule more like a
> macro] instead of (vec3-x my-vec-obj).

What do we earn in writing (my-vec x) instead of (x my-vec) ?


> - All this makes me very happy. :-)

Then enjoy!

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Our enemies are innovative and resourceful, and so are we. They never
stop thinking about new ways to harm our country and our people, and
neither do we. -- Georges W. Bush
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121698614.921133.84360@g49g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:
> "Tron3k" <······@gmail.com> writes:
> > One thing I REALLY like about this
> > is that you can have a slot like X for a 3D vector object and you can
> > just say (my-vec-obj x) [uses a special evaluation rule more like a
> > macro] instead of (vec3-x my-vec-obj).
>
> What do we earn in writing (my-vec x) instead of (x my-vec) ?

Well if I had that, then:

T3KLISP-USER> x
#<STANDARD-METHOD x> (or whatever)

And then if I use x somewhere as a variable which I always do, I can't
access the x method. So I always end up calling it vec3-x or something
which is giving redundant information. Even in Common Lisp I do this,
where an 'x' variable wouldn't even interfere with it. It feels
dangerous to take over the ubiquitous  symbol 'x'.

>
>
> > - All this makes me very happy. :-)
> 
> Then enjoy!

Thank you! :-)
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87irz8dr4s.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:

> Pascal Bourguignon wrote:
>> "Tron3k" <······@gmail.com> writes:
>> > One thing I REALLY like about this
>> > is that you can have a slot like X for a 3D vector object and you can
>> > just say (my-vec-obj x) [uses a special evaluation rule more like a
>> > macro] instead of (vec3-x my-vec-obj).
>>
>> What do we earn in writing (my-vec x) instead of (x my-vec) ?
>
> Well if I had that, then:
>
> T3KLISP-USER> x
> #<STANDARD-METHOD x> (or whatever)
>
> And then if I use x somewhere as a variable which I always do, I can't
> access the x method. So I always end up calling it vec3-x or something
> which is giving redundant information. Even in Common Lisp I do this,
> where an 'x' variable wouldn't even interfere with it. It feels
> dangerous to take over the ubiquitous  symbol 'x'.

However, you don't seem to feel it dangerous to take over the
ubiquitous symbol 'it' in your prose...

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Kitty like plastic.
Confuses for litter box.
Don't leave tarp around.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121704335.371933.186770@f14g2000cwb.googlegroups.com>
Pascal Bourguignon wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > Pascal Bourguignon wrote:
> >> "Tron3k" <······@gmail.com> writes:
> >> > One thing I REALLY like about this
> >> > is that you can have a slot like X for a 3D vector object and you can
> >> > just say (my-vec-obj x) [uses a special evaluation rule more like a
> >> > macro] instead of (vec3-x my-vec-obj).
> >>
> >> What do we earn in writing (my-vec x) instead of (x my-vec) ?
> >
> > Well if I had that, then:
> >
> > T3KLISP-USER> x
> > #<STANDARD-METHOD x> (or whatever)
> >
> > And then if I use x somewhere as a variable which I always do, I can't
> > access the x method. So I always end up calling it vec3-x or something
> > which is giving redundant information. Even in Common Lisp I do this,
> > where an 'x' variable wouldn't even interfere with it. It feels
> > dangerous to take over the ubiquitous  symbol 'x'.
>
> However, you don't seem to feel it dangerous to take over the
> ubiquitous symbol 'it' in your prose...

Well let's consider this in both Common Lisp and T3K-Lisp.

Common Lisp: Yes I probably should use 'x' in Common Lisp; my fear of
it is probably irrational.

T3K-Lisp: There is the problem with variables named 'x'.

So that's pretty simple.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k208kFsaqhjU1@individual.net>
Tron3k wrote:

> - Now I don't need make-instance or constructors because you just "call
> the class" to create something from it. You can call an object to get
> particular slots or to call methods which are particular to the object.

There are some ambiguities involved here. Assume c refers to a class. 
Does (c 'x) create a new instance or refer to a slot in the class 
metaobject? If it is one of the two, how do you achieve the other? 
Assume obj refers to an object. Does (obj print) call the print method 
on obj, or does it return the method metaobject? If it is one of the 
two, how do you achieve the other?

> One thing I REALLY like about this
> is that you can have a slot like X for a 3D vector object and you can
> just say (my-vec-obj x) [uses a special evaluation rule more like a
> macro] instead of (vec3-x my-vec-obj).

You can't use a "macro evaluation rule" (you probably mean macro 
expansion at compile-time?) because you don't know until runtime what 
type the car of an expression will be. Or does your Lisp have static 
type checking?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121701154.815286.238040@z14g2000cwz.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
>
> > - Now I don't need make-instance or constructors because you just "call
> > the class" to create something from it. You can call an object to get
> > particular slots or to call methods which are particular to the object.
>
> There are some ambiguities involved here. Assume c refers to a class.
> Does (c 'x) create a new instance or refer to a slot in the class
> metaobject? If it is one of the two, how do you achieve the other?
> Assume obj refers to an object. Does (obj print) call the print method
> on obj, or does it return the method metaobject? If it is one of the
> two, how do you achieve the other?

The idea is that this new syntax is sort of a 'sugar' for what I would
regularly do in a language without it. So c-x is the metaobject and (c
'x) creates a new instance.

> > One thing I REALLY like about this
> > is that you can have a slot like X for a 3D vector object and you can
> > just say (my-vec-obj x) [uses a special evaluation rule more like a
> > macro] instead of (vec3-x my-vec-obj).
>
> You can't use a "macro evaluation rule" (you probably mean macro
> expansion at compile-time?) because you don't know until runtime what
> type the car of an expression will be. Or does your Lisp have static
> type checking?
>

Well, my first try at this is an interpreter. With an interpreter I can
do whatever I want, whenever I want. So I can say, "Oh, this is an
instance of a class, don't evaluate the arguments." I'm just playing
around here, maybe I'll scrap that later.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k21nvFritttU1@individual.net>
Tron3k wrote:

>>>- Now I don't need make-instance or constructors because you just "call
>>>the class" to create something from it. You can call an object to get
>>>particular slots or to call methods which are particular to the object.
>>
>>There are some ambiguities involved here. Assume c refers to a class.
>>Does (c 'x) create a new instance or refer to a slot in the class
>>metaobject? If it is one of the two, how do you achieve the other?
>>Assume obj refers to an object. Does (obj print) call the print method
>>on obj, or does it return the method metaobject? If it is one of the
>>two, how do you achieve the other?
> 
> The idea is that this new syntax is sort of a 'sugar' for what I would
> regularly do in a language without it. So c-x is the metaobject and (c
> 'x) creates a new instance.

I don't understand your response. How do I get the slot 'x from the 
class metaobject? In CLOS, classes are (indirect) instances of 
standard-object as well, so I can say (slot-value c 'x). c-x is what 
metaobject? The class metaobject, the slot metaobject, the method 
metaobject?!? If I get the slot metaobject with c-x, I still don't have 
the slot value. Could you be a bit more precise?

>>>One thing I REALLY like about this
>>>is that you can have a slot like X for a 3D vector object and you can
>>>just say (my-vec-obj x) [uses a special evaluation rule more like a
>>>macro] instead of (vec3-x my-vec-obj).
>>
>>You can't use a "macro evaluation rule" (you probably mean macro
>>expansion at compile-time?) because you don't know until runtime what
>>type the car of an expression will be. Or does your Lisp have static
>>type checking?
> 
> Well, my first try at this is an interpreter. With an interpreter I can
> do whatever I want, whenever I want. So I can say, "Oh, this is an
> instance of a class, don't evaluate the arguments." I'm just playing
> around here, maybe I'll scrap that later.

Probably. ;)


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121702830.418073.155460@g49g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> I don't understand your response. How do I get the slot 'x from the
> class metaobject? In CLOS, classes are (indirect) instances of
> standard-object as well, so I can say (slot-value c 'x). c-x is what
> metaobject? The class metaobject, the slot metaobject, the method
> metaobject?!? If I get the slot metaobject with c-x, I still don't have
> the slot value. Could you be a bit more precise?

Hehe, you've confused me now. Perhaps there is a problem with this
which I will find out later, but this is what I have in my head.

Ok, imagine this situation:

T3KLisp-User> bagel
#<Class bagel>

T3KLisp-User> bagel-tastiness
#<Method bagel-tastiness>

T3KLisp-User> (def x (bagel))
x

T3KLisp-User> x
#<bagel>

T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.
42

T3KLisp-User> (x tastiness)
42

T3KLisp-User> (bagel-tastiness x)
42

I guess this makes sense?
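
For comparison, the plain-CL spelling of roughly the same session
(just to show what the sugar is standing in for) would be:

 (defclass bagel ()
   ((tastiness :accessor bagel-tastiness)))

 (defvar x (make-instance 'bagel))  ; instead of (def x (bagel))
 (setf (bagel-tastiness x) 42)      ; instead of (= (x tastiness) 42)
 (bagel-tastiness x)                ; => 42
 (slot-value x 'tastiness)          ; => 42 as well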
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k23hdFsbs2vU1@individual.net>
Tron3k wrote:
> Pascal Costanza wrote:
> 
>>I don't understand your response. How do I get the slot 'x from the
>>class metaobject? In CLOS, classes are (indirect) instances of
>>standard-object as well, so I can say (slot-value c 'x). c-x is what
>>metaobject? The class metaobject, the slot metaobject, the method
>>metaobject?!? If I get the slot metaobject with c-x, I still don't have
>>the slot value. Could you be a bit more precise?
> 
> 
> Hehe, you've confused me now. Perhaps there is a problem with this
> which I will find out later, but this is what I have in my head.
> 
> Ok, imagine this situation:
> 
> T3KLisp-User> bagel
> #<Class bagel>
> 
> T3KLisp-User> bagel-tastiness
> #<Method bagel-tastiness>
> 
> T3KLisp-User> (def x (bagel))
> x
> 
> T3KLisp-User> x
> #<bagel>
> 
> T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.
> 42
> 
> T3KLisp-User> (x tastiness)
> 42
> 
> T3KLisp-User> (bagel-tastiness x)
> 42
> 
> I guess this makes sense?

That's not what I meant.

Here is what I mean:

CL-USER 1 > (use-package :clos)
T

CL-USER 2 > (defclass instances-class (standard-class)
               ((instances :initform '())))
#<STANDARD-CLASS INSTANCES-CLASS 100A541F>

CL-USER 3 > (defmethod validate-superclass
                        ((class instances-class)
                         (superclass standard-class))
               t)
#<STANDARD-METHOD VALIDATE-SUPERCLASS NIL (INSTANCES-CLASS 
STANDARD-CLASS) 10073D2F>

CL-USER 4 > (defmethod make-instance :around
               ((class instances-class) &rest initargs)
               (declare (dynamic-extent initargs))
               (let ((instance (call-next-method)))
                 (push instance (slot-value class 'instances))
                 instance))
#<STANDARD-METHOD MAKE-INSTANCE (:AROUND) (INSTANCES-CLASS) 1008531B>

CL-USER 5 > (defclass bagel ()
               (tastiness)
               (:metaclass instances-class))
#<INSTANCES-CLASS BAGEL 100A1B4B>

CL-USER 6 > (setf x (make-instance 'bagel))
#<BAGEL 100754F3>

CL-USER 7 > (setf (slot-value x 'tastiness) 42)
42

CL-USER 8 > (slot-value x 'tastiness)
42

CL-USER 9 > (slot-value (find-class 'bagel) 'instances)
(#<BAGEL 10BBD753>)

CL-USER 10 > (slot-value (car *) 'tastiness)
42

CL-USER 11 > (typep x 'standard-object)
T

CL-USER 12 > (typep (find-class 'bagel) 'standard-object)
T


The important step is step 9: Here I access the instances slot of the 
class metaobject. How do you do this in your language, and how do you 
distinguish between accessing slots in the class metaobject and creating 
new instances of that class?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86irz8yrcb.fsf@drjekyll.mkbuelow.net>
"Tron3k" <······@gmail.com> writes:

>T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.

Ugh. That's a bad choice. Use := at least instead of = (or <- or -> or
whatever, if you really want to avoid "set"-style stuff).

mkb.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <873bqcdlht.fsf@thalassa.informatimago.com>
Matthias Buelow <···@incubus.de> writes:

> "Tron3k" <······@gmail.com> writes:
>
>>T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.
>
> Ugh. That's a bad choice. Use := at least instead of = (or <- or -> or
> whatever, if you really want to avoid "set"-style stuff).

Well, := <- and -> must be discarded, for they're multi-character.

I'd propose: (← var value) to assign a value 
        and: (⇠ var prop value) to assign a property
        and: (⇦ fun λ v1 v2 · body) to assign a function.

                            ^ 
                            |
                            +-- Note this is not a dot! ;-)

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

In a World without Walls and Fences, 
who needs Windows and Gates?
From: M Jared Finder
Subject: Re: Beyond CL?
Date: 
Message-ID: <UKCdnVQk4Ye84EHfRVn-uQ@speakeasy.net>
Pascal Bourguignon wrote:
> Matthias Buelow <···@incubus.de> writes:
>>"Tron3k" <······@gmail.com> writes:
>>>T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.
>>
>>Ugh. That's a bad choice. Use := at least instead of = (or <- or -> or
>>whatever, if you really want to avoid "set"-style stuff).
> 
> Well, := <- and -> must be discarted for they're multi-characters.
> 
> I'd propose: (← var value) to assign a value 
>         and: (⇠ var prop value) to assign a property
>         and: (⇦ fun λ v1 v2 · body) to assign a function.
> 
>                             ^ 
>                             |
>                             +-- Note this is not a dot! ;-)

Feeling like programming in APL, are you?

   -- MJF
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <ll41l2zs.fsf@ccs.neu.edu>
Pascal Bourguignon <···@informatimago.com> writes:

> Matthias Buelow <···@incubus.de> writes:
>
>> "Tron3k" <······@gmail.com> writes:
>>
>>>T3KLisp-User> (= (x tastiness) 42)     ; '=' is assignment.
>>
>> Ugh. That's a bad choice. Use := at least instead of = (or <- or -> or
>> whatever, if you really want to avoid "set"-style stuff).
>
> Well, := <- and -> must be discarted for they're multi-characters.
>
> I'd propose: (← var value) to assign a value 
>         and: (⇠ var prop value) to assign a property
>         and: (⇦ fun λ v1 v2 · body) to assign a function.
>
>                             ^ 
>                             |
>                             +-- Note this is not a dot! ;-)

It sure as hell isn't in iso-8859-1
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ek9t9j20.fsf@thalassa.informatimago.com>
Joe Marshall <···@ccs.neu.edu> writes:
> It sure as hell isn't in iso-8859-1

No. It was: Content-Type: text/plain; charset=utf-8


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Until real software engineering is developed, the next best practice
is to develop with a dynamic system that has extreme late binding in
all aspects. The first system to really do this in an important way
is Lisp. -- Alan Kay
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <r7dtt5l1.fsf@comcast.net>
Pascal Bourguignon <···@informatimago.com> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>> It sure as hell isn't in iso-8859-1
>
> No. It was: Content-Type: text/plain; charset=utf-8

Sigh.  Broken gnus.

-- 
~jrm
From: Robert Uhl
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3r7dsg7v1.fsf@4dv.net>
Joe Marshall <·············@comcast.net> writes:
>
> >> It sure as hell isn't in iso-8859-1
> >
> > No. It was: Content-Type: text/plain; charset=utf-8
> 
> Sigh.  Broken gnus.

Worked for me, and I'm a gnus user...

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
The aggressor is a man of peace.  He wants nothing more than to march
into a neighbouring country unresisted.                  --Clausewitz
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <868y013uiy.fsf@drjekyll.mkbuelow.net>
Joe Marshall <·············@comcast.net> writes:

>> No. It was: Content-Type: text/plain; charset=utf-8
>Sigh.  Broken gnus.

Might be "broken" xemacs. Didn't render properly here either.
Imho it's a bit early yet to assume that utf-8 is generally acceptable
on the 'net (Usenet and email, that is). Many people are using
"traditional" mail+newsreaders which for some time to come won't be
able to properly display Unicode.

mkb.
From: Edi Weitz
Subject: Re: Beyond CL?
Date: 
Message-ID: <uk6jl6n35.fsf@agharta.de>
On Thu, 21 Jul 2005 01:52:21 +0200, Matthias Buelow <···@incubus.de> wrote:

> Joe Marshall <·············@comcast.net> writes:
>
>>Sigh.  Broken gnus.
>
> Might be "broken" xemacs. Didn't render properly here either.

Rendered fine for me - GNU Emacs 22 from CVS, Windows XP.

Cheers,
Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86vf352f86.fsf@drjekyll.mkbuelow.net>
Edi Weitz <········@agharta.de> writes:

>Rendered fine for me - GNU Emacs 22 from CVS, Windows XP.

Might be MULE.. my xemacs here is a non-mule build.
Or it might just be that xemacs doesn't have utf8-support yet.

mkb.
From: André Thieme
Subject: Tramp on Windows (was: Re: Beyond CL?)
Date: 
Message-ID: <dbn3b9$m8s$1@ulric.tng.de>
Edi Weitz schrieb:
> On Thu, 21 Jul 2005 01:52:21 +0200, Matthias Buelow <···@incubus.de> wrote:
> 
> 
>>Joe Marshall <·············@comcast.net> writes:
>>
>>
>>>Sigh.  Broken gnus.
>>
>>Might be "broken" xemacs. Didn't render properly here either.
> 
> 
> Rendered fine for me - GNU Emacs 22 from CVS, Windows XP.

Btw Edi, do you know how to get Tramp working under Windows without cygwin?


André
-- 
From: Edi Weitz
Subject: Re: Tramp on Windows
Date: 
Message-ID: <uoe8wmuc3.fsf@agharta.de>
On Thu, 21 Jul 2005 05:07:24 +0200, André Thieme <······························@justmail.de> wrote:

> Btw Edi, do you know how to get Tramp working under Windows without
> cygwin?

Never tried it.  I notice that the Tramp manual mentions PuTTY,
though.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Tim X
Subject: Re: Tramp on Windows
Date: 
Message-ID: <87ackfl643.fsf@tiger.rapttech.com.au>
Edi Weitz <········@agharta.de> writes:

> On Thu, 21 Jul 2005 05:07:24 +0200, André Thieme <······························@justmail.de> wrote:
> 
> > Btw Edi, do you know how to get Tramp working under Windows without
> > cygwin?
> 
> Never tried it.  I notice that the Tramp manual mentions PuTTY,
> though.
> 

Tramp works really well, but you do need an ssh/scp client to get the
most out of it (there are other 'methods' available, but ssh/scp is
the easiest to configure and the one I've found to be the most reliable).

You will need to setup your ssh keys and then setup an ssh agent to
seamlessly handle your ssh keys. What you need to do is get things
working so that when you try to connect to the remote host from
outside emacs, you do something like ssh hostname and don't need to
enter any password or passphrase. I'm not familiar with how this works
with windows ssh clients, but I'm sure you can set things up to work
in a similar way. 

Once you have that working, you can then see about getting tramp to use
this method. One tricky thing can be getting the 'pathname' correct
(nothing to do with lisp pathnames). The format is determined by some
regular expressions in the tramp customization variables. The pathname
pattern is used to distinguish local paths/files from remote
ones. I've had problems in the past trying to use the same pattern as
ange-ftp uses. This problem stems from confusion between ange-ftp and
tramp - it seems that sometimes if tramp is loaded first and then
something loads ange-ftp, when you try to access a remote file, emacs
tries to use ange-ftp instead of tramp. There are a few solutions I've
seen for this, but I've been lazy and just use a different path
descriptor for tramp than ange-ftp. i.e. ange-ftp uses something like 

·········@some.host:/path/to/file

In tramp I use

/[host]:/path/to/file 

or 

/[····@host]:/path/to/file

if the remote username is different from my local username. 

HTH

Tim

-- 
Tim Cross
The e-mail address on this message is FALSE (obviously!). My real e-mail is
to a company in Australia called rapttech and my login is tcross - if you 
really need to send mail, you should be able to work it out!
From: Andrew Raines
Subject: Re: Tramp on Windows (was: Re: Beyond CL?)
Date: 
Message-ID: <uvf34du8e.fsf@raines.ws>
André Thieme <······························@justmail.de> writes:

> Btw Edi, do you know how to get Tramp working
> under Windows without cygwin?

PuTTY makes life on win32 slightly bearable.  If you
understand its components, you can mimic
ssh-agent/ssh on *nix.

Install the whole PuTTY bundle[1] so you get
pageant.exe, plink.exe, and puttygen.exe.  I copy
the executables to c:\WINDOWS because it's already
in exec-path in Emacs.

TRAMP should now try to use plink to connect to the
remote host.  Run Pageant when your machine boots to
store your private key and you won't have to type a
password.  Make sure you use puttygen.exe to create
a key or Pageant won't work.  Put the public half on
your destination host in ~/.ssh/authorized_keys.

Footnotes: 
[1]  http://the.earth.li/~sgtatham/putty/latest/x86/putty-0.58-installer.exe

-- 
    ··@raines.ws (Andrew A. Raines)
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k1aqjFs7kl3U1@individual.net>
Pascal Costanza wrote:
> Tron3k wrote:
> 
>> Clever. Unfortunately it doesn't work. For example:
>>
>> ('(1 2 3) 2) must evaluate to 3.
> 
> 
> What does this buy you? Why don't you just write 3?
> 
> At least one of the two elements should be a variable if this should 
> make sense. If the list a , my macro does the job. If the index is a 
> variable, why don't you just use nth or elt?

"If the list is a variable, my macro..."


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u1x5wjhm5.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes:

> Tron3k wrote:
> 
> > Clever. Unfortunately it doesn't work. For example:
> > ('(1 2 3) 2) must evaluate to 3.
> 
> What does this buy you? Why don't you just write 3?
>
> At least one of the two elements should be a variable if this should
> make sense. If the list a , my macro does the job. If the index is a
> variable, why don't you just use nth or elt?

I agree with Pascal here but would put the question a different way:
"Why don't you want to be able to choose NTH or ELT or any of various
other things explicitly?"  In general, there may be multiple ways to
access or traverse a structure, including not only algorithms for 
getting the answer but error-reporting behavior both in terms of 
restricting domain and enforcing range. e.g.,
 (NTH 3 '(A B C))
yields NIL while
 (ELT '(A B C) 3)
signals an error.  I find these clearer than
 ('(A B C) 3)
which appears to give linguistic preference to something that by rights
should be a user/program choice.

But there's a more subtle issue as well:

NOTE WELL: This discussion presumes a Lisp1, which CL is not.  CL does not
 evaluate VARIABLES nor EXPRESSIONS (other than lambda expressions)
 in the car of a list. Discussion below is necessarily hypothetical
 about a world that did do these things.

If you could do (setq my-structure '(a b c)) and then
do (my-structure 2) => C, the problems are several.  Note only does it
appear to give linguistic preference to one of many operators, as
mentioned above, but it also means that you get "other things" falling
through because many things that ought to be caught as errors will fall
through.  This includes not only (setq my-structure "abc") which maybe
some PEOPLE would be happy about but some APPLICATIONS would not be,
but also, more insidiously, people who do (setq id '(lambda (x) x))
and then (numberp (id 2)) => NIL, because (id 2) silently indexes the
quoted list and returns the symbol X instead of signaling an error.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121700006.763764.218130@g49g2000cwa.googlegroups.com>
Kent M Pitman wrote:
> Pascal Costanza <··@p-cos.net> writes:
>
> > Tron3k wrote:
> >
> > > Clever. Unfortunately it doesn't work. For example:
> > > ('(1 2 3) 2) must evaluate to 3.
> >
> > What does this buy you? Why don't you just write 3?
> >
> > At least one of the two elements should be a variable if this should
> > make sense. If the list a , my macro does the job. If the index is a
> > variable, why don't you just use nth or elt?
>
> I agree with Pascal here but would put the question a different way:
> "Why don't you want to be able to choose NTH or ELT or any of various
> other things explicitly?"  In general, there may be multiple ways to
> access or traverse a structure, including not only algorithms for
> getting the answer but error-reporting behavior both in terms of
> restricting domain and enforcing range. e.g.,
>  (NTH 3 '(A B C))
> yields NIL while
>  (ELT '(A B C) 3)
> signals an error.  I find these clearer than
>  ('(A B C) 3)
> which appears to give linguistic preference to something that by rights
> should be a user/program choice.

Ok, so the thing with my language is that everything is thought of as a
list, but implementing one REALLY as an array is a decision made by the
compiler under declaration-style advice given by the user. So there do
not exist two different accessors nth and elt. There's really only one
thing you can do. And I will definitely throw an error for all
out-of-range accesses. Is this NIL return for NTH particularly useful?
I haven't ever used it. Also, note that nothing prevents you from
creating your own accessors.

There is something important about my Lisp. All the special features
are adamantly OPTIONAL. You can always do something the Regular Lisp
Way if you want. I suppose you might even be able to disable the
magical features if you desire - in fact, since everything ought to be
customizable, you will almost certainly be able to do so.

> But there's a more subtle issue as well:
>
> NOTE WELL: This discussion presumes a Lisp1, which CL is not.  CL does not
>  evaluate VARIABLES nor EXPRESSIONS (other than lambda expressions)
>  in the car of a list. Discussion below is necessarily hypothetical
>  about a world that did do these things.
>
> If you could do (setq my-structure '(a b c)) and then
> do (my-structure 2) => C, the problems are several.  Note only does it
> appear to give linguistic preference to one of many operators, as
> mentioned above, but it also means that you get "other things" falling
> through because many things that ought to be caught as errors will fall
> through.  This includes not only (setq my-structure "abc") which maybe
> some PEOPLE would be happy about but some APPLICATIONS would not be,
> but also, more insidiously, people who do (setq id '(lambda (x) x))
> and then (numberp (id 2)) => NIL

This is an important issue that I have been quite aware of! Compressing
a language necessarily means that there exist more correct programs!
But I am prepared to accept this. :-) I don't expect my language to be
used in nuclear reactors, only in games for Win32. And maybe I'll end
up the only person using it, who knows. ;-)

[Hmm, I can't see anyone accidentally quoting a lambda. We would use
square brackets for lambdas anyway: [x|x]. But I see your point, of
course.]

Then again, people have told me that C++'s static type checking is good
because it catches bugs very early. The tables have certainly turned ;-)
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <1x5wnkdm.fsf@ccs.neu.edu>
"Tron3k" <······@gmail.com> writes:

> This is an important issue that I have been quite aware of! Compressing
> a language necessarily means that there exist more correct programs!

Nonsense.
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <d0wKe.8549$p%3.36010@typhoon.sonic.net>
Joe Marshall wrote:
> "Tron3k" <······@gmail.com> writes:
> 
> 
>>This is an important issue that I have been quite aware of! Compressing
>>a language necessarily means that there exist more correct programs!
> 
> 
> Nonsense.

I offer as an example TECO.  There are some versions of TECO where
EVERY ascii sequence is a valid program (although most are not very
useful ones) and a traditional rite of passage is trying to figure
out the semantics of your own name.

				Bear
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <umznpb3xc.fsf@nhplace.com>
Ray Dillinger <····@sonic.net> writes:

> Joe Marshall wrote:
> > "Tron3k" <······@gmail.com> writes:
> >
> >>This is an important issue that I have been quite aware of! Compressing
> >>a language necessarily means that there exist more correct programs!
> > Nonsense.
> 
> I offer as an example TECO.  There are some versions of TECO where
> EVERY ascii sequence is a valid program

No, where every ASCII character is a command.  That doesn't mean that
every sequence is a valid program, any more than the fact that every
machine word is a valid operation on the machine means that every
sequence of instructions is a valid machine language program.  In ITS
TECO, which has the property
you describe, it was relatively easy to stop the interpreter because
different commands take and return different numbers of args, and you could
often get wrong-type-arg commands. Also, a number of commands read characters
after them that must be in a well-defined sequence. e.g., q-reg specs require
a fairly carefully controlled set of things after, so any of the i, q, 
control-] and other commands would stop if you screwed up there.  And then
there are commands that pop the stack that stop a program if you overpop,
commands that require certain kinds of data integrity, etc.

> (although most are not very
> useful ones) and a traditional rite of passage is trying to figure
> out the semantics of your own name.

IMO, this is apocryphal.  I can imagine someone trying to before they
understood the language better, but it's a bit like trying to figure out
what your name and address will compute in assembly code.  Odds are, it
will just segfault.

I do think that there's a concept I've called "sparseness"/"denseness" in
a language definition, which has to do with how far away the valid
operations are from one another... You're describing TECO in my terminology
as ultradense, but I don't think it is. I think there are few languages 
that are as dense as you say.

A dialect of Lisp called MuLisp used to have only four kinds of errors, one
of which was out of memory, and I can't recall the other 3.  But I do recall
a lot of things evaluated to NIL, such that you could do something vaguely
of the form:
 (defun add (x y)
   (or (+ x y) (list '+ x y)))
to implement a crude kind of symbolic math such that
 (+ 3 4) => 7
 (+ 'a 4) => (+ a 4)
I suppose you could claim that such a language was quite dense.
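
In CL, by contrast, you only get that kind of looseness by asking for
it explicitly.  A quick sketch of the same trick:

 (defun add (x y)
   (handler-case (+ x y)              ; (+ 'A 4) signals a TYPE-ERROR here,
     (type-error () (list '+ x y))))  ; which we deliberately turn into data

So the density is something you opt into locally, not a property of
the whole language.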

But if that's what you mean by compressing, then it's trivially true and
also uninteresting.  That is, if you mean that compressing a language is
defined as "making it more dense", then it's hardly suprising that 
compressing a language leads to a denser language since that's definitional.
I don't think that denseness is a property of text, it's a property of how
semantics is determined and how invalid programs are implicitly separated
from valid ones according to the rules of semantics.

On the other hand, if what you mean to be saying there are more of is
"countable programs", that is, that the number of goedel numbers
assigned to programs in your dense language is larger than the number
of goedel numbers assigned to programs in a more sparse language, then
I think you're barking up the wrong tree.  As long as there are
countably many programs, and you don't manage to perform some sort of
diagonalization proof that distinguishes one of these languages from
another, I think you're going to find they have pretty much the same
number of programs ultimately accessible to them...

You might also be confused by thinking that if the language is compressed
it can get more programs into a finite address space, but a finite address
space is reduced by the size of its interpreter.  And any language that
doesn't have an interpreter starts with enough space that you can write
an interpreter as a user program and then load it with data that would have
been your program, so you end up yet again not managing to defeat the 
ever-applicable third law of thermodynamics ("you can't get out of the game").
From: Adrian Kubala
Subject: Re: Beyond CL?
Date: 
Message-ID: <slrndfn1f5.4l6.adrian-news@sixfingeredman.net>
Kent M Pitman <······@nhplace.com> schrieb:
> I do think that there's a concept I've called "sparseness"/"denseness" in
> a language definition, which has to do with whether how far away the valid
> operations are from one another...

There's another related concept loose/strict, which I think maybe the OP
was talking about. For example: pointer arithmetic is looser, GC is
stricter. Dynamic typing is looser, static typing is stricter. NaN is
loose, throwing an exception is strict. I want to say that loose means
more valid programs, but as you pointed out Turing equivalence rules
that out...

Loose programs seem necessarily dense because there are fewer invalid
programs to space out the valid ones. On the other hand, a strict
language can be dense too, which is what TECO sounds like from your
description -- the semantics are strict but two valid, completely
different programs could still differ in only one or two characters.

> I think you're going to find they have pretty much the same number of
> programs ultimately accessible to them...

But maybe we could say that loose languages have more valid programs of
a given size?
From: Bruce Hoult
Subject: Re: Beyond CL?
Date: 
Message-ID: <bruce-5D640F.13161311082005@news.clear.net.nz>
In article <····················@typhoon.sonic.net>,
 Ray Dillinger <····@sonic.net> wrote:

> Joe Marshall wrote:
> > "Tron3k" <······@gmail.com> writes:
> > 
> > 
> >>This is an important issue that I have been quite aware of! Compressing
> >>a language necessarily means that there exist more correct programs!
> > 
> > 
> > Nonsense.
> 
> I offer as an example TECO.  There are some versions of TECO where
> EVERY ascii sequence is a valid program (although most are not very
> useful ones) and a traditional rite of passage is trying to figure
> out the semantics of your own name.

I thought there was nothing worse than TECO.  But then I had to use 
SPEED (on AOS/VS).

TECO with arbitrary limitations!!!!

-- 
Bruce |  41.1670S | \  spoken |          -+-
Hoult | 174.8263E | /\ here.  | ----------O----------
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uirydb3pw.fsf@nhplace.com>
Bruce Hoult <·····@hoult.org> writes:

> I thought there was nothing worse than TECO.
                              ^^^^^
               typo.  I think you meant "better" here. ;)

Ok, ok, even I can be a troll once in a while.
But TECO was a lovely language with a really special charm
and quite a bit of power...

And lest you think this fact has nothing to do with Lisp:

I wrote a tiny Lisp compiler in ITS Teco that would compile (into TECO,
of course) my Zmail [Lisp Machine mail reader] init file so that I could
share its filters in my PDP10-based ZBABYL mail reader [BABYL + my Zmail
extensions] directly without editing. ;)

That stuff is still available on those PDP10 emulators that are 
running around.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k202eFscjqeU2@individual.net>
Tron3k wrote:
> [Hmm, I can't see anyone accidentally quoting a lambda. We would use
> square brackets for lambdas anyway: [x|x]. But I see your point, of
> course.]

But why a custom syntax just for anonymous functions?  Why not [] 
for array subscripts as well, or for lists?

What's wrong with just keeping the good Lisp syntax, and writing
(fn (x) x) ? (this one like Arc)
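
Even the bracket syntax doesn't need a new language; a reader macro in
plain CL already gets you most of the way there.  A rough, untested
sketch (I spell the separator -> rather than |, because | is already
the standard reader's multiple-escape character):

 (defun read-bracket-lambda (stream char)
   (declare (ignore char))
   (let* ((forms (read-delimited-list #\] stream t))
          (arrow (position '-> forms)))
     `(lambda ,(subseq forms 0 arrow) ,@(subseq forms (1+ arrow)))))

 (set-macro-character #\[ #'read-bracket-lambda)
 (set-macro-character #\] (get-macro-character #\) nil))

 ;; ([x -> (* x x)] 3) => 9

The whole "feature" is a dozen lines in the language you already have.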

> Then again, people have told me that C++'s static type checking is good
> because it catches bugs very early. The tables have certainly turned ;-)

In a way, yes.  But C++ is a pain to write, anyway.  Look at SML 
if you want a nicer type system (but still with enough 
restrictions that dynamic typing might be easier).

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbk21a$2sb$2@ulric.tng.de>
Ulrich Hobelmann schrieb:

> In a way, yes.  But C++ is a pain to write, anyway.  Look at SML if you 
> want a nicer type system (but still with enough restrictions that 
> dynamic typing might be easier).

What restrictions are you thinking about?


André
-- 
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k6g6mFsqt3uU1@individual.net>
André Thieme wrote:
> Ulrich Hobelmann schrieb:
> 
>> In a way, yes.  But C++ is a pain to write, anyway.  Look at SML if 
>> you want a nicer type system (but still with enough restrictions that 
>> dynamic typing might be easier).
> 
> What restrictions are you thinking about?

I would like something like Haskell type classes, or the ability 
to compare tags (instead of whole objects), meaning if "datatype 
foo = bar | baz of int", then it would be nice to group foo:s into 
lists by checking tag equality (like "(eq blub (car foo))" or 
something like that in Lisp).  I can't really remember where I 
needed that, but I ended up writing pattern-matching code for the 
whole comparison, where one EQ line would have been enough. 
Equality types could be better, too.

In Lisp you can just write whatever you want, and that's nice.

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <ALFCe.5150$Y54.3917@twister.nyc.rr.com>
Tron3k wrote:

> I decided a while ago my Lisp will be case-sensitive. It will be
> created with Win32 game development in mind, and I don't want people
> calling Win32 or OpenGL functions in all small case,..

OK, you get one chance to explain the, um, ignorance of that remark. I 
would give you zero, but you are such a genius that you have in two days 
created a language better than Lisp. (URL, plz.)

Now for the bad news: the only folks sucked into your thread are the 
same ones who fall for every troll, so we have you down for a C- at this 
point. Keep them in the air for another week and we'll make it a B.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121695922.616644.108260@g14g2000cwa.googlegroups.com>
Kenny Tilton wrote:
> Tron3k wrote:
>
> > I decided a while ago my Lisp will be case-sensitive. It will be
> > created with Win32 game development in mind, and I don't want people
> > calling Win32 or OpenGL functions in all small case,..
>
> OK, you get one chance to explain the, um, ignorance of that remark. I
> would give you zero, but you are such a genius that you have in two days
> created a language better than Lisp. (URL, plz.)
>
> Now for the bad news: the only folks sucked into your thread are the
> same ones who fall for every troll, so we have you down for a C- at this
> point. Keep them in the air for another week and we'll make it a B.

I am not a troll! I'm here because I love Lisp and I *actually am*
making a new one! What's the problem with everyone here? This is not
what I expected in comp.lang.lisp. I thought Lispy people *loved*
playing around with new ideas for Lisps. Personally I find it *so* fun,
I do it all the time. On the train I carry around a clipboard to jot
down new Lisp ideas *every single day*. What is going on here?

And you tell me what looks better:

···@createprocessasuser
···@CreateProcessAsUser

And before you ask, I am not going to change it to
create-process-as-user. I have my reasons.
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <oe90m4cb.fsf@ccs.neu.edu>
"Tron3k" <······@gmail.com> writes:

> I thought Lispy people *loved* playing around with new ideas for
> Lisps.

We do.

> On the train I carry around a clipboard to jot down new Lisp ideas
> *every single day*.

They may be new to you, but what makes you think they are new to us?

> What is going on here?

I suggest you go back through the comp.lang.lisp archives.

> And you tell me what looks better:
>
> ···@createprocessasuser
> ···@CreateProcessAsUser
>
> And before you ask, I am not going to change it to
> create-process-as-user. I have my reasons.

You tell me what looks better:

four score and seven years ago our fathers brought forth on this
continent a new nation, conceived in liberty and dedicated to the
proposition that all men are created equal.

FouR scoRe and sEvEn yeaRs aGo our fAthErs brOuGht fOrth on This
conTinenT A new nAtIon, coNceiVed in liberty and dediCated tO the
ProPosiTion thAt All meN aRe cReated equaL.
From: Bruce Stephens
Subject: Re: Beyond CL?
Date: 
Message-ID: <87slybxcqu.fsf@cenderis.demon.co.uk>
Joe Marshall <···@ccs.neu.edu> writes:

> "Tron3k" <······@gmail.com> writes:

[...]

>> And you tell me what looks better:
>>
>> ···@createprocessasuser
>> ···@CreateProcessAsUser
>>
>> And before you ask, I am not going to change it to
>> create-process-as-user. I have my reasons.
>
> You tell me what looks better:
>
> four score and seven years ago our fathers brought forth on this
> continent a new nation, conceived in liberty and dedicated to the
> proposition that all men are created equal.
>
> FouR scoRe and sEvEn yeaRs aGo our fAthErs brOuGht fOrth on This
> conTinenT A new nAtIon, coNceiVed in liberty and dediCated tO the
> ProPosiTion thAt All meN aRe cReated equaL.

The first one, of course.  And I find the first one much easier to
read than

FOUR SCORE AND SEVEN YEARS AGO OUR FATHERS BROUGHT FORTH ON THIS
CONTINENT A NEW NATION, CONCEIVED IN LIBERTY AND DEDICATED TO THE
PROPOSITION THAT ALL MEN ARE CREATED EQUAL.

which is the kind of thing that Common Lisp seems to produce by
default.  I agree that SillyUseOfCaps doesn't really help; fortunately
Emacs has glasses-mode which helps a little when it's inflicted on
you, but I don't think that's an acceptable reason for designing it
into a language.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k46egFsbgkoU1@individual.net>
Bruce Stephens wrote:
> Joe Marshall <···@ccs.neu.edu> writes:
> 
>>You tell me what looks better:
>>
>>four score and seven years ago our fathers brought forth on this
>>continent a new nation, conceived in liberty and dedicated to the
>>proposition that all men are created equal.
>>
>>FouR scoRe and sEvEn yeaRs aGo our fAthErs brOuGht fOrth on This
>>conTinenT A new nAtIon, coNceiVed in liberty and dediCated tO the
>>ProPosiTion thAt All meN aRe cReated equaL.
> 
> The first one, of course.  And I find the first one much easier to
> read than
> 
> FOUR SCORE AND SEVEN YEARS AGO OUR FATHERS BROUGHT FORTH ON THIS
> CONTINENT A NEW NATION, CONCEIVED IN LIBERTY AND DEDICATED TO THE
> PROPOSITION THAT ALL MEN ARE CREATED EQUAL.
> 
> which is the kind of thing that Common Lisp seems to produce by
> default.

But it's not what we type in, so most of the time we see all lower case. 
Sometimes, it can be helpful that Common Lisp responds in all upper
case because it makes it easier to distinguish between what you typed in 
and what Common Lisp responded. But that's probably only a very small 
advantage.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Bruce Stephens
Subject: Re: Beyond CL?
Date: 
Message-ID: <87slya2yyq.fsf@cenderis.demon.co.uk>
Pascal Costanza <··@p-cos.net> writes:

> Bruce Stephens wrote:

[...]

>> FOUR SCORE AND SEVEN YEARS AGO OUR FATHERS BROUGHT FORTH ON THIS
>> CONTINENT A NEW NATION, CONCEIVED IN LIBERTY AND DEDICATED TO THE
>> PROPOSITION THAT ALL MEN ARE CREATED EQUAL.
>> which is the kind of thing that Common Lisp seems to produce by
>> default.
>
> But it's not what we type in, so most of the time we see all lower
> case. Sometimes, it can be helpful that Common Lisp responses in all
> upper case because it makes it easier to distinguish between what
> you typed in and what Common Lisp responded. But that's probably
> only a very small advantage.

Yeah, I'd have thought that really would be a small advantage.  Don't
Common Lisp programmers see sexprs from implementations quite a bit?
(I don't program much in CL; the most lisp I do is for hacking around
with Emacs, and that's case-sensitive, of course.)  

In that movie that was going around last week the guy did things like
macroexpand to check that a macro looked right, and that kind of
thing.  I'd assume that it would be much more common for users to see
output from Lisps than it is for (say) C++.  Not that ALL UPPER CASE
is any worse than many C++ ambiguous STL type errors that C++
programmers are liable to have inflicted on them.

I still find the upper case default to be awkward to read (more so
than I imagine I'd find all lower case).  Do people get used to it, or
do they (as I do) have (setf (readtable-case *readtable*) :invert) in
their initialisation files?

For what it's worth, I find the case much more offputting than all the
parentheses.  So much so that I think if I were preparing Lisp in a
Box I'd configure the defaults to show lower case (by setting
readtable as above, presuming I've got it right), and to hell with all
the textbooks and things that show the upper case default.  Does Lisp
in a Box in fact do that?

[...]
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ek9ub677.fsf@thalassa.informatimago.com>
Bruce Stephens <············@cenderis.demon.co.uk> writes:
> Yeah, I'd have thought that really would be a small advantage.  Don't
> Common Lisp programmers see sexprs from implementations quite a bit?
> (I don't program much in CL; the most lisp I do is for hacking around
> with Emacs, and that's case-sensitive, of course.)  
>
> In that movie that was going around last week the guy did things like
> macroexpand to check that a macro looked right, and that kind of
> thing.  I'd assume that it would be much more common for users to see
> output from Lisps than it is for (say) C++.  Not that ALL UPPER CASE
> is any worse than many C++ ambiguous STL type errors that C++
> programmers are liable to have inflicted on them.

No infliction: their choice!


> I still find the upper case default to be awkward to read (more so
> than I imagine I'd find all lower case).  Do people get used to it, or
> do they (as I do) have (setf (readtable-case *readtable*) :invert) in
> their initialisation files?

Instead of inverting, which is dangerous because you can get source
files with random capitalizations, you could use *print-case*:

[34]> (setf *print-case* :downcase)
:downcase
[35]> '(hello world)
(hello world)
[36]> (setf *print-case* :capitalize)
:Capitalize
[37]> '(hello world)
(Hello World)
[38]> (setf *print-case* :upcase)
:UPCASE
[39]> '(hello world)
(HELLO WORLD)


> For what it's worth, I find the case much more offputting than all the
> parentheses.  So much so that I think if I were preparing Lisp in a
> Box I'd configure the defaults to show lower case (by setting
> readtable as above, presuming I've got it right), and to hell with all
> the textbooks and things that show the upper case default.  Does Lisp
> in a Box in fact do that?

Better to keep the readtable-case to :upcase, to avoid any problem to
newbies trying to load lisp sources from case-careless programmers.
Use: (setf *print-case* :downcase)


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k74rsFt0kppU2@individual.net>
Bruce Stephens wrote:

> I still find the upper case default to be awkward to read (more so
> than I imagine I'd find all lower case).  Do people get used to it, or
> do they (as I do) have (setf (readtable-case *readtable*) :invert) in
> their initialisation files?

I haven't actually cared about this. People are able to adapt to all 
kinds of things, I don't know why this should be an issue at all. Syntax 
is boring.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Bruce Stephens
Subject: Re: Beyond CL?
Date: 
Message-ID: <87k6jla612.fsf@cenderis.demon.co.uk>
Pascal Costanza <··@p-cos.net> writes:

[...]

> I haven't actually cared about this [case]. People are able to adapt
> to all kinds of things, I don't know why this should be an issue at
> all. Syntax is boring.

I find all upper case a bit distracting.

I assumed that it was known that all upper case was worse for reading
than all lower case, but a quick google search suggests things are
more subtle than that.  In particular, it appears
<http://www.psych.utoronto.ca/~muter/pmuter1.htm> that all upper case
is better for searching, but all lower case is better for reading
continuous text.  Reading output from programming systems is
presumably closer to "searching".  

Familiarity presumably has a large impact, which would explain why
nobody worries after a while.  On the other hand, that suggests that
if you're preparing PCL or Lisp in a Box (presumably intended for
those not yet scarred by (or "accustomed to") seeing lots of upper
case) then changing the default strikes me as worthwhile.  (Maybe
that's just me, though; mostly people tend to complain about the
parentheses rather than case.)

[...]
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86oe8x43bp.fsf@drjekyll.mkbuelow.net>
Joe Marshall <···@ccs.neu.edu> writes:

>You tell me what looks better:
>
>four score and seven years ago our fathers brought forth on this
>continent a new nation, conceived in liberty and dedicated to the
>proposition that all men are created equal.
>
>FouR scoRe and sEvEn yeaRs aGo our fAthErs brOuGht fOrth on This
>conTinenT A new nAtIon, coNceiVed in liberty and dediCated tO the
>ProPosiTion thAt All meN aRe cReated equaL.

You have to go with the times...

4 Sk0r3Z & 7 y3arZ ag0 0uR f4tH0rZ br!ngz0rd 4tH 0n tH!Zz
k0nt1n3nT a n00b n4t10n, k0nC3iV30r3d iN l!b3rTy & d3d!k4t0rd 2 d4
pr0p0siT10n tH4t 4lL l337 kReW R kR3at0rd eQuaL!!!!11111

mkb.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <tGPCe.1554$Ow4.741072@twister.nyc.rr.com>
Tron3k wrote:
> Kenny Tilton wrote:
> 
>>Tron3k wrote:
>>
>>
>>>I decided a while ago my Lisp will be case-sensitive. It will be
>>>created with Win32 game development in mind, and I don't want people
>>>calling Win32 or OpenGL functions in all small case,..
>>
>>OK, you get one chance to explain the, um, ignorance of that remark. 

...snip...

> And you tell me what looks better:
> 
> ···@createprocessasuser
> ···@CreateProcessAsUser

Bzzzt! The answer was "no one calls anything in all small case". The 
code can be typed in any case you like, unless of course you have opted 
for a case-sensitive Lisp.
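
With the standard readtable that is trivial to check:

  (eq 'createprocessasuser 'CreateProcessAsUser)  ; => T, the reader upcases both
  (symbol-name 'CreateProcessAsUser)              ; => "CREATEPROCESSASUSER"

Only a case-sensitive readtable (or |escaped| symbol names) makes the
difference in spelling mean anything.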

Don't get me wrong. You are in good company. It is hard to find a 
Lispnik who is not working on their own Lisp.

I thought I heard you say something about Common Lisp needing fixing, 
which is what had me howling.

Instead I gather you are so taken with yourself that (a) you think you 
can do better than fifty years of development by the people who created 
the language you say you like and (b) you want the world to know all 
about it.

That's fine, long tradition of that, many a bozo has preceded you, carry on.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121700518.428113.283440@g49g2000cwa.googlegroups.com>
Kenny Tilton wrote:
> Tron3k wrote:
> > Kenny Tilton wrote:
> >
> >>Tron3k wrote:
> >>
> >>
> >>>I decided a while ago my Lisp will be case-sensitive. It will be
> >>>created with Win32 game development in mind, and I don't want people
> >>>calling Win32 or OpenGL functions in all small case,..
> >>
> >>OK, you get one chance to explain the, um, ignorance of that remark.
>
> ...snip...
>
> > And you tell me what looks better:
> >
> > ···@createprocessasuser
> > ···@CreateProcessAsUser
>
> Bzzzt! The answer was "no one calls anything in all small case". The
> code can be typed in any case you like, unless of course you have opted
> for a case-sensitive Lisp.

I don't want people to type things in any case they like. That's the
point.

> Don't get me wrong. You are in good company. It is hard to find a
> Lispnik who is not working on their own Lisp.

You're wrong. Almost everybody I've met here is not working on their
own Lisp - you can tell from their reactions to things.

> I thought I heard you say something about Common Lisp needing fixing,
> which is what had me howling.
>
> Instead I gather you are so taken with yourself that (a) you think you
> can do better than fifty years of development by the people who created
> the language you say you like and (b) you want the world to know all
> about it.

That's true. I believe I can do better than them. Why?

There is a time in the future when people look back at Common Lisp and
*laugh* - just like we Lispers laugh at the weaknesses of Fortran,
Basic, C, etc. I want to find out why they're laughing.

> That's fine, long tradition of that, many a bozo has preceded you, carry on.

I'll remember you said this.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <TRQCe.1559$Ow4.743299@twister.nyc.rr.com>
Tron3k wrote:
> Kenny Tilton wrote:
> 
>>Tron3k wrote:
>>
>>>Kenny Tilton wrote:
>>>
>>>
>>>>Tron3k wrote:
>>>>
>>>>
>>>>
>>>>>I decided a while ago my Lisp will be case-sensitive. It will be
>>>>>created with Win32 game development in mind, and I don't want people
>>>>>calling Win32 or OpenGL functions in all small case,..
>>>>
>>>>OK, you get one chance to explain the, um, ignorance of that remark.
>>
>>...snip...
>>
>>
>>>And you tell me what looks better:
>>>
>>>···@createprocessasuser
>>>···@CreateProcessAsUser
>>
>>Bzzzt! The answer was "no one calls anything in all small case". The
>>code can be typed in any case you like, unless of course you have opted
>>for a case-sensitive Lisp.
> 
> 
> I don't want people to type things in any case they like. That's the
> point.

But it is not what you said. You think you said that, which just makes 
my point: you do not know Lisp, yet you have set out to fix it. Shades 
of Ilias, fondly remembered, carry on, we needed a laugh around here.

> 
> 
>>Don't get me wrong. You are in good company. It is hard to find a
>>Lispnik who is not working on their own Lisp.
> 
> 
> You're wrong. Almost everybody I've met here is not working on their
> own Lisp - you can tell from their reactions to things.

Anyone not working on their own Lisp is working on their own version of 
Cells. Hey, make Cells a built-in with optimizations and I will do a 
promo spot for you.

> 
> 
>>I thought I heard you say something about Common Lisp needing fixing,
>>which is what had me howling.
>>
>>Instead I gather you are so taken with yourself that (a) you think you
>>can do better than fifty years of development by the people who created
>>the language you say you like and (b) you want the world to know all
>>about it.
> 
> 
> That's true. I believe I can do better than them. Why?
> 
> There is a time in the future when people look back at Common Lisp and
> *laugh* - just like we Lispers laugh at the weaknesses of Fortran,
> Basic, C, etc. I want to find out why they're laughing.
> 
> 
>>That's fine, long tradition of that, many a bozo has preceded you, carry on.
> 
> 
> I'll remember you said this.
> 

The good news is that we will not notice when you do /not/ deliver a new 
Lisp. Unless you are Tron enough to swing by and admit your folly.

Anyway, have fun. Do you have a name for your Lisp? That is more 
important than the syntax or semantics.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Steven E. Harris
Subject: Re: Beyond CL?
Date: 
Message-ID: <q94fyubevxc.fsf@xenon.gnostech.com>
Kenny Tilton <·······@nyc.rr.com> writes:

> Anyone not working on their own Lisp is working on their own version
> of Cells.

That certainly applies to me. But, lest you fear, I wasn't trying to
improve Cells, only understand how it works.

-- 
Steven E. Harris
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <874qaqv176.fsf@sidious.geddis.org>
"Tron3k" <······@gmail.com> wrote on 18 Jul 2005 08:2:
> There is a time in the future when people look back at Common Lisp and
> *laugh*

I agree with you.  Programming at this level of abstraction is far too much
work.  Surely programming in the future will be much more like on Star Trek,
where you basically have a conversation with the computer.  And it asks
clarifying questions if your request was ambiguous.

In some sense, the interface to a future computer should probably be something
like the current interface between a tech manager and a superstar programmer.
Think of the conversations that the manager has with the programmer about some
new project; that's ideally the kind of interaction we all ought to have when
programming our computers directly.

Sadly, that's a long way off.

Meanwhile, all of your suggestions so far seem to make no progress in that
direction.  Moreover, most of them don't even seem like an improvement over
Common Lisp, which we already had ten years ago.  Your designs, while
different, seem to be going backwards in quality from the current state of
the art.

> - just like we Lispers laugh at the weaknesses of Fortran,
> Basic, C, etc. I want to find out why they're laughing.

Not because of age.  Keep in mind that Lisp began around the same time as
Fortran, and much of the core remains from those days.  Basic and C were
designed much more recently than (older versions of) Lisp, and on their first
day were already inferior to the Lisp of the day as a general purpose
programming language.

You seem to be confusing "different" with "better".  NOBODY is claiming that
Common Lisp is the best language that will ever be invented.  Even John
McCarthy (the original inventor of Lisp) describes it more as a "local
optimum".

Still, it is VERY well designed, and you making random changes to the design
(based, it appears, mostly on your "personal preferences") are unlikely to
result in a superior design.  You don't even understand why the current
decisions were made, so you don't appreciate the tradeoffs when you attempt
to make different ones.

Your ideas so far are not new.  They are well known, and have been deliberately
rejected (for good reasons) long ago.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
A democracy cannot exist as a permanent form of government. It can only
exist until a majority of voters discover that they can vote themselves
largess out of the public treasury.
	-- Alexander Tyler, eighteenth-century Scottish historian
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121819829.612984.127430@g49g2000cwa.googlegroups.com>
Don Geddis wrote:
> Meanwhile, all of your suggestions so far seem to make no progress in that
> direction.  Moreover, most of them don't even seem like an improvement over
> Common Lisp, which we already had ten years ago.  Your designs, while
> different, seem to be going backwards in quality from the current state of
> the art.

You said in your second post to be more open-minded. And yet here you
say that Lisp-1 is objectively worse than Lisp-n. Hmm.

I make no claim to be open-minded. I think Lisp-1 is objectively better
because it looks better. That's all I care about at this stage. I'm
only designing this language for myself anyway.

> > - just like we Lispers laugh at the weaknesses of Fortran,
> > Basic, C, etc. I want to find out why they're laughing.
>
> Not because of age.  Keep in mind that Lisp began around the same time as
> Fortran, and much of the core remains from those days.  Basic and C were
> designed much more recently than (older versions of) Lisp, and on their first
> day were already inferior to the Lisp of the day as a general purpose
> programming language.

That's nice...

> You seem to be confusing "different" with "better".

Oops! Got those two mixed up. Thanks a lot.

> NOBODY is claiming that
> Common Lisp is the best language that will ever be invented.

They better not! ;-)

> Still, it is VERY well designed, and you making random changes to the design
> (based, it appears, mostly on your "personal preferences") are unlikely to
> result in a superior design.  You don't even understand why the current
> decisions were made, so you don't appreciate the tradeoffs when you attempt
> to make different ones.

How do you know I don't appreciate the tradeoffs and am *willfully
choosing to make them*? That is in fact what I'm doing. And my goal is
a language which I prefer, yes. I make no apologies for that.

> Your ideas so far are not new.  They are well known, and have been deliberately
> rejected (for good reasons) long ago.

Scheme seems to have picked up one of them...
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k6gq9Fsrq5hU1@individual.net>
Tron3k wrote:
> You said in your second post to be more open-minded. And yet here you
> say that Lisp-1 is objectively worse than Lisp-n. Hmm.

This is a funny point.  Some time ago I was absolutely sure that 
Lisp-1 is the only thing that makes sense.  After a bit of Lisp I got 
used to ugly FUNCALLs and #'(lambda ...) stuff.  Now I think that 
multiple namespaces simply make more sense.  OK, for functions 
that get passed as parameters it doesn't really matter.  Writing 
(map 1+ '(1 2 3)) IS nice, but writing #'1+ in that case isn't too 
bad either.  And for everything else separate namespaces are even 
nicer, IMHO.

Well, I don't want to start a flamewar here; probably there's no 
*objective* superiority, just reasons why there are people on both 
sides of the issue.
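
To make the contrast concrete (the Lisp-1 half is only sketched in
comments, Scheme-style):

  ;; Common Lisp (a Lisp-2): the parameter F and any function named F live
  ;; in different namespaces, hence FUNCALL and #'.
  (defun apply-twice (f x)
    (funcall f (funcall f x)))

  (apply-twice #'1+ 5)                      ; => 7

  ;; In a Lisp-1 the same idea needs neither:
  ;;   (define (apply-twice f x) (f (f x)))
  ;;   (apply-twice (lambda (n) (+ n 1)) 5) ; => 7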

> I make no claim to be open-minded. I think Lisp-1 is objectively better
> because it looks better. That's all I care about at this stage. I'm
> only designing this language for myself anyway.

Honest enough, and not the worst thing to do when designing a 
language, IMHO.  I don't think it will accomplish much, but it's 
certainly fun and you might learn a lot doing it.  I guess in two 
years we'll either see you advertising the cool new Lisp in 
here, or we'll see you as a new Common Lisp convert ;)

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Marcus Breiing
Subject: Re: Beyond CL?
Date: 
Message-ID: <x29lg4ansvxm1@breiing.com>
* Tron3k

> I think Lisp-1 is objectively better because it looks better.

The human aesthetic sense evolved in an environment marked by a
peculiar shortage of programming language design work.

Marcus
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121867630.257830.214320@o13g2000cwo.googlegroups.com>
Marcus Breiing wrote:
> * Tron3k
>
> > I think Lisp-1 is objectively better because it looks better.
>
> The human aesthetic sense evolved in an environment marked by a
> peculiar shortage of programming language design work.

Did you ever stop to think about why aesthetics is so important in
mathematics?

Read this: http://www.paulgraham.com/taste.html

I copied the following from that article:

"...Copernicus' aesthetic objections to [equants] provided one
essential motive for his rejection of the Ptolemaic system...."

- Thomas Kuhn, The Copernican Revolution

"All of us had been trained by Kelly Johnson and believed fanatically
in his insistence that an airplane that looked beautiful would fly the
same way."

- Ben Rich, Skunk Works

"Beauty is the first test: there is no permanent place in this world
for ugly mathematics."

- G. H. Hardy, A Mathematician's Apology
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <ubr4xsev4.fsf@nhplace.com>
"Tron3k" <······@gmail.com> writes:

> Did you ever stop to think about why aesthetics is so important in
> mathematics?

To create a barrier to entry?

(Personally, I think Mathematics' POOR sense of aesthetics in NOTATION
 is the principal reason that more people don't easily learn Mathematics.)
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k744qFt0kbiU1@individual.net>
Tron3k wrote:
> Marcus Breiing wrote:
> 
>>* Tron3k
>>
>>>I think Lisp-1 is objectively better because it looks better.
>>
>>The human aesthetic sense evolved in an environment marked by a
>>peculiar shortage of programming language design work.
> 
> Did you ever stop to think about why aesthetics is so important in
> mathematics?
> 
> Read this: http://www.paulgraham.com/taste.html
> 
> I copied the following from that article:
> 
> "...Copernicus' aesthetic objections to [equants] provided one
> essential motive for his rejection of the Ptolemaic system...."
> 
> - Thomas Kuhn, The Copernican Revolution
> 
> "All of us had been trained by Kelly Johnson and believed fanatically
> in his insistence that an airplane that looked beautiful would fly the
> same way."
> 
> - Ben Rich, Skunk Works
> 
> "Beauty is the first test: there is no permanent place in this world
> for ugly mathematics."
> 
> - G. H. Hardy, A Mathematician's Apology

"In Dead Ringers, the character of Elliot says, 'There should be a 
beauty contest for the inside of the body.' As human beings, our sense 
of what's beautiful or repulsive is only skin deep, even when it comes 
to our own bodies. I'm always interested in taking something which seems 
repulsive at the beginning of the film and making it appear beautiful 
and inevitable by the end. I can't achieve that with every viewer, but 
with some people I succeed." - David Cronenberg


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121869572.653015.193110@g49g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Tron3k wrote:
> > Marcus Breiing wrote:
> >
> >>* Tron3k
> >>
> >>>I think Lisp-1 is objectively better because it looks better.
> >>
> >>The human aesthetic sense evolved in an environment marked by a
> >>peculiar shortage of programming language design work.
> >
> > Did you ever stop to think about why aesthetics is so important in
> > mathematics?
> >
> > Read this: http://www.paulgraham.com/taste.html
> >
> > I copied the following from that article:
> >
> > "...Copernicus' aesthetic objections to [equants] provided one
> > essential motive for his rejection of the Ptolemaic system...."
> >
> > - Thomas Kuhn, The Copernican Revolution
> >
> > "All of us had been trained by Kelly Johnson and believed fanatically
> > in his insistence that an airplane that looked beautiful would fly the
> > same way."
> >
> > - Ben Rich, Skunk Works
> >
> > "Beauty is the first test: there is no permanent place in this world
> > for ugly mathematics."
> >
> > - G. H. Hardy, A Mathematician's Apology
>
> "In Dead Ringers, the character of Elliot says, 'There should be a
> beauty contest for the inside of the body.' As human beings, our sense
> of what's beautiful or repulsive is only skin deep, even when it comes
> to our own bodies. I'm always interested in taking something which seems
> repulsive at the beginning of the film and making it appear beautiful
> and inevitable by the end. I can't achieve that with every viewer, but
> with some people I succeed." - David Cronenberg

It's interesting that you post this. As something of an amateur
biologist, I find the human body sometimes to be the epitome of elegant
design. Consider the human heart: a miraculous pump that works every
second of our entire lives.

Your quote actually brings up something I've been thinking about:
syntax. At times, I've considered the syntax I'm creating to be ugly,
even though it's shorter. I wondered if that was a warning sign, but
decided to go ahead with it and see how it feels over time. Indeed, it
seems to get more beautiful as I play around with it and get accustomed
to it: an interesting parallel to your quote.

And yet the sharp-quote notation of CL continues to grate on me. Thus I
believe your quote does not apply in this case, because I have been
using it for quite a while.

More from Paul Graham:
In math, every proof is timeless unless it contains a mistake. So what
does Hardy mean when he says there is no permanent place for ugly
mathematics? He means the same thing Kelly Johnson did: if something is
ugly, it can't be the best solution. There must be a better one, and
eventually someone will discover it.

That, I think, goes to the heart of the matter.
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <8764v5t3yq.fsf@sidious.geddis.org>
"Tron3k" <······@gmail.com> wrote on 20 Jul 2005 07:2:
> As something of an amateur biologist, I find the human body sometimes to be
> the epitome of elegant design.

Why do we have an appendix?  Why can't we regenerate lost limbs?  Why must
we have limited lifespans?  Why can't we reliably memorize random data if
desired?

Even if it were an elegant design, the human body was designed for quite a
different environment than the one most of us currently live in.  There are
numerous ways that a new designer could make a better body for modern humans
in modern society.

Your thoughts on biology seem to have the same quality as those on programming
language design.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Children are natural mimics who act like their parents despite every effort to
teach them good manners.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1122046685.286201.319190@g14g2000cwa.googlegroups.com>
Don Geddis wrote:
> "Tron3k" <······@gmail.com> wrote on 20 Jul 2005 07:2:
> > As something of an amateur biologist, I find the human body sometimes to be
> > the epitome of elegant design.
>
> Why do we have an appendix?  Why can't we regenerate lost limbs?  Why must
> we have limited lifespans?  Why can't we reliably memorize random data if
> desired?
>
> Even if it were an elegant design, the human body was designed for quite a
> different environment than the one most of us currently live in.  There are
> numerous ways that a new designer could make a better body for modern humans
> in modern society.
>
> Your thoughts on biology seem to have the same quality as those on programming
> language design.

Consider, also: Fear. For some reason, when we are required to speak in
front of an audience, our brain assumes we are in imminent danger, and
activates an age-old mechanism, causing adrenaline to be released into
the bloodstream. Our muscle cells start to increase their metabolic
rate in preparation for a fight-or-flight response.

Yes, we are maladapted to the environment we are in. But this is
precisely why I threw in the key word "sometimes" in my statement
above.

But there is a way to see the beauty even in the failings of nature.
Evolution itself is a beautiful thing, a thing of such simplicity
yielding the most immensely complex structures in the universe.

However, I think it is important to note that the following are
actually examples of good design: Not being able to regenerate lost
limbs, having a limited lifespan, and not being able to memorize random
data. Design is a process of making tradeoffs to achieve optimal
results. Being able to regenerate limbs would mean that every cell in
the body would have to contain a complete map of the body: currently,
as the fetus grows, the cells of the body are essentially laid out
algorithmically; that is, there is no map of all the blood vessels in
the body contained in our DNA. Having an infinite lifespan might be
nice for us, but it wouldn't be good design from evolution's
standpoint: it is not one of evolution's design criteria. Finally,
being able to memorize random data would most likely lead to
compromises in other areas of intelligence.
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <87mzobpwbp.fsf@sidious.geddis.org>
I wrote:
>> Why do we have an appendix?  Why can't we regenerate lost limbs?  Why must
>> we have limited lifespans?  Why can't we reliably memorize random data if
>> desired?

"Tron3k" <······@gmail.com> wrote on 22 Jul 2005:
> However, I think it is important to note that the following are
> actually examples of good design: Not being able to regenerate lost
> limbs, having a limited lifespan, and not being able to memorize random
> data.

Oddly (and perhaps for the first time since you started posting in c.l.l),
I agree with you :-).

> Design is a process of making tradeoffs to achieve optimal results.

You know, it's very interesting that you wrote this.  In some sense, this
is the message that people have been trying to get across to you in the
discussion on programming language features.

When you first started posting here, you expressed things like "this feature
in Common Lisp is broken, and my new lisp will have it fixed."  And you
naturally got a lot of resistance.

The truth, as you seem to recognize here, is that every design decision has
tradeoffs, and it is rare that any single decision is in all ways superior
to an alternative.

Since Common Lisp is a very well-designed language, hardly anything you'll
find there is a mistake.  So, if you would do some of it differently, it is
important to figure out why the CL committee made the choices they did, and
what about your situation is different (possibly resulting in a different
choice being superior for your situation).

> Being able to regenerate limbs would mean that every cell in the body would
> have to contain a complete map of the body: currently, as the fetus grows,
> the cells of the body are essentially laid out algorithmically; that is,
> there is no map of all the blood vessels in the body contained in our
> DNA.

This isn't the reason that regeneration doesn't work.  DNA holds sufficient
information to recreate limbs, and (almost) every cell has the complete DNA
in it.  In theory, you could just run that "computation" you describe again,
and grow a new limb.  Sure, you might not wind up with EXACTLY the same
network of blood vessels as the first time, but I suspect hardly anyone would
complain.  All that matters is that the new limb works, and DNA holds enough
data to grow one.

The real answer, in this specific case, appears to be one of energy tradeoffs.
You have to devote some body machinery to prepare for regenerating limbs, and
(just as you mentioned above) nothing comes for free.  Spending energy on
building that machinery is energy taken away from other possible uses.

Starfish regrow arms (using "only" DNA).  Sharks regrow teeth.  Humans regrow
some things (fingernails, hair, skin), but nothing too complex.

It is far from impossible to design a human-like mammal that could regrow
limbs.  But you'd probably have to give up something else, perhaps intellect.
At the very least, it would almost certainly require higher energy intake.
So current humans are probably better at surviving drought and famine than
such a redesigned human would be.

Of course, in modern society, limb damage from accidents is now more likely
than starvation, so perhaps different design tradeoffs would now be preferred
for us, compared to the conditions under which we evolved.

> Having an infinite lifespan might be nice for us, but it wouldn't be
> good design from evolution's standpoint: it is not one of evolution's
> design criteria.

It winds up being good for the species (to have a diverse gene pool to
respond to unexpected environmental changes), at the expense of the
individual.  As an individual myself, I don't approve of this tradeoff :-).

> Finally, being able to memorize random data would most likely lead to
> compromises in other areas of intelligence.

Probably not except for energy intake.

As it is, today I rely on external tools (writing, computers) for this kind
of thing.  In the near future, cybernetic implants will probably make the
computational tools appear to be as much a part of your body as the organic
stuff.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Eagles may soar, free and proud, but weasels never get sucked into jet engines.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87vf2zvau1.fsf@thalassa.informatimago.com>
Don Geddis <···@geddis.org> writes:
> This isn't the reason that regeneration doesn't work.  DNA holds sufficient
> information to recreate limbs, and (almost) every cell has the complete DNA
> in it.  In theory, you could just run that "computation" you describe again,
> and grow a new limb.  Sure, you might not wind up with EXACTLY the same
> network of blood vessels as the first time, but I suspect hardly anyone would
> complain.  All that matters is that the new limb works, and DNA holds enough
> data to grow one.
>
> The real answer, in this specific case, appears to be one of energy tradeoffs.
> You have to devote some body machinery to prepare for regenerating limbs, and
> (just as you mentioned above) nothing comes for free.  Spending energy on
> building that machinery is energy taken away from other possible uses.

I'm not sure energy tradeoff really matters here.  How much energy
would it take anyway?  I bet I've got enough energy stored in fat to
grow another arm.


> Starfish regrow arms (using "only" DNA).  Sharks regrow teeth.  Humans regrow
> some things (fingernails, hair, skin), but nothing too complex.
>
> It is far from impossible to design a human-like mammal that could regrow
> limbs.  But you'd probably have to give up something else, perhaps intellect.
> At the very least, it would almost certainly require higher energy intake.
> So current humans are probably better at surviving drought and famine than
> such a redesigned human would be.
>
> Of course, in modern society, limb damage from accidents is now more likely
> than starvation, so perhaps different design tradeoffs would now be preferred
> for us, compared to the conditions under which we evolved.

What's clear is that from an individual point of view, it would be a
good thing to be able to regrow damaged organs (don't limit yourself
to limbs; if you need a new heart, grow one on the other side until it
can take over).

But from a Darwinian point of view, it's probably better to eliminate
from the gene pool animals who are dumb enough to lose a limb or clog
their arteries.


>> Having an infinite lifespan might be nice for us, but it wouldn't be
>> good design from evolution's standpoint: it is not one of evolution's
>> design criteria.
>
> It winds up being good for the species (to have a diverse gene pool to
> respond to unexpected environmental changes), at the expense of the
> individual.  As an individual myself, I don't approve of this tradeoff :-).
>
>> Finally, being able to memorize random data would most likely lead to
>> compromises in other areas of intelligence.
>
> Probably not except for energy intake.
>
> As it is, today I rely on external tools (writing, computers) for this kind
> of thing.  In the near future, cybernetic implants will probably make the
> computational tools appear to be as much a part of your body as the organic
> stuff.

Animals of the world unite against gene pools!  ;-)


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Our enemies are innovative and resourceful, and so are we. They never
stop thinking about new ways to harm our country and our people, and
neither do we. -- Georges W. Bush
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u7jffihms.fsf@nhplace.com>
Pascal Bourguignon <···@informatimago.com> writes:

> > > Don Geddis <···@geddis.org> writes:
> > The real answer, in this specific case, appears to be one of
> > energy tradeoffs.  You have to devote some body machinery to
> > prepare for regenerating limbs, and (just as you mentioned above)
> > nothing comes for free.
> 
> I'm not sure energy tradeoff really matters here.  How much energy
> would it take anyway?  I bet I've got enough energy stored in fat to
> grow another arm.

I agree with Pascal that the energy argument seems weak.  HOWEVER,
there is a meta-argument of similar kind that seems stronger to me.
It might be that DNA can only do so many experiments at once, and that
any time it tried to evolve organisms that could do this regrowth, it
did it at the expense of doing other changes that could have saved it
more immediately.  If 98% of the time when you lose a limb, you die
before you would have had TIME to regrow a new one, it could be that
the survival value of regrowing one is small, while the survival value
of doing other things is higher, and so evolution has preferred those
other more short-term issues.  Only in a relatively luxurious
environment like modern earth where we don't spend all our time
keeping from being killed ruthlessly by predators might it matter to
do such long term planning.

Keeping on topic, the analogy might be that the energy required to build
a Lisp with a few better-named operators, a few operators with better
syntax, or a slightly better or more formal semantics would have killed
Lisp by robbing energy from efforts to do FFI, sockets, CORBA, COM, DLL,
etc. which were really things that were needed for short term commercial
survival.  That doesn't mean that in a luxurious time it might not be good
to tend to other matters, but evolution is very particular about what
time you ask it to work its magic and whose sharp teeth are waiting to
shred you into little pieces if you're looking the wrong way at the wrong
time.

> But from a Darwinian point of view, it's probably better to eliminate
> from the gene pool animals who are dumb enough to lose a limb or clog
> their arteries.

It might just be that the timelines are mismatched, as I described above.

The issue of clogging or not clogging the arteries is surely much too
specific for evolution to care about, and ill-expressed.  Evolution as
a process tends to find things that "tolerate a situation" or "don't
tolerate a situation".  It doesn't condemn people for getting their
arteries clogged.  It rewards or condemns people (or animals) for
being able to cope with the situation of excess food, whether such
people do it by (a) eating less, (b) having wider arteries [or
otherwise better capacity for tolerating "cloggage"], (c) eating foods
that counteract the clogging, (d) doing something cool that
compensates for their problem (e.g., developing a nanotech cure for
the problem). Evolution does NOT reward or condemn people or animals
for having or not having a certain design or history--just for having
or not having certain abilities or capabilities, however acquired.

Just my opinion, of course.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3kjctaFu7me8U1@individual.net>
Kent M Pitman wrote:
> Keeping on topic, the analogy might be that the energy required to build
> a Lisp with a few better-named operators, a few operators with better
> syntax, or a slightly better or more formal semantics would have killed
> Lisp by robbing energy from efforts to do FFI, sockets, CORBA, COM, DLL,
> etc. which were really things that were needed for short term commercial
> survival.

No.  First of all Lisp *can* evolve by renaming etc..  At least 
Scheme undergoes a new standardization every couple of years 
(they're working on R6RS now).  Old applications could be 
supported by a compatibility package.
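
A toy sketch of what such a compatibility package could look like -- all
names here are made up for the illustration:

  (defpackage :cleaner-lisp
    (:use :common-lisp)
    (:export #:first-element))              ; the "renamed" operator lives here

  (in-package :cleaner-lisp)
  (defun first-element (list) (car list))

  (defpackage :cleaner-lisp-compat          ; legacy code just USEs this one
    (:use :common-lisp)
    (:export #:old-first))

  (in-package :cleaner-lisp-compat)
  (setf (symbol-function 'old-first)        ; old name forwards to the new one
        #'cleaner-lisp:first-element)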

And a lack of CORBA and the rest would only have frozen Lisp's 
commercial success, so that the commercial Lisp landscape we have 
now wouldn't exist.  Being an idea rather than merely an organism, 
however, Lisp would have survived even then, and maybe it would 
have emerged in a cleaner form.

> Evolution does NOT reward or condemn people or animals
> for having or not having a certain design or history--just for having
> or not having certain abilities or capabilities, however acquired.

Like assembly language, the human mind is Turing complete.  Now if 
only that meant a million people with computers could produce 
decent content, and not millions of useless blogs :)

Sometimes a program can write stuff that's just as good.

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <3dadnYP9HsX3Y3nfRVn-3w@rogers.com>
Kent M Pitman wrote:
> Pascal Bourguignon <···@informatimago.com> writes:
> 
>>But from a Darwinian point of view, it's probably better to eliminate
>>from the gene pool animals who are dumb enough to lose a limb or clog
>>their arteries.
> 
> 
> It might just be that the timelines are mismatched, as I described above.

One must not forget that natural selection has essentially no bearing on 
mutations that affect us after our childbearing years. Since most people 
are finished having children before they manage to clog their arteries, 
only bizarro secondary effects (how does losing a parent early affect 
reproduction rate?) would drive selection.

-- 
Cameron MacKinnon
Toronto, Canada
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <87vf2yon8z.fsf@sidious.geddis.org>
I wrote:
>> The real answer, in this specific case, appears to be one of energy
>> tradeoffs.  You have to devote some body machinery to prepare for
>> regenerating limbs, and (just as you mentioned above) nothing comes for
>> free.  Spending energy on building that machinery is energy taken away
>> from other possible uses.

Pascal Bourguignon <···@informatimago.com> wrote on Mon, 25 Jul 2005:
> I'm not sure energy tradeoff really matters here.  How much energy
> would it take anyway?  I bet I've got enough energy stored in fat to
> grow another arm.

The argument was intended to be more subtle than that.  Not that it would
take too much energy to actually regrow the limb; if you've lost one, surely
you're willing to spend that energy at that time.  It's that, in order to
prepare to be _able_ to regrow things when required, your limbs and organs
can't be as optimized for ordinary life as they are.  They'd need to keep
certain things in reserve, and build excess machinery that would almost never
be useful.  That's an energy/tradeoff cost, and one that (for higher mammals)
probably isn't worth the suboptimal performance during ordinary times.

But in any case, if you don't like that argument, I've got others.  This
article:
        http://www.chrcrm.org/medal03.htm
suggests that children can regrow (small) parts of their fingers, and that
perhaps it is a strong adult immune system which prevents regeneration in
mature individuals.  In particular, that fast-growing cells are generally
signs of cancer, so being strong against cancer means that you can't offer
regeneration.

I think the truth is that the evolutionary justification for this lack of
ability isn't yet completely clear.  (Except that the problem _isn't_ lack
of information; DNA does have enough data to plan out a limb or organ
regrowth.)

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Analyzing humor is like dissecting a frog: Nobody really enjoys it and the frog
generally dies as a result.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <SaGdnQw8Ur2UnnjfRVn-sg@rogers.com>
Don Geddis wrote:
> 
> "Tron3k" <······@gmail.com> wrote on 22 Jul 2005:
> 
>>Design is a process of making tradeoffs to achieve optimal results.
> 
> 
> You know, it's very interesting that you wrote this.  In some sense, this
> is the message that people have been trying to get across to you in the
> discussion on programming language features.
> 
> When you first started posting here, you expressed things like "this feature
> in Common Lisp is broken, and my new lisp will have it fixed."  And you
> naturally got a lot of resistance.
> 
> The truth, as you seem to recognize here, is that every design decision has
> tradeoffs, and it is rare that any single decision is in all ways superior
> to an alternative.
> 
> Since Common Lisp is a very well-designed language, hardly anything you'll
> find there is a mistake.  So, if you would do some of it differently, it is
> important to figure out why the CL committee made the choices they did, and
> what about your situation is different (possibly resulting in a different
> choice being superior for your situation).

How is someone who wasn't there supposed to do that? Some things can be 
guessed at and some may be given a plausible explanation by primary or 
secondary sources, to be sure. But, like all committees, CL's was 
composed of people who didn't always act altruistically (protecting 
their employers' and their customers' codebases), who occasionally 
engaged in horse trading ("I'll support your favourite dumb-ass feature 
if you support mine") and who didn't always have the patience to shout 
down others' stridently argued bad ideas.

Kent Pitman is often willing to do a very good job of explaining what he 
can remember of the process, but he isn't an expert on every aspect of 
the spec -- who could be? It's perfectly understandable that he, and 
others, would be unwilling to sow discord by exposing areas of the 
committee's internal decisions (even supposing that members knew what 
other members were thinking) that were driven by the base motives 
mentioned above.

Further, the committee did its work in a very different environment from 
the one we face today -- there was special purpose Lisp hardware, and a 
vastly different memory speed hierarchy was the norm, to name two examples.

In another post in this thread, Don Geddis wrote:

> There are a few topics out there which, in hindsight, the designers might have
> done slightly differently now that they have 20 years of experience to examine.
> But those topics are few and far between, and you haven't come close to finding
> any of them.

Just to clarify, do you mean 20 years of experience using CL, or 20 
years of "progress" in CS generally? While there's a lot that mainstream 
CS could still learn from CL (and vice versa), I find an element of the 
c.l.l community very quick to dismiss any suggestion of change. I'd be 
very surprised  if, twenty years from now, a typical CS undergraduate 
couldn't point out (what will then be) obvious shortcomings in CL. In 
thirty, it may well be those in grade school. Given CL's current lack of 
evolution, irrelevancy is assured.

Don't misunderstand me: I'm not defending the pseudonymous poster's 
particular ideas, and I suspect that the CL committee did about as good 
a job as could be done by a similarly composed committee. Further, their 
work has held up remarkably well. But time marches on, notwithstanding 
reactionary elements in the CL community who wish to draw no lessons 
from the millions of man-years of experience that the broader 
programming community has amassed since.


-- 
Cameron MacKinnon
Toronto, Canada
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u64uywtoi.fsf@nhplace.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> Further, the committee did its work in a very different environment
> from the one we face today -- there was special purpose Lisp hardware,

Which represented, I'd guess, about 15% of the voting on the committee.
Not exactly a voting majority.

> and a vastly different memory speed hierarchy was the norm, to name
> two examples.

I'm curious how you think this change should manifest in the language.
I don't see it.  The language tries hard to steer clear of memory model
issues at all.
 
> In another post in this thread, Don Geddis wrote:
> 
> > There are a few topics out there which, in hindsight, the
> > designers might have done slightly differently now that they have
> > 20 years of experience to examine.  But those topics are few and
> > far between, and you haven't come close to finding any of them.
> 
> Just to clarify, do you mean 20 years of experience using CL, or 20
> years of "progress" in CS generally?

I'll guess it generously meant either.  Don will doubtless speak for
himself, though.

> While there's a lot that mainstream CS could still learn from CL
> (and vice versa), I find an element of the c.l.l community very
> quick to dismiss any suggestion of change.

Where exactly ought this change have factored into CL?

> I'd be very surprised  if, twenty years from now, a typical CS
> undergraduate couldn't point out (what will then be) obvious
> shortcomings in CL.

I'll be thrilled to see that level of study and interest in CL.

> In thirty, it may well be those in grade
> school.

Surely there will be things people can point to.  There are things people
can point to now.  But, in general, the shortcomings you can point to now
are not things where we've learned a lot since CL--they're things we should
have noticed at the outset.  It's not that technology and knowledge haven't
changed, I just don't see how that technology and knowledge changes basic
truths about expressivity of a language.

> Given CL's current lack of evolution, irrelevancy is assured.

That's a pretty sweeping statement.  How so?

> Don't misunderstand me: I'm not defending the pseudonymous poster's
> particular ideas, and I suspect that the CL committee did about as
> good a job as could be done by a similarly composed
> committee. Further, their work has held up remarkably well. 

I'm not worried about any of this, but it's nice of you to say.

> But time marches on, notwithstanding reactionary elements in the CL
> community who wish to draw no lessons from the millions of man-years
> of experience that the broader programming community has amassed
> since.

It's not that I don't believe you.  But I'd feel better if you were lobbying
for any one of these major lessons to be adopted in some concrete way, rather
than merely shaming the community in general for its alleged non-responsiveness
to the alleged wisdom of an alleged million man-years of effort ... I find
myself with just a tiny bit of doubt that all million of those man-years
are of the same quality as some of the man-years that went into CL, but I
certainly acknowledge some smaller number that has yielded interesting results.
What I'm not sure is whether those results, however interesting,
dictate specific design changes for CL that any third grader can or will
be able to see as unambiguously right, and that the entire community of 
CL is stubbornly resisting.  So I'll feel better about that claim when you 
make it concrete...
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <v8WdnTj655wr9HjfRVn-3A@rogers.com>
Kent M Pitman wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
>>While there's a lot that mainstream CS could still learn from CL
>>(and vice versa)...
...
> Surely there will be things people can point to.  There are things people
> can point to now.  But, in general, the shortcomings you can point to now
> are not things where we've learned a lot since CL--they're things we should
> have noticed at the outset.  It's not that technology and knowledge haven't
> changed, I just don't see how that technology and knowledge changes basic
> truths about expressivity of a language.

For the most part, I'm happy with the expressivity of CL, the main 
exceptions being efficient bit/byte/word manipulation and memory layouts 
of structured data -- the old saw about being able to feel the bits 
between one's toes must have been about some other dialect. But beyond 
the expressivity of a language (being able to write code to do x)  are 
the builtins; being provided with functions that do x. The vaunted 
library of builtins which supposedly separate CL from the Scheme 
heathens appears to satisfy a shrinking percentage of what modern 
programmers expect in a general purpose language.

>>Given CL's current lack of evolution, irrelevancy is assured.
> 
> 
> That's a pretty sweeping statement.  How so?

See above. Newbies may be able to accept that one or two of their 
infrastructure requirements are implementation specific and not 
addressed by the standard. But as that list grows, they quickly come to 
the conclusion that their applications won't be written in Common Lisp 
at all, but rather an implementation specific superset with none of the 
appeal of writing in a standardized language. Just as an example, it 
seems that it isn't unusual for a typical application writer to feel he 
needs sockets, threads, Unicode and an FFI. None of those four things 
are exotic technology, but the Lisp newbie quickly realizes that an app 
with all four of these will be less than easily portable, that others' 
code collections with similar requirements may not be composable, 
etcetera. If a newbie discovers all of the wondrous properties of Lisp 
before he comes upon this hard truth, he may stay. But I suspect that a 
lot of potential converts just shake their heads and end up elsewhere.
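
The glue in question ends up looking roughly like this.  Only an SBCL
branch is sketched, and the names in it are written from memory, so treat
it as an approximation rather than a reference:

  #+sbcl (require :sb-bsd-sockets)

  (defun open-tcp-stream (host port)
    "Return a bidirectional stream to HOST:PORT."
    #+sbcl
    (let* ((address (sb-bsd-sockets:host-ent-address
                     (sb-bsd-sockets:get-host-by-name host)))
           (socket  (make-instance 'sb-bsd-sockets:inet-socket
                                   :type :stream :protocol :tcp)))
      (sb-bsd-sockets:socket-connect socket address port)
      (sb-bsd-sockets:socket-make-stream socket :input t :output t))
    #-sbcl
    (error "No socket glue for ~A yet." (lisp-implementation-type)))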


>>But time marches on, notwithstanding reactionary elements in the CL
>>community who wish to draw no lessons from the millions of man-years
>>of experience that the broader programming community has amassed
>>since.
> 
> It's not that I don't believe you.  But I'd feel better if you were lobbying
> for any one of these major lessons to be adopted in some concrete way, rather
> than merely shaming the community in general for its alleged non-responsiveness
> to the alleged wisdom of an alleged million man-years of effort

I've lobbied on specific issues here in the past, as have many others. 
I'm certainly not going to regurgitate any of them in this thread. At 
the meta level, I've lobbied for a process for change.

> I find myself with just a tiny bit of doubt that all million of those man-years
> are of the same quality as some of the man-years that went into CL, but I
> certainly acknowledge some smaller number that has yielded interesting results.

He also serves who merely provides a bad example. We can learn what kind 
of application the typical programmer wishes to create, the wheels he'll 
likely reinvent, the features that he wishes were standardized, 
idiomatic bugs etcetera.

> What I'm not sure is whether those results, however interesting,
> dictate specific design changes for CL that any third grader can or will
> be able to see as unambiguously right, and that the entire community of 
> CL is stubbornly resisting.  So I'll feel better about that claim when you 
> make it concrete...

The conventional wisdom here seems to be that CL is perfect and 
timeless. The community really seems to have taken as dogma the bon mot 
about being at a local maximum. It may well have been at one time, but 
the fitness function changes continuously. The long list of newbies who 
keep showing up and asking about a short list of issues should be 
demonstrating this to us. For those who advocate Lisp-1, I agree that 
they ought to know where to get it. But the other recurring issues, 
which plainly ought to be addressed by standards, aren't. And I don't 
believe it is enough to argue that the standard doesn't preclude 
implementation specific enhancements for issues which are not 
implementation or platform specific.

-- 
Cameron MacKinnon
Toronto, Canada
From: Duane Rettig
Subject: Re: Beyond CL?
Date: 
Message-ID: <4hdeilay8.fsf@franz.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> Kent M Pitman wrote:
> > What I'm not sure is whether those results, however interesting,
> > dictate specific design changes for CL that any third grader can or will
> > be able to see as unambiguously right, and that the entire community
> > of CL is stubbornly resisting.  So I'll feel better about that claim
> > when you make it concrete...
> 
> 
> The conventional wisdom here seems to be that CL is perfect and
> timeless.

This is not true at all.  There are a large number of issues with the
spec, ranging from minor cleanup items (various), through more major
consistency issues (e.g. types), all the way to out and out missing
pieces (e.g. FFI, defsystem, etc).  Most have been discussed in this
NG.

> The community really seems to have taken as dogma the bon
> mot about being at a local maximum.

The "locality" of the maximum may be the confusing factor here; if
the pain taken to get up a particular hill is great enough, the
decision to descend again to get to the next higher hill might be
put off indefinitely.  Or, if this hill were conquered at great
pains on a giant tandem/parallel bicycle with differential steering
and 50 humans powering and steering it, then it might be dangerous
to take it back down the hill (or anywhere, for that matter) with
as few as 5 humans, since even if they were strong enough to power
the bike, there would not be much chance of really controlling the
direction of the bike; we may thus end up at the bottom of the hill
with no chance of getting up a larger one.

No, it's not a question of being at a local maximum; it is a
question of what vehicle we'll use to get to the next one.

> It may well have been at one time,
> but the fitness function changes continuously. The long list of
> newbies who keep showing up and asking about a short list of issues
> should be demonstrating this to us.

Are these the same newbies that ask for free lisps only?  I think that
the $600 USD (or so) price for entrance onto the J13 bicycle might be
prohibitive for them.

> For those who advocate Lisp-1, I
> agree that they ought to know where to get it. But the other recurring
> issues, which plainly ought to be addressed by standards, aren't. And
> I don't believe it is enough to argue that the standard doesn't
> preclude implementation specific enhancements for issues which are not
> implementation or platform specific.

Agreed.  So what vehicle do you suggest we use to get off of this hill?

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <3kla0mFuo2h8U1@news.dfncis.de>
Cameron MacKinnon <··········@clearspot.net> wrote:

>seems that it isn't unusual for a typical application writer to feel he 
>needs sockets, threads, Unicode and an FFI. None of those four things 

First note that these things aren't in ANSI C either...

>are exotic technology, but the Lisp newbie quickly realizes that an app 
>with all four of these will be less than easily portable, that others' 
>code collections with similar requirements may not be composable, 
>etcetera. If a newbie discovers all of the wondrous properties of Lisp 
>before he comes upon this hard truth, he may stay. But I suspect that a 
>lot of potential converts just shake their heads and end up elsewhere.

The main problem is imho that there is only one standard - the ANSI
spec. To be truly useful, all vendors and implementors should agree
on an extended standard, one that goes beyond the core language.
Threads and sockets etc. don't belong in the language specification.
They aren't in ANSI C either. So one would require some extended
specification, not necessarily a full-blown standard but certainly
a coherent document much more than individual SRFI-style "CLRFIs",
which would include a common subset of the OS API, defined per
system type.

For example, on Unix, that would include the entire POSIX/UNIXxx
codified subset of the OS API. To make this predictable and portable,
all implementors should chose exactly the same names and conventions
as used in the C API, and try to model the Lisp constructs for
parameters and return values to be as close (conceptually) to their
C equivalents as possible. Then the programmer can use the wealth
of existing documentation, books, tutorials, FAQs etc. to program
in Lisp instead of in C (or C++, Perl, ...) Since all vendors agree
on something that essentially has been codified for them already,
there will be a high conformity of the API between Lisp implementations
and no squabbling about pet problems such as naming issues. Ideally,
the different implementations would then be 100% source compatible.
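
As a rough sketch of what "same names, same conventions" could look
like (assuming some portability layer such as the CFFI library, which
is itself not part of any standard):

    ;; bind the POSIX call under its C name, with its C signature
    ;; (pid_t is assumed to fit in an :int here)
    (cffi:defcfun ("getpid" getpid) :int)

    (getpid)   ; => the calling process's pid, just as getpid(2) in C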

The way it is now, each implementation provides its own mechanism
for accessing runtime library functions and system calls. They are
often treated poorly: incomplete, undocumented, carelessly
implemented, subject to bitrot (like in cmucl), and sometimes
even arbitrarily renamed from their C library equivalents.  That situation
isn't very helpful for a productive development environment.

mkb.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <y4Odnd6QOLib5njfRVn-oQ@rogers.com>
Matthias Buelow wrote:
> Cameron MacKinnon <··········@clearspot.net> wrote:
> 
> 
>>seems that it isn't unusual for a typical application writer to feel he 
>>needs sockets, threads, Unicode and an FFI. None of those four things 
> 
> 
> First note that these things aren't in ANSI C either...

Is that your benchmark? Because let me tell you, if Lisp network code 
was as portable as C, you wouldn't hear a peep out of me. Also, I'd bet 
more of our infrastructure would be written in Lisp instead of C.

I'd like to award a little gold star to every ***** who points out on 
c.l.l that sockets "isn't in ANSI C" and figures he's just scored a 
debating point. It must be difficult to blurt out such a thing while 
studiously ignoring the fact that the entire Internet is written in C, 
running on big and little endian 16, 32 and 64 bit processors. I've 
ported (recompiled) and written lots and lots of network code in C, and 
the only compatibility problem I can EVER remember having (aside from 
Microsoft's stupid, deliberate and minor Winsock variations, which are 
essentially the exception that proves the rule) was one function call 
(socket?) from code from some ancient AT&T UNIX that had one more 
parameter than everywhere else. I found documentation and fixed it in 
under a minute, pre-Google.

FFI? Well the reason that FFI isn't in ANSI C is that C is the 'F' in 
everyone else's FFI. When you're the lingua franca, everyone else ports 
to YOU. I even find the name funny. Foreign? Java calls it Native 
Methods, which I find more honest linguistically. Lisp calling C 
"Foreign" conjures up the image of two visible minorities in amongst a 
huge crowd of natives, with one saying to the other "it would be nice if 
we didn't have to deal with all these foreigners."


-- 
Cameron MacKinnon
Toronto, Canada
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <3klcktFuovu1U1@news.dfncis.de>
Cameron MacKinnon <··········@clearspot.net> wrote:

>I'd like to award a little gold star to every ***** who points out on 
>c.l.l that sockets "isn't in ANSI C" and figures he's just scored a 
>debating point. It must be difficult to blurt out such a thing while 
>studiously ignoring the fact that the entire Internet is written in C, 
>running on big and little endian 16, 32 and 64 bit processors. I've 
>ported (recompiled) and written lots and lots of network code in C, and 

Apparently you didn't read my posting farther than that one first line.

mkb.
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uhdeizcng.fsf@nhplace.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> The conventional wisdom here seems to be that CL is perfect and
> timeless.

If you can find even one person who alleges that CL is perfect, I'll
be surprised.  I think you're engaged in hyperbole here and weakening
your point in the process.

What I personally feel is that "the cost of opening CL for fixes
through ANSI would be high and the benefit would be low".  Future
change can and should come through other means.  ANSI is an extremely
expensive tool that I personally think we should be done with.  Just
my opinion.

Additional libraries can be added in modular fashion by individual
vendors or by coalitions of people who need something to span vendors.

So far as I know, no one has shown that CL is so broken in its core
that fixing it requires revisiting the original standard.

That's a far cry from a claim of perfection.

But maybe you're talking about someone other than me.  Or maybe you're
misunderstanding what the people you are talking about are saying.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <Sc2dnQ6lOd6Q6XvfRVn-iQ@rogers.com>
Kent M Pitman wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> 
>>The conventional wisdom here seems to be that CL is perfect and
>>timeless.
> 
> 
> If you can find even one person who alleges that CL is perfect, I'll
> be surprised.  I think you're engaged in hyperbole here and weakening
> your point in the process.

In my observation of c.l.l, when someone questions the utility of a 
feature or desires increased standardization, someone else will 
invariably defend the status quo. Some of these defenses are eloquent 
and convincing, others merely inane ("Oh, but C doesn't specify that 
either.")  This is not typically followed by a bunch of posters agreeing 
with the advocate of change. So while there may not be one person who 
thinks CL is perfect, there is always somebody willing to defend any 
given facet as being perfect.

Then there is "local maximum" maxim, which claims that there is no small 
change which would improve the language; in effect, that the language is 
perfect for a particular problem subspace.

The effect of all this is to give the impression that any proposal will 
be sunk by a coalition of the apathetic and the hostile. From the 
viewpoint of an advocate of change, the difference between that and "CL 
is perfect" is insignificant.

> What I personally feel is that "the cost of opening CL for fixes
> through ANSI would be high and the benefit would be low".  Future
> change can and should come through other means.  ANSI is an extremely
> expensive tool that I personally think we should be done with.  Just
> my opinion.

I agree wholeheartedly, and I've said so in 
<······················@golden.net> (25 Mar 2004) and 
<······················@golden.net> (14 Apr 2004)


-- 
Cameron MacKinnon
Toronto, Canada
From: Duane Rettig
Subject: Re: Beyond CL?
Date: 
Message-ID: <4r7dlflkz.fsf@franz.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> Kent M Pitman wrote:
> > Cameron MacKinnon <··········@clearspot.net> writes:
> >
> 
> >>The conventional wisdom here seems to be that CL is perfect and
> >>timeless.
> > If you can find even one person who alleges that CL is perfect, I'll
> 
> > be surprised.  I think you're engaged in hyperbole here and weakening
> > your point in the process.
> 
> In my observation of c.l.l, when someone questions the utility of a
> feature or desires increased standardization, someone else will
> invariably defend the status quo. Some of these defenses are eloquent
> and convincing, others merely inane ("Oh, but C doesn't specify that
> either.")  This is not typically followed by a bunch of posters
> agreeing with the advocate of change. So while there may not be one
> person who thinks CL is perfect, there is always somebody willing to
> defend any given facet as being perfect.
> 
> 
> Then there is "local maximum" maxim, which claims that there is no
> small change which would improve the language; in effect, that the
> language is perfect for a particular problem subspace.

Another more subtle but even more powerful maxim is the can-of-worms
problem.  Let's first set aside the obvious need to define standards
or standard practices for new features not considered in the original
spec - What is left is areas where CL has either internally conflicting
or controversial specifications, and we might figure that there could be
an effort to fix these areas.  But what does it mean to "fix" an area
that is controversial?  We look around at how badly other languages have
botched things in our eyes, and become fearful of the possibility (or
probability, to be realistic) that a large number of people with 
little experience in the Lisp style, but with the mindset that CL is
broken and must be fixed, will form a majority and institute a consensus
of change-for-change's-sake.  This is the opposite end of the spectrum,
and much worse than the problem of bearing with the idiosyncrasies
that currently exist.

To bring back the part we first shelved in considering the above
paragraph, there are quite a few areas in which CL could be enhanced
by way of some lightweight or de-facto standard.  I think that many of
us who are continuing to try to advance the language (no, it's not
at all static) are working in that area, and there are quite a few
modules and proposals already available or in progress.  What form
that standardization takes, I don't know.  CLRFI is one possibility,
and I also know that Kent has been mulling over a proposal he would like
us all to consider.  Addition of libraries, especially ones that have
been tested on all of the major CL platforms, is also a Good Thing,
and allows CL to move forward.

> The effect of all this is to give the impression that any proposal
> will be sunk by a coalition of the apathetic and the hostile. From the
> viewpoint of an advocate of change, the difference between that and
> "CL is perfect" is insignificant.

If you are advocating change for change's sake, you can expect to meet
resistance.  If you have changes in the area people are more likely to
agree on, then the way you propose those changes will make a difference
as to how they are received.

> > What I personally feel is that "the cost of opening CL for fixes
> > through ANSI would be high and the benefit would be low".  Future
> > change can and should come through other means.  ANSI is an extremely
> > expensive tool that I personally think we should be done with.  Just
> > my opinion.

> I agree wholeheartedly, and I've said so in
> <······················@golden.net> (25 Mar 2004) and
> <······················@golden.net> (14 Apr 2004)

I also agree.  ANSI is our baseline; let's move on from there using
another vehicle.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86mzo9e9l2.fsf@drjekyll.mkbuelow.net>
Cameron MacKinnon <··········@clearspot.net> writes:

>The effect of all this is to give the impression that any proposal
>will be sunk by a coalition of the apathetic and the hostile. From the
>viewpoint of an advocate of change, the difference between that and
>"CL is perfect" is insignificant.

From my very limited understanding, "improving" the standard
(radically) will never work. The only way it'll work is force. Spawn a
new dialect and make it exactly like you want it, simply ignoring the
boneheads who oppose any change.  The result will be that either it's
different enough to form a completely new community (cf. Scheme), or
it's significantly better than the standard, in which case it'll
de facto kill the standard, or it and the standard will get merged to
produce a new one (rather unlikely, imho).

mkb.
From: Greg Menke
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3y87tgupe.fsf@athena.pienet>
Matthias Buelow <···@incubus.de> writes:

> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> >The effect of all this is to give the impression that any proposal
> >will be sunk by a coalition of the apathetic and the hostile. From the
> >viewpoint of an advocate of change, the difference between that and
> >"CL is perfect" is insignificant.
> 
> From my very limited understanding, "improving" the standard
> (radically) will never work. The only way it'll work is force. Spawn a
> new dialect and make it exactly like you want it, simply ignoring the
> boneheads which oppose any change. The result will be that either it's
> different enough to form a completely new community (cf. Scheme), or
> it's significantly better than the standard, in which situation it'll
> de-facto kill the standard, or it, and the standard will get merged to
> produce a new one (rather unlikely, imho).
> 
> mkb.

Or get the vendors to work out the additional standards between them
because the users who buy their products want the new features.  If all
we're talking about is some kind of portable FFI, a sockets API and
other examples of that sort of thing, there's not much reason to be
talking about new dialects or formal standards activity.

All that's necessary is paying customers telling the vendors what they
want.

Gregm
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <ur7dltjk5.fsf@nhplace.com>
Matthias Buelow <···@incubus.de> writes:

> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> >The effect of all this is to give the impression that any proposal
> >will be sunk by a coalition of the apathetic and the hostile. From the
> >viewpoint of an advocate of change, the difference between that and
> >"CL is perfect" is insignificant.
> 
> From my very limited understanding, "improving" the standard
> (radically) will never work. The only way it'll work is force. Spawn a
> new dialect and make it exactly like you want it, simply ignoring the
> boneheads which oppose any change.

I would word this differently, but I agree.

I don't think what you're proposing is force.  I don't see a need to
cast aspersions on people who stay or to laud those who go.  Everyone
can do what best serves them.  The market need not move in lockstep.

> The result will be that either it's
> different enough to form a completely new community (cf. Scheme), or
> it's significantly better than the standard, in which situation it'll
> de-facto kill the standard, or it, and the standard will get merged to
> produce a new one (rather unlikely, imho).

Or there will be two options instead of one.
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uvf2xtjn5.fsf@nhplace.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> Kent M Pitman wrote:
> > Cameron MacKinnon <··········@clearspot.net> writes:
> >
> >>The conventional wisdom here seems to be that CL is perfect and
> >>timeless.
> > If you can find even one person who alleges that CL is perfect, I'll
> > be surprised.  I think you're engaged in hyperbole here and weakening
> > your point in the process.
> 
> In my observation of c.l.l, when someone questions the utility of a
> feature or desires increased standardization, someone else will
> invariably defend the status quo.

If you read carefully, what people mostly defend against is an attempt to change
the standard.  And not because they are defending its details as much as
because they know the cost of changing the standard is generally so huge
as to not be worth it.  I, at least, and the others I can think of who 
do likewise, usually are just saying "work first with vendors", "work for
consensus among vendors", see if that's enough.

A change to ANSI standard may be understood computationally as taking
a WITHOUT-INTERRUPTS on the entire Lisp community.  The entire
community is frozen at huge cost every time you lock down the
"multiprocessor" which is "the Lisp community".  Just as a large
multiprocessing system should take careful locks just on the things it
wants to change, so it should be with modern standards.  Just lock the
thing you need to change, and if you don't need to lock anything,
don't.  Adding a feature can be done with no locking.  (This is what
we mostly advocate, since it largely seems to work.)  Synchronized
change to a part of the language can be done without breaking the rest
of the language.  (This largely has not even been needed, but would be
preferable to ANSI.)  As far as I am aware, unless there's been a
procedural change since I was working with it, ANSI has no provision
for opening a standard to make only a limited set of changes.  As soon
as you form a committee to make some change and enable it to do so, it
can make ANY change.  Since everyone these days has some hobby horse,
you're likely to spend a lot of time answering to that, and making a
mess of things.  And I use the term "mess" not to say "you're breaking
my pretty standard" but rather just to say "you're not likely to get
anything coherent in finite time unless it's a brand new thing".  And
if all you want is a brand new thing, you don't have to open the standard
for repair--just do that to start with as an addition.

It's just procedure we are defending, and mostly because we now much
better understand the huge costs of entering into formal standardization.

> Some of these defenses are eloquent
> and convincing, others merely inane ("Oh, but C doesn't specify that
> either.")  This is not typically followed by a bunch of posters
> agreeing with the advocate of change. So while there may not be one
> person who thinks CL is perfect, there is always somebody willing to
> defend any given facet as being perfect.

IMO, they are just trying to say that languages can succeed with merely "tools
available" and that ANSI-ness is not the answer to success or failure.

> Then there is "local maximum" maxim, which claims that there is no
> small change which would improve the language; in effect, that the
> language is perfect for a particular problem subspace.

No.  There is no small change that would make the language so much
better as to be worth the cost of having changed the standard.  I'd
bet the mere cost of changing the standard at all would measure in the
millions of dollars.  My personal take is that those same millions are
better spent making product rather than merely pulling the existing
applications back into conformance.  (I suppose hungry programmers who
can't find applications to write but are looking for Lisp consulting
work might disagree.)
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <uCzKe.8593$p%3.36164@typhoon.sonic.net>
Kent M Pitman wrote:

> No.  There is no small change that would make the language so much
> better as to be worth the cost of having changed the standard.  

Inheritance order in classes with multiple inheritance comes as
close as anything I can think of to an outright bug in CL's ANSI
standard.  It is *definitely* something that could be improved.
But whether the improvement would be worth another round of
ANSI standardization?  That's a judgement call I don't want to
make, and in practice it doesn't seem to mess people up bad
enough to warrant the work.

					Bear
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uek91ayr1.fsf@nhplace.com>
Ray Dillinger <····@sonic.net> writes:

> Kent M Pitman wrote:
> 
> > No.  There is no small change that would make the language so much
> > better as to be worth the cost of having changed the standard.
> 
> Inheritance order in classes with multiple inheritance comes as
> close as anything I can think of to an outright bug in CL's ANSI
> standard.  It is *definitely* something that could be improved.

I haven't heard that cited as even a bug, much less a bug of such 
extraordinary kind.  If you're going to try to make this case, you
should elaborate with some examples of your perception of the problem.
I really haven't a clue what you're talking about.

There are some who have said we should have left some orderings undefined
rather than define a total ordering, but I was an advocate then and continue
to be today of a theory that defining a total ordering makes things
more portable.  To my knowledge, that's the only part of the inheritance
order that's even mildly controversial.

> But whether the improvement would be worth another round of
> ANSI standardization?

I can't speak for any committee, but I can say that as a personal
guess, I'd probably put money on a bet that any reasonable standards
committee would dismiss any criticism of the sort you're describing
in favor of the status quo.

> That's a judgement call I don't want to make, and in practice it
> doesn't seem to mess people up bad enough to warrant the work.

I'd have written: "in practice, most people seem to either actively
like it or to not express an opinion one way or another".

But seriously, you can't just make a criticism of that magnitude and
not offer examples.
From: Paul F. Dietz
Subject: Re: Beyond CL?
Date: 
Message-ID: <ZN2dnZ2dnZ3t27_9nZ2dnZStZt-dnZ2dRVn-yJ2dnZ0@dls.net>
Kent M Pitman wrote:
> Ray Dillinger <····@sonic.net> writes:

>>Inheritance order in classes with multiple inheritance comes as
>>close as anything I can think of to an outright bug in CL's ANSI
>>standard.  It is *definitely* something that could be improved.
> 
> 
> I haven't heard that cited as even a bug, much less a bug of such 
> extraordinary kind.  If you're going to try to make this case, you
> should elaborate with some examples of your perception of the problem.
> I really haven't a clue what you're talking about.

I would have gone with the algorithm Dylan used, which is
like the CLOS algorithm but requiring monotonicity (so that
the CPL of a subclass is a proper supersequence of the CPL
of any of its superclasses.)

However, this can be added by adding a new metaclass; it doesn't
require incompatible changes in the standard.
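
A rough sketch of that route, using the metaobject protocol (not ANSI;
the CLOSER-MOP package names are assumed, and C3-LINEARIZE is a
hypothetical helper left unwritten):

    ;; classes with this metaclass get a different linearization,
    ;; while STANDARD-CLASS keeps the standard ANSI ordering
    (defclass c3-class (standard-class) ())

    (defmethod closer-mop:validate-superclass
        ((class c3-class) (super standard-class))
      t)

    (defmethod closer-mop:compute-class-precedence-list ((class c3-class))
      ;; would compute the monotonic (Dylan- or C3-style) order here
      (c3-linearize class))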

	Paul
From: Bruce Hoult
Subject: Re: Beyond CL?
Date: 
Message-ID: <bruce-F4C04F.01034712082005@news.clear.net.nz>
In article <···········································@dls.net>,
 "Paul F. Dietz" <·····@dls.net> wrote:

> Kent M Pitman wrote:
> > Ray Dillinger <····@sonic.net> writes:
> 
> >>Inheritance order in classes with multiple inheritance comes as
> >>close as anything I can think of to an outright bug in CL's ANSI
> >>standard.  It is *definitely* something that could be improved.
> > 
> > 
> > I haven't heard that cited as even a bug, much less a bug of such 
> > extraordinary kind.  If you're going to try to make this case, you
> > should elaborate with some examples of your perception of the problem.
> > I really haven't a clue what you're talking about.
> 
> I would have gone with the algorithm Dylan used, which is
> like the CLOS algorithm but requiring monotonicity (so that
> the CPL of a subclass is a proper supersequence of the CPL
> of any of its superclasses.)
> 
> However, this can be added by adding a new metaclass; it doesn't
> require incompatible changes in the standard.

C3 is better than either the CLOS or Dylan algorithms.  We're planning 
to convert Gwydion Dylan to C3 sometime soon.

Here's a test program, which can no doubt be easily converted to CL :-)  
It does not distinguish between the Dylan and CLOS linearizations, which 
are identical in this case.

http://www.gwydiondylan.org/cgi-bin/viewcvs.cgi/trunk/examples/linearization/linearization.dylan?rev=9533&view=markup

-- 
Bruce |  41.1670S | \  spoken |          -+-
Hoult | 174.8263E | /\ here.  | ----------O----------
From: Paul F. Dietz
Subject: Re: Beyond CL?
Date: 
Message-ID: <nY2dnV1rmvowzmbfRVn-ow@dls.net>
Bruce Hoult wrote:

> Here's a test program, which can no doubt be easily converted to CL :-)  
> It does not distinguish between the Dylan and CLOS linearizations, which 
> are identical in this case.

It's my understanding that if the Dylan linearization succeeds, it's
identical to the CLOS linearization.

	Paul
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <UDKKe.8673$p%3.36278@typhoon.sonic.net>
Kent M Pitman wrote:
> Ray Dillinger <····@sonic.net> writes:
> 
> 
>>Kent M Pitman wrote:
>>
>>
>>>No.  There is no small change that would make the language so much
>>>better as to be worth the cost of having changed the standard.
>>
>>Inheritance order in classes with multiple inheritance comes as
>>close as anything I can think of to an outright bug in CL's ANSI
>>standard.  It is *definitely* something that could be improved.
> 
> 
> I haven't heard that cited as even a bug, much less a bug of such 
> extraordinary kind.  If you're going to try to make this case, you
> should elaborate with some examples of your perception of the problem.
> I really haven't a clue what you're talking about.

Under some circumstances, you can have a class A that inherits
behavior from B and then C...  but a class D which is a subclass of
class A inherits behavior from C and then B!

So a generic function that treats an A as if it were primarily a B
may treat a D as if it were primarily a C, even though D is a subclass
of A.  I think that this is confusing behavior and I can't imagine a
reason for anyone to do it on purpose.

Can you?
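
For concreteness, here is the well-known "boat" example (the classes
are from the Dylan linearization paper; only the shape of the
hierarchy matters):

    (defclass boat () ())
    (defclass day-boat (boat) ())
    (defclass wheel-boat (boat) ())
    (defclass engine-less (day-boat) ())
    (defclass small-multihull (day-boat) ())
    (defclass pedal-wheel-boat (engine-less wheel-boat) ())
    (defclass small-catamaran (small-multihull) ())
    (defclass pedalo (pedal-wheel-boat small-catamaran) ())

    ;; With the ANSI algorithm, PEDAL-WHEEL-BOAT's precedence list puts
    ;; DAY-BOAT before WHEEL-BOAT, but PEDALO (a subclass of it) gets
    ;; WHEEL-BOAT before DAY-BOAT: exactly the reversal described above.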

				Bear
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <zms8cp4y.fsf@ccs.neu.edu>
Cameron MacKinnon <··········@clearspot.net> writes:

> For the most part, I'm happy with the expressivity of CL, the main
> exceptions being efficient bit/byte/word manipulation and memory
> layouts of structured data -- the old saw about being able to feel the
> bits between one's toes must have been about some other dialect. 

I find CL to be fairly good with bit/byte/word manipulation.  The
biggest drawback is that you can't easily represent 32 bits without
boxing and sometimes you have to poke around for
implementation-specific code (there are a few
implementation-specific things that are common to nearly all
implementations, so it's still `portable' in some bizarre sense).
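
For instance, this kind of thing is all standard CL (though whether
the intermediate results stay unboxed is implementation-dependent):

    (defun rotl32 (x n)
      "Rotate the 32-bit word X left by N bits."
      (declare (type (unsigned-byte 32) x) (type (integer 0 31) n))
      (logior (ldb (byte 32 0) (ash x n))   ; low bits, wrapped to 32 bits
              (ash x (- n 32))))            ; the bits rotated off the top

    (rotl32 #x80000001 1)   ; => 3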

> Just as an example, it seems that it isn't unusual for a typical
> application writer to feel he needs sockets, threads, Unicode and an
> FFI. None of those four things are exotic technology, but the Lisp
> newbie quickly realizes that an app with all four of these will be
> less than easily portable, that others' code collections with
> similar requirements may not be composable, etcetera. If a newbie
> discovers all of the wondrous properties of Lisp before he comes
> upon this hard truth, he may stay. But I suspect that a lot of
> potential converts just shake their heads and end up elsewhere.

This is something that *would* be a good thing to standardize.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87pstd9xyn.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:
> And yet the sharp-quote notation of CL continues to grate on me. Thus I
> believe your quote does not apply in this case, because I have been
> using it for quite a while.

Yes, it's ugly.  I prefer the more rounded parentheses of:

     (mapcar (function 1+) (list 1 2 3 4))

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

There is no worse tyranny than to force a man to pay for what he does not
want merely because you think it would be good for him. -- Robert Heinlein
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86hdep3ypr.fsf@drjekyll.mkbuelow.net>
"Tron3k" <······@gmail.com> writes:

>Marcus Breiing wrote:
>> * Tron3k
>>
>> > I think Lisp-1 is objectively better because it looks better.
>>
>> The human aesthetic sense evolved in an environment marked by a
>> peculiar shortage of programming language design work.
>
>Did you ever stop to think about why aesthetics is so important in
>mathematics?

This is talking about axiomatic beauty and elegant proofs and the like
and that's got nothing to do with visual appearance.

I think the #' is useful, if not for keeping function and variable
values apart, then at least for visual disambiguation. Since the
S-expression syntax is so "clean" it's nice to have some clues
interspersed for getting a faster understanding of what's going on.
Sharp-quote is such a clue (similar to notational conventions like
*foo* for global variables or +bar+ for constants).
This isn't so much of an issue in languages that have a lot more
(algebraic) syntax because there is already enough visual entropy for
the brain to cling to when reading the source text.
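
For instance, a minimal sketch of the two namespaces that #' keeps
apart (the name INC is made up for illustration):

    (defvar inc 100)            ; INC in the variable namespace
    (defun inc (x) (1+ x))      ; INC in the function namespace
    (list inc (inc inc) #'inc)  ; => (100 101 #<FUNCTION INC>),
                                ;    printed form varies by implementation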

mkb.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackh9iu7.fsf@thalassa.informatimago.com>
Matthias Buelow <···@incubus.de> writes:

> "Tron3k" <······@gmail.com> writes:
>
>>Marcus Breiing wrote:
>>> * Tron3k
>>>
>>> > I think Lisp-1 is objectively better because it looks better.
>>>
>>> The human aesthetic sense evolved in an environment marked by a
>>> peculiar shortage of programming language design work.
>>
>>Did you ever stop to think about why aesthetics is so important in
>>mathematics?
>
> This is talking about axiomatic beauty and elegant proofs and the like
> and that's got nothing to do with visual appearance.
>
> I think the #' is useful, if not for keeping function and variable
> values apart, then at least for visual disambiguation. 

(defun f (f) (value f))
(defvar f (lambda (f) (funcall (function f) (quote f))))

(list (function f) (value f) (quote f)) ;-)

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Until real software engineering is developed, the next best practice
is to develop with a dynamic system that has extreme late binding in
all aspects. The first system to really do this in an important way
is Lisp. -- Alan Kay
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbn3lb$m8s$2@ulric.tng.de>
Pascal Bourguignon schrieb:

> (defun f (f) (value f))
> (defvar f (lambda (f) (funcall (function f) (quote f))))
> 
> (list (function f) (value f) (quote f)) ;-)

Don't forget to add (defclass f () (f))


André
-- 
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3irz5z84t.fsf@rigel.goldenthreadtech.com>
"Tron3k" <······@gmail.com> writes:

> Don Geddis wrote:
>
> > (based, it appears, mostly on your "personal preferences") are unlikely to
> > result in a superior design.  You don't even understand why the current
> > decisions were made, so you don't appreciate the tradeoffs when you attempt
> > to make different ones.
> 
> How do you know I don't appreciate the tradeoffs and am *willfully
> choosing to make them*?

Because you haven't provided any evidence for this.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k74n6Ft0kppU1@individual.net>
Tron3k wrote:

> I make no claim to be open-minded. I think Lisp-1 is objectively better
> because it looks better. That's all I care about at this stage. I'm
> only designing this language for myself anyway.

You are contradicting yourself. An objective statement is a statement 
everyone, without exception, has to agree with. Mathematical proofs are 
of this kind: Starting from certain axioms you come to certain 
conclusions, and when your proof is correct within a mathematical 
system, one has to agree with its correctness. (One may not agree to the 
axioms or to the approach taken, but these are meta-questions.)

"I think Lisp-1 is objectively better." is an oxymoron. Either you think 
it's better, or it is objectively better. If it were objectively better, 
you wouldn't have to just think so.

It seems to me that people find it annoying that you present as 
objective facts what are actually just subjective preferences.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121869764.811023.305110@g14g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> "I think Lisp-1 is objectively better." is an oxymoron.

No, it isn't. As I am not God, I do not know for sure which one really
*is* objectively better. It's my *guess* that it really is Lisp-1.
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3ek9tz7u9.fsf@rigel.goldenthreadtech.com>
"Tron3k" <······@gmail.com> writes:

> Pascal Costanza wrote:
> > "I think Lisp-1 is objectively better." is an oxymoron.
> 
> No, it isn't. As I am not God, I do not know for sure which one really
> *is* objectively better. It's my *guess* that it really is Lisp-1.

So, clearly you could be objectively wrong.  The evidence for both
sides of this has already been presented and gone over ad nauseam.
You're not providing anything new.  For me, all this evidence that has
been presented indicates that you are simply wrong.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u3bq9scs4.fsf@nhplace.com>
"Tron3k" <······@gmail.com> writes:

> Pascal Costanza wrote:
> > "I think Lisp-1 is objectively better." is an oxymoron.
> 
> No, it isn't. As I am not God, I do not know for sure which one really
> *is* objectively better. It's my *guess* that it really is Lisp-1.

See my response to Pascal where I defend the idea that this is not an
oxymoron, but at the same time conclude that there is no uniquely
determined notion of aesthetic.

Pascal is right that your statement has a problem.  He's wrong that the
problem with the statement is that it's an oxymoron.  The problem with the
statement is not that it's internally contradictory (an oxymoron), but
that it contains too little information to be meaningfully and usefully
evaluated. :)

If you were God, that wouldn't fix the problem, except insofar as it would
give you the option to tell people that you had a preference for one of
the many possible rules of aesthetic and that you would strike them down
with thunderbolts if they decided to blaspheme by pointing to the other
options in aesthetic that you hadn't blessed as your favorite.

But might doesn't always make right, and if there is a God (certainly open
to question), it's probably a good thing that he stays out of as many
aesthetic discussions as he does...

Or maybe God would have created a universe with a canonical notion of
aesthetic.  Hmmm.  That would have to be a universe of zero
dimensions.  Even in a one-dimensional world, you'd have people (one
dimensional caricatures of us though they may be) fighting over
whether 0-dimensional things were too "boring" to be considered
aesthetic and whether 1-dimensional things were better. Or whether
1-dimensional things were too complex and whether 0-dimensional things
were "simpler".  That fight is avoided only in a zero-dimensional world.
That's right, a single point.

But _being_ that point, you'd find it hard to step outside and _see_ 
the point.  Which would make such a universe kind of pointless, no? ;)
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121910068.558580.174800@g49g2000cwa.googlegroups.com>
Kent M Pitman wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > Pascal Costanza wrote:
> > > "I think Lisp-1 is objectively better." is an oxymoron.
> >
> > No, it isn't. As I am not God, I do not know for sure which one really
> > *is* objectively better. It's my *guess* that it really is Lisp-1.
>
> See my response to Pascal where I defend the idea that this is not an
> oxymoron, but at the same time conclude that there is no uniquely
> determined notion of aesthetic.

You are *absolutely* wrong in a very real and mathematical sense.

In programming language design, aesthetic is creating a language with
the smallest number of basic orthogonal components. I don't have to
tell you that in Common Lisp, FUNCTION is a special operator.

We know that Lisp-1 and Lisp-2 are equivalent in that they can each be
embedded in the other. The only significant difference between them is
that Lisp-1 requires fewer orthogonal elements at the most basic level.

So, if you're going to do anything, you should create a true Lisp-1,
and then allow users to embed a Lisp-2 (or Lisp-N) in it. That is
aesthetically more pleasing than making it Lisp-N at the level of its
basic axioms.
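
As a toy sketch of one direction of that equivalence (LET1 is a
made-up name, purely illustrative):

    ;; a Lisp-1-flavoured binding form inside CL: one name, one binding,
    ;; usable both in operator position and as a value
    (defmacro let1 ((name value) &body body)
      `(let ((,name ,value))
         (flet ((,name (&rest args) (apply ,name args)))
           ,@body)))

    (let1 (inc #'1+)
      (list (inc 41) (funcall inc 41)))   ; => (42 42)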
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u8y00c4bz.fsf@nhplace.com>
"Tron3k" <······@gmail.com> writes:

> Kent M Pitman wrote:
> > "Tron3k" <······@gmail.com> writes:
> >
> > > Pascal Costanza wrote:
> > > > "I think Lisp-1 is objectively better." is an oxymoron.
> > >
> > > No, it isn't. As I am not God, I do not know for sure which one really
> > > *is* objectively better. It's my *guess* that it really is Lisp-1.
> >
> > See my response to Pascal where I defend the idea that this is not an
> > oxymoron, but at the same time conclude that there is no uniquely
> > determined notion of aesthetic.
> 
> You are *absolutely* wrong in a very real and mathematical sense.
> 
> In programming language design, aesthetic is creating a language with
> the smallest number of basic orthogonal components. I don't have to
> tell you that in Common Lisp, FUNCTION is a special operator.

I didn't realize what an expert you were in programming language design.
I'll defer to your superior knowledge on such matters.

> We know that Lisp-1 and Lisp-2 are equivalent in that they can each be
> embedded in the other. The only significant difference between them is
> that Lisp-1 requires fewer orthogonal elements at the most basic level.

And, of course, there are no other matters involved in programming
language design whatsoever.  How silly of me to have overlooked this
obvious truth.  That certainly simplifies matters.
 
> So, if you're going to do anything, you should create a true Lisp-1,
> and then allow users to embed a Lisp-2 (or Lisp-N) in it. That is
> aesthetically more pleasing than making it Lisp-N at the level of its
> basic axioms.

Oh, that funny feeling I have in Lisp-1 is "being pleased".  I see.
Thanks for setting me straight on this.

I feel better now.
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121911336.617305.146850@g14g2000cwa.googlegroups.com>
Kent M Pitman wrote:
> "Tron3k" <······@gmail.com> writes:
>
> > Kent M Pitman wrote:
> > > "Tron3k" <······@gmail.com> writes:
> > >
> > > > Pascal Costanza wrote:
> > > > > "I think Lisp-1 is objectively better." is an oxymoron.
> > > >
> > > > No, it isn't. As I am not God, I do not know for sure which one really
> > > > *is* objectively better. It's my *guess* that it really is Lisp-1.
> > >
> > > See my response to Pascal where I defend the idea that this is not an
> > > oxymoron, but at the same time conclude that there is no uniquely
> > > determined notion of aesthetic.
> >
> > You are *absolutely* wrong in a very real and mathematical sense.
> >
> > In programming language design, aesthetic is creating a language with
> > the smallest number of basic orthogonal components. I don't have to
> > tell you that in Common Lisp, FUNCTION is a special operator.
>
> I didn't realize what an expert you were in programming language design.
> I'll defer to your superior knowledge on such matters.
>
> > We know that Lisp-1 and Lisp-2 are equivalent in that they can each be
> > embedded in the other. The only significant difference between them is
> > that Lisp-1 requires fewer orthogonal elements at the most basic level.
>
> And, of course, there are no other matters involved in programming
> language design whatsoever.  How silly of me to have overlooked this
> obvious truth.  That certainly simplifies matters.
>
> > So, if you're going to do anything, you should create a true Lisp-1,
> > and then allow users to embed a Lisp-2 (or Lisp-N) in it. That is
> > aesthetically more pleasing than making it Lisp-N at the level of its
> > basic axioms.
>
> Oh, that funny feeling I have in Lisp-1 is "being pleased".  I see.
> Thanks for setting me straight on this.
>
> I feel better now.

Look, I *know* you're Kent Pitman, and I'm just going to say this:

For Common Lisp, Lisp-N was the right thing. It's an industrial
strength language and we can't have mathematical perfection due to
practical considerations.

For my Lisp of Mathematical Perfection, Lisp-1 is the right thing. This
is due to the argument of axioms which I just gave. Plus my Lisp is an
experimental Lisp, and I don't see anyone using it in a nuclear
reactor. (Of course you can implement Lisp-N in it.)

Is there any complaint about that? I've even completely reversed my
position that CL should have been Lisp-1.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbno17$2bj$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Tron3k" <······@gmail.com> wrote:
[..snip..]
> For my Lisp of Mathematical Perfection, [..snip..]

Or perhaps: Lisp Implementing Mathematical Perfection.

At least then you get an appropriate acronym.

Jamie
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uhdeob6i0.fsf@nhplace.com>
"Tron3k" <······@gmail.com> writes:

> For Common Lisp, Lisp-N was the right thing. It's an industrial
> strength language and we can't have mathematical perfection due to
> practical considerations.
>
> For my Lisp of Mathematical Perfection, Lisp-1 is the right thing. This
> is due to the argument of axioms which I just gave. Plus my Lisp is an
> experimental Lisp, and I don't see anyone using it in a nuclear
> reactor. (Of course you can implement Lisp-N in it.)
> 
> Is there any complaint about that? I've even completely reversed my
> position that CL should have been Lisp-1.

Complaint?  I have no complaint whatsoever about you making whatever 
design choices you want for your own project.  I think it's good for
people to try other things.

I do have a bit of a problem with you mischaracterizing Lisp, not
because I mind you having misconceptions (it's a free world), but
because if you make gross misstatements about why CL is the way it is or
about what the philosophical nature of the world is and they are
allowed to stand, there is some sense that they were unanswerable or
that you were promoting something accepted as canonical truth.

There are many good reasons to make a Lisp1.  None of them is that
it is objectively, uniquely, or even mathematically best; Mathematics
is "bigger" than that. None of them is that there is a canonical
notion of perfection; Philosophy is bigger than that.  I'd have no
problem with you just saying that it satisfies your personal notion
of perfection.  I have a lot of problem with you trying to own all of
Mathematics and Philosophy.

I could ask you some questions about your Lisp-1 and how it handles 
certain well-known problems that Lisp-1's have to confront.  (Not that
are irresolvable, but that are different than the problem Lisp-N's have
to confront.)  But I am not going to address this on the merits until
we are in a discourse domain that accepts other sensibilities.

One of the things I like least about the Lisp-1/Lisp-N debate is that
frequently (though not of necessity--it's just a statistical correlation
I've noticed, not a causality) people who think Lisp-1 is best are
people who think the world has only one best way to do things or best
think about things.  It's this conflict between the Single-World-Model-1
and Single-World-Model-N people that troubles me a great deal more.
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3oe8wxiwo.fsf@rigel.goldenthreadtech.com>
Kent M Pitman <······@nhplace.com> writes:

> "Tron3k" <······@gmail.com> writes:
> 
> > Is there any complaint about that? I've even completely reversed my
> > position that CL should have been Lisp-1.
> 
> There are many good reasons to make a Lisp1.  None of them is that
> it is objectively, uniquely, or even mathematically best; Mathematics
> is "bigger" than that. None of them is that there is a canonical
> notion of perfection; Philosophy is bigger than that.  I'd have no
> problem with you just saying that it satisfies your personal notion
> of perfection.  I have a lot of problem with you trying to own all of
> Mathematics and Philosophy.

You know, that's a pretty good synopsis of what makes narcissists so
annoying.  The last sentence even offers a sound bite for it.


> One of the things I like least about the Lisp-1/Lisp-N debate is that
> frequently (though not of necessity--it's just a statistical correlation
> I've noticed, not a causality) people who think Lisp-1 is best are
> people who think the world has only one best way to do things or best
> think about things.  It's this conflict between the Single-World-Model-1
> and Single-World-Model-N people that troubles me a great deal more.

I think you are pretty much spot on here as well.  I tend to see this
as analogous to (or even the same as) the difference between
ideologues and "realists/pragmatists".


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <9c6dnXhEn8534n3fRVn-tQ@rogers.com>
Kent M Pitman wrote:
> "Tron3k" <······@gmail.com> writes:
> 
> 
>>For Common Lisp, Lisp-N was the right thing. It's an industrial
>>strength language and we can't have mathematical perfection due to
>>practical considerations.
>>
>>For my Lisp of Mathematical Perfection, Lisp-1 is the right thing. This
>>is due to the argument of axioms which I just gave. Plus my Lisp is an
>>experimental Lisp, and I don't see anyone using it in a nuclear
>>reactor. (Of course you can implement Lisp-N in it.)
>>
>>Is there any complaint about that? I've even completely reversed my
>>position that CL should have been Lisp-1.
> 
> 
> Complaint?  I have no complaint whatsoever about you making whatever 
> design choices you want for your own project.  I think it's good for
> people to try other things.
> 
> I do have a bit of a problem with you mischaracterizing Lisp, not
> because I mind you having misconceptions (it's a free world), but
> because if you make gross misstatements about why CL is why it is or
> about what the philosophical nature of the world is and they are
> allowed to stand, there is some sense that they were unanswerable or
> that you were promoting something accepted as canonical truth.
> 
> There are many good reasons to make a Lisp1.  None of them is that
> it is objectively, uniquely, or even mathematically best; Mathematics
> is "bigger" than that. None of them is that there is a canonical
> notion of perfection; Philosophy is bigger than that.  I'd have no
> problem with you just saying that it satisfies your personal notion
> of perfection.  I have a lot of problem with you trying to own all of
> Mathematics and Philosophy.

In the back of R5RS is laid out the formal semantics of a particular 
Lisp-1. People have operated upon those semantics to build such useful 
things as provably correct compilers and Sussman and Wisdom's 
demonstration of chaos in Pluto's orbit with the Digital Orrery. These 
things represent, to me, towers of (in a sense) beautiful mathematics in 
the service of man. Part of the appeal is that the mathematics 
underlying the Lisp is just as formal as that of the tool built atop it 
-- the system is a solid edifice from foundation to eaves.

Now it may be that there's formally specified Lisp-2 systems in which 
equally beautiful things have been written. But the canonical Lisp-2 
dialect has semantics which are defined in prose rather than more 
formally, and which aren't as amenable to writing provably correct 
programs. To be swayed by a claim that a Lisp-2 can be as mathematically 
or philosophically beautiful as a Lisp 1, is it unreasonable to ask 
to see the formal semantics of, and examples of beautiful code in, said 
dialect?

I certainly don't mean to say that mathematical tractability is the only 
metric of beauty. Just as physical artefacts have a beauty in proportion 
to how well they fit the hand, so computer artefacts can be judged by 
how well they fit the brain, and a creditable case could perhaps be made 
that Lisp-2s are a more natural paradigm for man. But to elevate this to 
a philosophy? Pragmatism and the accumulation of idioms that have been 
found serviceable are, I suppose, a kind of philosophy underpinning 
Common Lisp. But it certainly doesn't have the elegance or the charm of 
the philosophy embodied in the 'Art of the Interpreter' series and RnRS.

I can't rule out the possibility that someone could create a Lisp-2 as 
philosophically and mathematically elegant as R5RS. But as far as I 
know, one doesn't currently exist.


-- 
Cameron MacKinnon
Toronto, Canada
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <ubr4vmbss.fsf@nhplace.com>
Cameron MacKinnon <··········@clearspot.net> writes:

> In the back of R5RS is laid out the formal semantics of a particular
> Lisp-1. People have operated upon those semantics to build such useful
> things as [...]
>
> Now it may be that there's formally specified Lisp-2 systems in which
> equally beautiful things have been written. But the canonical Lisp-2
> dialect has semantics which are defined in prose rather than more
> formally, and which aren't as amenable to writing provably correct
> programs. [...]

So the first thing you observe is that the target audience is simply
different.

Formal semantics is not my personal area of expertise, but it was an
area of interest of numerous people involved in the design of CL.
While I don't mean to say there is no value to it, it was not something
that anyone had identified as a primary goal of the design.

I daresay that if anyone had cared to try one, the committee would
have been susceptible to accommodating specific issues but I don't
think that the inability to build a formal semantics was the stumbling
block.  In part, there was an absence of identified core to CL, but
that doesn't mean there is an absence of core.  It's just that the
design of CL did not proceed by first identifying a core and then a
library as it was done in Scheme.  That's mostly an artifact of
history, and the purpose of CL, which was to keep alive an existing
community, not to make a new one.  But indirectly an effect was that
the people with the skill of and interest in writing a formal
semantics did not have the interest in actually learning CL in order
to do that; rather, they hoped that somehow the community would
accommodate them by keeping them from having to learn the whole
language in the form others had in order to produce the semantics.  I
phrase it in this form not to say that it's unreasonable to WISH there
were money enough in the budget to accommodate them, but it was
certainly unreasonable to EXPECT that this would be done.  The CL
project almost failed for the cost of doing it in the way it was done,
and in my personal opinion would certainly have failed if this degree
of housecleaning had been attempted.  It may seem that the "obvious"
problem was the lack of layering, but the constraint of holding
community interest was paramount, and my personal feel now and
probably at the time is that any attempt to just "redesign everything"
would have wrecked any sense of coherence we had.

I do recall that there was discussion on this matter. I probably was
not involved directly in the whole discussion, but I definitely recall
that there was a large faction of people who had the specific concern
that if there were a formal semantics and a textual semantics, then
one would have to take precedence.  No matter what you think about how
cool a formal semantics is, there was a clear sense of some involved
(and not just people you'd say were uninformed souls) that it would be
hard to get community consensus that it was right because most people
don't know how to read them.  There was a worry that if they diverged,
no one would detect it in time, and that there could be a fight over
whether a semantics few could read should be allowed to dominate
English that people could read and had carefully reviewed. 

So the idea of a formal semantics as a part of the standard itself was,
I believe, considered and rejected.  For similar reasons, btw, we did not
include pictorial type trees, etc.  We didn't want there to be unnecessary
redundancy in the spec where we could avoid it because if the two parts 
diverged it would be hard to tell what the right thing was.  So we added
class precedence lists and assumed you could construct your own picture
if you needed one.  (There's probably still redundancy in the document,
but mostly because the budget for reliably removing all such was not there.
We did what we could to remove it where we could find it and afford the
time to fix it.)

But one other thing I did as part of this all was to talk to some
people in the Scheme community and find out how "big" a formal
semantics for a Scheme-like Lisp-N would be.  The answer I got was
that the big cost would be if there was a specific finite N, but that
it was only a relatively small change (a few lines) to change Scheme to
a Lisp-Omega (any number of namespaces).  I'm told that it's not adding
namespaces that creates a problem, but fixing the namespaces to a finite
number.  This is one of the reasons I made my work in 
 http://www.nhplace.com/kent/Half-Baked/
be a Lisp-Omega and just ignored the namespaces other than the standard four
(variables, functions, go tags, block names) that CL has.  I believe the
people who made this claim to me, so I don't believe that the per se
addition of additional namespaces is what keeps people from writing a
formal semantics for CL.

And beyond that, it's clear that CL has some additional linguistic features
in its core, but its core is not the 978 symbols in the spec, so I don't
think you'd find that a formal semantics for CL is intractable.

I and most of the designers of CL simply did not need the formal
semantics in order to get work done, and it seems to be the case that
those who do want a formal semantics either do not believe people can
get work done without it (because they think differently) and/or they
expect the people who aren't interested in a formal semantics to do
the work of providing one.

However, I reject out of hand a claim that one couldn't write such a formal
semantics.

Incidentally, as I said, I'm not an expert in this area, so I'm just
repeating very distant memories, but I seem to recall that it ALSO
came up that there is not a single canonical notion of what a formal
semantics is.  That is, and someone can set me straight if I'm
misremembering, that there are other formal paradigms other than the
oft-cited denotational semantics which are nevertheless formal and
might be alternate formulations.  Whether such alternate formulations
admit direct transformation to denotational semantics, or whether they
exist to address the fact that such transformations would be difficult
and that for some purposes a different starting point is advisable, is
not something I can answer.

> To be swayed by a claim that a Lisp-2 can be as
> mathematically or philosophically beautiful as a Lisp 1, is it not
> unreasonable to ask to see the formal semantics of, and examples of
> beautiful code in said dialect?

It seems certainly unreasonable for me to yield that notions of
formality, and particularly of a denotational semantics, are the only
route to judging aesthetics. I do not cede to you or anyone the meaning
of the notion of aesthetic or beauty, nor even the perhaps more limited 
but probably equally overbroad notions of mathematical aesthetic or
mathematical beauty.  If mathematics is so limited as to have only one
canonical such notion, then I have done well by not wasting my life in
the exclusive pursuit of it.

Is it unreasonable of you to want it?  Again no.  Is it unreasonable of
you to demand that absent this, you will not engage in discussion?
Well, it's a free forum, but personally I think this will identify you
in my book as conversationally limited if absent such a tool you are
unable to carry on any discussion.   

I suggest that if you are someone who cares about these things, and if you
have interest enough to read and post here, perhaps it would be a worthwhile
thing to begin a project to identify a formal semantics for CL as an
outboard project.  Perhaps you would detect interesting ambiguities in the
process that cannot be resolved without clarification.  The result might 
either be a guess at a probable formal semantics or it might be a set of 
possible semantics or it might be a list of questions to ask vendors.  None
of these seem terrible outcomes to me.  I don't think the CL world has been
hurt by Paul Dietz seeking to make heaps of tests and this seems to me like
more of same.

However, I should say as an aside that this didn't come up in the process of
me alleging that CL itself was the most beautiful of languages.  When 
Gabriel and I wrote the namespace paper, it began as a competition between
Lisp/Scheme.  Every time he said Scheme, it seemed to score warm fuzzies
with those reading the paper but not because of namespaces--because of other
irrelevant matters like the nature and size of the core specification, etc.
I invented the Lisp1/Lisp2 distinction as part of the drafting of that paper
in order to emphasize that there could be a lisp2-Scheme and a lisp1-CL and
that in that case, it is far less clear what people would pick as the
most elegant, and hence to show that lisp1/lisp2 is not the dominant thing
people are looking at when they want elegance.  I myself admit there is a
certain degree of elegance in Scheme, I just don't get a lot of power out
of that when actually programming and so I don't tend to prefer to program
in it  on a daily basis even though I have actually used Scheme for work at
some of my jobs.  It just never captivated me for real tasks.  But what I
do find elegant about Scheme is NOT its lisp1-ness. In fact, I personally
find that one of the least elegant parts of Scheme, mostly because of the
huge accident of various puns on syntax that it does. The language would be
more consistent if every form were a list and if the car of every form were
a special form.  That's again how I got to the space of reasoning you'll
find in my Half-Baked area (cited above) where I have operators like 
(VAR X) [or, better, in Lisp-Omega style (VAR X VALUE) and (VAR X FUNCTION)]
and (CALL (VAR F FUNCTION) (VAR X VALUE) (VAR Y VALUE)) and where X is a
macro notation that expands into the former and (F X Y) is a macro notation
that expands into the latter.  Nevertheless, my overarching point is that
you're asking about Lisp2, not about CL, and in that context, none of what
I said about ambiguities of specification apply since Lisp2 != CL and 
Lisp1 != Scheme.
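
Spelled out very crudely, the VAR/CALL idea might look like this in CL
(a sketch only; the hash-table environments and the LOOKUP helper are
just for illustration, not how the Half-Baked notes actually do it):

  (defvar *value-env* (make-hash-table))      ; the variable namespace
  (defvar *function-env* (make-hash-table))   ; the function namespace

  (defun lookup (name namespace)
    (gethash name (ecase namespace
                    (value *value-env*)
                    (function *function-env*))))

  (defmacro var (name namespace)
    `(lookup ',name ',namespace))

  (defmacro call (fn &rest args)
    `(funcall ,fn ,@args))

  ;; The abbreviated surface form (F X Y) would then be macro notation
  ;; for (CALL (VAR F FUNCTION) (VAR X VALUE) (VAR Y VALUE)), and a
  ;; bare X for (VAR X VALUE).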

> I certainly don't mean to say that mathematical tractability is the
> only metric of beauty. Just as physical artefacts have a beauty in
> proportion to how well they fit the hand, so computer artefacts can be
> judged by how well they fit the brain,

But how well they fit the brain also requires a theory of the brain.
And such is never offered by the proponents of either Scheme or Lisp1
in discussions I've had on the matter.  

I allege that in fact the brain is wired to prefer Lisp2, and I cite as
evidence the fact that people routinely choose human languages where words
have many definitions and odd syntax over languages where words have few
definitions and a simple core.  Consequently, I am inclined to believe that
the brain sees these languages as "normal", not "overly complex".  And,
in fact, I think most people would agree that Gone with the Wind would
be unbearably tedious and boring in Esperanto or Loglan.

> and a creditable case could
> perhaps be made that Lisp-2s are a more natural paradigm for man. 

I have alleged that small languages make large programs, and vice versa.
But languages are implemented or learned only once and used many times,
so I prefer to optimize program length, not language size.  I think this
is the reason people are willing to learn a language like English and
don't insist on Esperanto.  The learning size is simply not the dominating
factor--they can't bear there being only one word for things, or
having to make huge long constructions to get across simple concepts.

That I cannot make this claim formally for lack of formal studies and
evidence does not make it false, it just makes it a complex argument to
have fairly--just as it is complex for you to have the argument absent
a formal semantics.  However, I have at least gone out on a limb and made
a guess.  Are you willing to guess how many lines a formal semantics for
a Scheme-like Lisp2 or Lisp-Omega would be (and hence to risk being wrong,
as I am risking being wrong by the claims I have made), or are you just
saying you won't go out on a limb absent more evidence? :)

> But
> to elevate this to a philosophy? Pragmatism and the accumulation of
> idioms that have been found serviceable are, I suppose, a kind of
> philosophy underpinning Common Lisp. But it certainly doesn't have the
> elegance or the charm of the philosophy embodied in the 'Art of the
> Interpreter' series and RnRS.

A great deal of both of those are as much remarks about Scheme as about
Lisp1.  There are some cutenesses about both of those that would be hindered
notationally by a Lisp2, I agree, but if the world had only a lisp2 version
of either of those things, I think it would still be having the Lisp/Scheme
debate.  There is more to the split between these languages than that.
The Lisp1/Lisp2 notation was created by me, as I said, to allow more focused
conversation on the matter simply of namespace, and you're being blurry again.

> I can't rule out the possibility that someone could create a Lisp-2 as
> philosophically and mathematically elegant as R5RS. But as far as I
> know, one doesn't currently exist.

A lot of the distinction between CL and Scheme is a question of what people
think is "normal" and "unusual".  The CL community, or certainly some of
its designers (to include myself, but others self-identified during some
pretty heated debates on the matter during the design process) think that
functional programming is "unusual"... not bad, not to be avoided, but not
central to everything Lisp--just another tool in the toolbox.  As such, we
were more comfortable seeing the calling of functions "flagged" so we could
spot them among the other things we were doing--not to stigmatize them, not
because there aren't times people use them so heavily that they wish for a
more compact notation, but because we felt in a subjective way that our target
community would not be doing function-called-parameters so often that it
would be unbearable to see the flagging.  And we felt that forms like
(lambda (list) (list list list list)) might reasonably come up where we wanted
both the function and the variable without having to re-spell them, just
as speakers of human languages can parse sentences where the same word
is variously used as a noun, verb, etc. (English "Buffalo buffalo buffalo."
and Spanish "Como como como." are extreme examples.)

You're of course welcome to conclude what you like about the elegance
of CL.  But if you're interested in evaluating it based on tools that
are not there, it seems sad that you are waiting for someone else to
provide you with those tools when you're the one with the interest in
having them.

And, since you seem interested primarily in Lisp1/Lisp2, not
Scheme/CL, and the theoretical answer to what a namespace affects, I
don't think the problem is as intractable as you think, at least for
the Lisp-Omega case, because I trust the people I asked about this.  I
think the real answer is that the proponents of Lisp1 don't want to do
the exercise because it will make their argument for Lisp1 harder by
showing that it's not that big a deal.  It seems to me like a good
student project--doesn't anyone have bored students they're advising
who are looking for projects and having trouble finding them?  A
paper answering this question would seem a fine contribution to The
Literature and a likely thing to see accepted at a conference such as
the upcoming European Lisp & Scheme Workshop (for some future year,
the next one is in only a couple of days and probably isn't taking papers,
even if your student was very fast at writing...).
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <k6jjkkog.fsf@ccs.neu.edu>
Kent M Pitman <······@nhplace.com> writes:

> However, I reject out of hand a claim that one couldn't write such a
> formal semantics [for CL].

It seems to me that grad student studying language semantics should be
able to write one.  You wouldn't need (or want) to cover the *entire*
language, just the core elements.  Writing the denotational semantics
should be about as difficult as writing a non-optimizing compiler.

> Incidentally, as I said, I'm not an expert in this area, so I'm just
> repeating very distant memories, but I seem to recall that it ALSO
> came up that there is not a single canonical notion of what a formal
> semantics is.  That is, and someone can set me straight if I'm
> misremembering, that there are formal paradigms other than the
> oft-cited denotational semantics which are nevertheless formal and
> might be alternate formulations.  Whether such alternate formulations
> admit direct transformation to denotational semantics, or whether they
> exist to address the fact that such transformations would be difficult
> and that for some purposes a different starting point is advisable, is
> not something I can answer.

Denotational semantics is primarily concerned with what a program
*means* (what its return value is and what side effects occur).

Operational semantics is primarily concerned with what a program *does*
(a detailed step-by-step description of execution).

There are other semantics, but these are the two popular ones.  They
can be transformed into each other, but the transformation is nasty.
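
A toy illustration of the flavor of each, for a trivial language of
numbers and + only (MEANING, STEP-ONCE and RUN are just names I'm
making up for this sketch):

  ;; Denotational flavor: map each expression directly to its value.
  (defun meaning (exp)
    (cond ((numberp exp) exp)
          ((eq (car exp) '+)
           (+ (meaning (second exp)) (meaning (third exp))))))

  ;; Operational flavor: rewrite the expression one step at a time.
  (defun step-once (exp)
    (cond ((numberp exp) exp)
          ((and (numberp (second exp)) (numberp (third exp)))
           (+ (second exp) (third exp)))
          ((not (numberp (second exp)))
           (list '+ (step-once (second exp)) (third exp)))
          (t (list '+ (second exp) (step-once (third exp))))))

  (defun run (exp)
    (if (numberp exp) exp (run (step-once exp))))

  ;; (meaning '(+ 1 (+ 2 3))) => 6
  ;; (run '(+ 1 (+ 2 3)))     => 6, via the intermediate step (+ 1 5)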

> I allege that in fact the brain is wired to prefer Lisp2, and I cite as
> evidence the fact that people routinely choose human languages where words
> have many definitions and odd syntax over languages where words have few
> definitions and a simple core.  Consequently, I am inclined to believe that
> the brain sees these languages as "normal", not "overly complex".  And,
> in fact, I think most people would agree that Gone with the Wind would
> be unbearably tedious and boring in Esperanto or Loglan.

I don't think there is a natural preference.  Multiple definitions and
odd syntax are an effective technique for getting as much information
as possible through a slow, inefficient, and lossy communications
channel.  Although humans are good at this, it doesn't mean that we
wouldn't prefer something else.

Legs are an effective technique for travel, but I prefer to drive.

~jrm
From: Lieven Marchand
Subject: Re: Beyond CL?
Date: 
Message-ID: <873bq5aag4.fsf@wyrd.be>
Kent M Pitman <······@nhplace.com> writes:

> However, I reject out of hand a claim that one couldn't write such a formal
> semantics.

One can write a denotational semantics of any sufficiently specified
language. A truly courageous soul has done one of C
(http://citeseer.ist.psu.edu/papaspyrou98formal.html).

In a zeroth-order approximation you implement an interpreter for the
language in purely functional Scheme, written in a funny way.
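
Concretely, for a tiny core of variables, one-argument LAMBDA and
application, that zeroth-order version might look like this (written
here in CL rather than Scheme, purely as a sketch):

  ;; The "meaning" of a term is a host-language function of the
  ;; environment; environments are plain association lists.
  (defun denote (term)
    (cond ((symbolp term)                       ; variable reference
           (lambda (env) (cdr (assoc term env))))
          ((eq (first term) 'lambda)            ; (lambda (x) body)
           (let ((x (first (second term)))
                 (body (denote (third term))))
             (lambda (env)
               (lambda (arg) (funcall body (acons x arg env))))))
          (t                                    ; application (f a)
           (let ((f (denote (first term)))
                 (a (denote (second term))))
             (lambda (env)
               (funcall (funcall f env) (funcall a env)))))))

  ;; (funcall (funcall (denote '((lambda (x) x) (lambda (y) y))) '()) 42)
  ;; => 42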

-- 
"a totalitarian ideology that hates freedom, rejects tolerance, 
and despises all dissent."    --- Bush describing his religious 
constituency^W^W^Wthe Iraqi resistance
From: Gorbag
Subject: Re: Beyond CL?
Date: 
Message-ID: <6Y7Fe.4$_Y4.1@bos-service2.ext.ray.com>
"Cameron MacKinnon" <··········@clearspot.net> wrote in message
···························@rogers.com...

> Now it may be that there's formally specified Lisp-2 systems in which
> equally beautiful things have been written. But the canonical Lisp-2
> dialect has semantics which are defined in prose rather than more
> formally, and which aren't as amenable to writing provably correct
> programs. To be swayed by a claim that a Lisp-2 can be as mathematically
> or philosophically beautiful as a Lisp 1, is it not unreasonable to ask
> to see the formal semantics of, and examples of beautiful code in said
> dialect?

I knew I'd seen this addressed somewhere before:

http://home.pipeline.com/~hbaker1/MetaCircular.html
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbn43i$m8s$4@ulric.tng.de>
Tron3k schrieb:
> Kent M Pitman wrote:
> 
>>"Tron3k" <······@gmail.com> writes:
>>
>>
>>>Pascal Costanza wrote:
>>>
>>>>"I think Lisp-1 is objectively better." is an oxymoron.
>>>
>>>No, it isn't. As I am not God, I do not know for sure which one really
>>>*is* objectively better. It's my *guess* that it really is Lisp-1.
>>
>>See my response to Pascal where I defend the idea that this is not an
>>oxymoron, but at the same time conclude that there is no uniquely
>>determined notion of aesthetic.
> 
> 
> You are *absolutely* wrong in a very real and mathematical sense.
> 
> In programming language design, aesthetic is creating a language with
> the smallest number of basic orthogonal components. I don't have to
> tell you that in Common Lisp, FUNCTION is a special operator.
> 
> We know that Lisp-1 and Lisp-2 are equivalent in that they can each be
> embedded in the other. The only significant difference between them is
> that Lisp-1 requires fewer orthogonal elements at the most basic level.
> 
> So, if you're going to do anything, you should create a true Lisp-1,
> and then allow users to embed a Lisp-2 (or Lisp-N) in it. That is
> aesthetically more pleasing than making it Lisp-N at the level of its
> basic axioms.

Can I conclude that you want to maximize the orthogonality of components 
while keeping the set of components as small as possible?
Well, then I suggest these components:
0   1


André
-- 
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <873bq8adno.fsf@thalassa.informatimago.com>
André Thieme <······························@justmail.de> writes:
> Can I conclude that you want to maximize the orthogonality of
> components while keeping the set of components as small as possible?
> Well, then I suggest these components:
> 0   1

Better:  1   i


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k91moFtcj0aU1@individual.net>
Tron3k wrote:

> In programming language design, aesthetic is creating a language with
> the smallest number of basic orthogonal components.

You can't beat Unlambda in that regard. See 
http://www.madore.org/~david/programs/unlambda/

If that were true, this should be your dream language. I don't think it is,
so there must be something else that is important in language design.

> We know that Lisp-1 and Lisp-2 are equivalent in that they can each be
> embedded in the other. The only significant difference between them is
> that Lisp-1 requires fewer orthogonal elements at the most basic level.
> 
> So, if you're going to do anything, you should create a true Lisp-1,
> and then allow users to embed a Lisp-2 (or Lisp-N) in it.

This has some problems when you want to import/export definitions 
between the Lisp-1 and Lisp-2 environments.

Furthermore, embedding Lisp-1 in Lisp-2 has already been done a few times 
before. See for example Pseudoscheme. I am not aware of an attempt to do 
it the other way around. Note that what you are currently attempting is 
also to embed a Lisp-1 in a Lisp-2.
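
For flavor, a toy version of such an embedding--nothing like
Pseudoscheme's actual machinery, just a naive walker over a declared
set of variables--could look like:

  ;; Inside WITH-LISP1, any form whose head is one of the listed
  ;; variables is rewritten to FUNCALL that variable, so it can be
  ;; called Lisp-1-style.  The walker deliberately ignores QUOTE,
  ;; bindings and special forms, which a real embedding must handle.
  (defmacro with-lisp1 ((&rest vars) &body body)
    (labels ((walk (form)
               (cond ((atom form) form)
                     ((member (car form) vars)
                      `(funcall ,(car form) ,@(mapcar #'walk (cdr form))))
                     (t (cons (car form) (mapcar #'walk (cdr form)))))))
      `(progn ,@(mapcar #'walk body))))

  ;; (let ((f (lambda (x) (* x x))))
  ;;   (with-lisp1 (f)
  ;;     (f (f 2))))
  ;; => 16

The import/export problem is exactly where such toys break down.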

> That is
> aesthetically more pleasing than making it Lisp-N at the level of its
> basic axioms.

Whatever.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbn3sl$m8s$3@ulric.tng.de>
Kent M Pitman schrieb:

> If you were God, that wouldn't fix the problem, except insofar as it would
> give you the option to tell people that you had a preference for one of
> the many possible rules of aesthetic and that you would strike them down
> with thunderbolts if they decided to blaspheme by pointing to the other
> options in aesthetic that you hadn't blessed as your favorite.

If he were God and therewith almighty he could simply *make* one of them 
objectively better. That is the funny thing almightiness can do.
God could create a stone and then say "I never made this stone" and it 
would be reality. He just made a stone that he did not make.


André
-- 
God knows the last number. He even knows the one which follows after the 
last one.
From: jayessay
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3k6jkxiqb.fsf@rigel.goldenthreadtech.com>
André Thieme <······························@justmail.de> writes:

> If he were God and therewith almighty he could simply *make* one of
> them objectively better. That is the funny thing almightiness can do.
> God could create a stone and then say "I never made this stone" and it
> would be reality. He just made a stone that he did not make.

Isn't that an argument for non-existence?  Basically a reductio?


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Jock Cooper
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3wtnkc9tm.fsf@jcooper02.sagepub.com>
jayessay <······@foo.com> writes:

> André Thieme <······························@justmail.de> writes:
> 
> > If he were God and therewith almighty he could simply *make* one of
> > them objectively better. That is the funny thing almightiness can do.
> > God could create a stone and then say "I never made this stone" and it
> > would be reality. He just made a stone that he did not make.
> 
> Isn't that an argument for non-existence?  Basically a reductio?
> 
If true I think it means that [Gg]od isn't subject to Gödel's incompleteness 
theorem.
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <uhdeokjrn.fsf@nhplace.com>
Jock Cooper <·····@mail.com> writes:

> jayessay <······@foo.com> writes:
> 
> > André Thieme <······························@justmail.de> writes:
> > 
> > > If he were God and therewith almighty he could simply *make* one of
> > > them objectively better. That is the funny thing almightiness can do.
> > > God could create a stone and then say "I never made this stone" and it
> > > would be reality. He just made a stone that he did not make.
> > 
> > Isn't that an argument for non-existence?  Basically a reductio?
> > 
> If true I think it means that [Gg]od isn't subject to Gödel's incompleteness 
> theorem.

Isn't the set of things that are not subject to incompleteness fairly
mundane?  If you're really worried about this, I'd more think you'd
either disproved God, proved that even God has limitations, proved
that God has common sense enough to realize that he'd better tolerate
incompleteness in order to achieve other good things, proved that God
had been incomplete in his search for bugs in what he built, proved
that God has a good armament of legalese for disclaiming failures (a
quick Google search [appropriate since I know people who half-kiddingly
allege that Google is itself God] turns up
 http://www.exploringtheword.com/furtherstudy/guide03/3sec5b.htm
as a possible illustration), or maybe just proved that God has a 
sense of humor.

There are so many explanations that I think it's fair to say that if you
don't know which of these you've proved, you've proved very little.

It's a good thing faith does not rely on proof.
From: Aatu Koskensilta
Subject: Re: Beyond CL?
Date: 
Message-ID: <Eh7Ee.7190$ab6.6819@reader1.news.jippii.net>
Jock Cooper wrote:
> 
> If true I think it means that [Gg]od isn't subject to Gödel's incompleteness 
> theorem.

God isn't a formal system in which basic arithmetical facts are provable 
and thus is trivially not subject to Gödel's incompleteness theorem.

-- 
Aatu Koskensilta (················@xortec.fi)

"Wovon man nicht sprechen kann, daruber muss man schweigen"
  - Ludwig Wittgenstein, Tractatus Logico-Philosophicus
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <vf35t5u1.fsf@comcast.net>
"Tron3k" <······@gmail.com> writes:

> Pascal Costanza wrote:
>> "I think Lisp-1 is objectively better." is an oxymoron.
>
> No, it isn't. As I am not God, I do not know for sure which one really
> *is* objectively better. It's my *guess* that it really is Lisp-1.

I am God.  Neither is objectively better.


-- 
~jrm
From: André Thieme
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbn47c$m8s$5@ulric.tng.de>
Joe Marshall schrieb:
> "Tron3k" <······@gmail.com> writes:
> 
> 
>>Pascal Costanza wrote:
>>
>>>"I think Lisp-1 is objectively better." is an oxymoron.
>>
>>No, it isn't. As I am not God, I do not know for sure which one really
>>*is* objectively better. It's my *guess* that it really is Lisp-1.
> 
> 
> I am God.  Neither is objectively better.

Can you please send me two million dollars?
Pleeeaaasseeee


André
-- 
Praying to Joe Marshall (yeah, you know where I live)
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u7jflsd8z.fsf@nhplace.com>
Pascal Costanza <··@p-cos.net> writes [in response to Tron3k]:

> "I think Lisp-1 is objectively better." is an oxymoron. Either you
> think it's better, or it is objectively better. If it were objectively
> better, you wouldn't have to just think so.
> 
> It seems to me that people find it annoying that you present as
> objective facts what are actually just subjective preferences.

Actually...

Just because something is objectively better doesn't imply that people
know it nor does it imply broad public acceptance.

For example, the fact that science is objectively better than religion
at explaining everyday events is, for better or worse, something that
most people have to guess at.

This is not a stab at religion in general, just a note that I think
religion's job is not to explain the every day, but the beyond.  In
fact, paralleling my remark above, I'll say that I think this is an
objective truth, since we have tools and data ("the scientific method")
for examining the here-and-now, every-day world, and neither tools nor
data for examining the beyond, and it seems to me objectively true
that this is so whether some god created the world and did so in an
orderly way or whether there was no god and the world simply acquired
order either utterly by accident or because it didn't work without
order and eventually self-organized into order.

But my point is that the fact of an objective truth is not the fact of
knowledge of an objective proof, and hence there is no contradiction
in having a belief about objective truth.  Geez, Fermat had a belief
about objective truth that drove the Mathematics community if not the
world at large into a frenzy for years trying to find out if his
belief was, in fact, true.  So we know as well that there's a
computational gap between truth and knowable truth, in the sense that
one person might know one but another person might take a while to
catch up.  Christopher Columbus did the same trick but using an
empirical rather than mathematical claim.  And in a lesser known case,
I myself discovered the binomial theorem empirically before I had the
algebra training in junior high school to be able to understand why it
had to be true mathematically.  That didn't make it a false belief nor
something that wasn't objectively true, it just made it harder to prove
and hence made it necessary for me to speak fuzzily about it until I managed
to put it into perspective.

Almost all of social science, psychology, and so on are made today of
people making fuzzy claims about things that will probably either be
shown later to be objectively true or objectively false (by some
criterion yet to be specified).  These are ironically called the
"soft" sciences to contrast them with the so-called "hard" (i.e.,
"firm") sciences.  But they might reasonably be called the "hard"
(i.e., "difficult") sciences and the others the "easy" (i.e.,
near-term tractable) sciences because they are so complex to reason
about that it's a wonder anyone ever can.  Things like gravity and
electricity are much more orderly and subject to controlled experiment
than things like politics or serial killers, so no wonder we do better
at making broad statements about physics than about politics.  But
that doesn't mean there's no objective truth to be had, it just means
that the combinatorics of studying the "difficult" sciences is worse
and people resort to initial guesses for longer periods of time.
Natural language continues to defy analysis even though human brains
seem well-adapted to self-learning it just from examples.

Certainly the presence of an engine (the brain) that can "solve"
natural language, and the presence of job positions for editors at
newspapers correcting grammar implies that a great many things about
natural language are objectively true.  Yet researchers in natural
language parsing still speak in terms of belief not because they think
what's going on inside people's brains is magic but because the
process of reverse engineering a complex engine is just plain
difficult and so knowledge about objective truth lags the fact of
objective truth.

However, back to the original point, I DO think that aesthetics is a
multi-dimensional space that has no single optimal solution.  You can
optimize on any axis or any functional combination of axes.  As a 
consequence, it's nonsensical and annoying to say that there is an
objective truth that a certain paradigm is canonically or uniquely
aesthetic, and it's equally nonsensical and annoying to say that it's
non-exclusively aesthetic without specifying the axis or axes along
which you are performing your optimization, that is, your set of preferred
aesthetic criteria, which I allege are not uniquely determined.
(There are those who might disagree, but to do so they must sacrifice
a great deal of aesthetics; that is, some people might allege
that there is a single canonical notion of aesthetic that applies 
everywhere, and since that's itself a subjective choice, it would be
hard to prove them wrong--only easy to prove they are in the minority.)

So is it annoying? Yeah.  Is it because of having made a subjective 
statement about what is objective?  No.  It's because incomplete information
has been given about the objective criteria being used.

Webster.com says objective means 

  3 a : expressing or dealing with facts or conditions as perceived
        without distortion by personal feelings, prejudices, or
        interpretations

This doesn't imply that the human mind and personal feelings were not 
involved in the choice of criteria for the expression, only that the
chosen criteria can be evaluated without reference to that human mind
and personal feelings.  That can never happen without reference to the
criteria, and Tron3k has not specified his criteria.

That, I allege, is what's frustrating you.

Don't you feel better now?
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k79tvFt0nupU1@individual.net>
Kent M Pitman wrote:

> Don't you feel better now?

A lot. ;)

Pascal

P.S.: Actually, I think you indeed made it much clearer what I tried but 
obviously failed to express. So thanks.

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Don Geddis
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackht45i.fsf@sidious.geddis.org>
"Tron3k" <······@gmail.com> wrote on 19 Jul 2005 17:3:
> You said in your second post to be more open-minded. And yet here you
> say that Lisp-1 is objectively worse than Lisp-n. Hmm.

No, again you fail to appreciate the subtlety of the situation.

You made a claim that this Lisp-n nature of Common Lisp was a "mistake", and
that you could "improve" Common Lisp by making it a Lisp-1.  I responded that
this suggestion, like most of yours, would actually make Common Lisp worse,
and not better.

You fail to understand that Lisp-1 vs. Lisp-n involves a large bag of
tradeoffs, which have effects throughout the language.  There is no objective
evaluation that says that either is better than the other; they both have
significant pros and cons.  In the Lisp community, those that like the Lisp-1
style have tended to gravitate towards Scheme, while those that prefer Lisp-n
have headed to Common Lisp.

Your suggestion that Common Lisp would be improved by making it a Lisp-1 shows
ignorance of history.  Your suggestion that Lisp-1 is better than Lisp-n shows
ignorance of programming language theory.

> I make no claim to be open-minded. I think Lisp-1 is objectively better
> because it looks better. That's all I care about at this stage.

At this point, we are all well aware that your only justifications for your
design decisions are that they make you feel good.

Whether this is because you saw a good movie on the same day you thought of
the feature, or perhaps you got laid that day; it's tough to tell.  The one
thing that seems clear is that your good feelings about a design are
independent of any analysis of the pros and cons of the decision for your
(hypothetical future) community of programmers.

> How do you know I don't appreciate the tradeoffs and am *willfully
> choosing to make them*?

I can only evaluate based on what you post here, which shows a complete lack
of awareness of the thought that went into the decisions the first time.
And, when you are backed into a corner to justify your decisions, your answers
always seem to end with the very weak "it feels right to me" or "I like it."
Those aren't the
responses of someone with stronger evidence to present.

> That is in fact what I'm doing.

Hardly likely.

>> Your ideas so far are not new.  They are well known, and have been
>> deliberately rejected (for good reasons) long ago.
>
> Scheme seems to have picked up one of them...

Rejected in the Common Lisp community.  I've told you to go to the Scheme
community many times, and you seem to ignore that suggestion for no good
reason.

You're the one who began this by claiming you had ideas to improve Common Lisp.
If all you had said was "I want to make a new Lisp dialect all my own", you
would have seen a lot less resistance.

People would have still likely thought you a fool; but a more harmless one,
wasting only your own time.  Instead, you're wasting the time of c.l.l too.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
"I think," said Christopher Robin, "that we ought to eat all our provisions
now, so we won't have so much to carry."  -- A. A. Milne
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackib5mb.fsf@thalassa.informatimago.com>
Don Geddis <···@geddis.org> writes:

> "Tron3k" <······@gmail.com> wrote on 18 Jul 2005 08:2:
>> There is a time in the future when people look back at Common Lisp and
>> *laugh*
>
> I agree with you.  Programming at this level of abstraction is far too much
> work.  Surely programming in the future will be much more like on Star Trek,
> where you basically have a conversation with the computer.  And it asks
> clarifying questions if your request was ambiguous.

Well, it won't exactly be _programming_.

http://www-pu.informatik.uni-tuebingen.de/users/klaeren/epigrams.html

93. When someone says "I want a programming language in which I need
    only say what I wish done," give him a lollipop.

When I read this epigram earlier this afternoon, my reaction was that
when "all" will be programmed, children will be able to program saying
only what they wish done: it will just select the right program.  (But
of course, smarties will always come with new requests for which there
is no preexisting program, and they won't like the standard lollipops).

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
I need a new toy.
Tail of black dog keeps good time.
Pounce! Good dog! Good dog!
From: fireblade
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121841572.393858.72990@g49g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:
> Don Geddis <···@geddis.org> writes:
>
> > "Tron3k" <······@gmail.com> wrote on 18 Jul 2005 08:2:
> >> There is a time in the future when people look back at Common Lisp and
> >> *laugh*
> >
> > I agree with you.  Programming at this level of abstraction is far too much
> > work.  Surely programming in the future will be much more like on Star Trek,
> > where you basically have a conversation with the computer.  And it asks
> > clarifying questions if your request was ambiguous.
>
> Well, it won't exactly be _programming_.
>
> http://www-pu.informatik.uni-tuebingen.de/users/klaeren/epigrams.html
>
> 93. When someone says "I want a programming language in which I need
>     only say what I wish done," give him a lollipop.
>
> When I read this epigram earlier this afternoon, my reaction was that
> when "all" will be programmed, children will be able to program saying
> only what they wish done: it will just select the right program.  (But
> of course, smarties will always come with new requests for which there
> is no preexisting program, and they won't like the standard lollipops).
>
> --
> __Pascal Bourguignon__                     http://www.informatimago.com/
> I need a new toy.
> Tail of black dog keeps good time.
> Pounce! Good dog! Good dog!

Cool , but could someone explain to me 49:
#  Giving up on assembly language was the apple in our Garden of Eden:
Languages whose use squanders machine cycles are sinful. The LISP
machine now permits LISP programmers to abandon bra and fig-leaf.
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <d5pdl2s7.fsf@ccs.neu.edu>
"fireblade" <········@YAHOO.COM> writes:

> Cool , but could someone explain to me 49:
> #  Giving up on assembly language was the apple in our Garden of Eden:
> Languages whose use squanders machine cycles are sinful. The LISP
> machine now permits LISP programmers to abandon bra and fig-leaf.

Why do you think LISP Machines were so popular?
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87u0ip9y8a.fsf@thalassa.informatimago.com>
"fireblade" <········@YAHOO.COM> writes:
> Cool , but could someone explain to me 49:
> #  Giving up on assembly language was the apple in our Garden of Eden:
> Languages whose use squanders machine cycles are sinful. The LISP
> machine now permits LISP programmers to abandon bra and fig-leaf.

Because with a LISP Machine, Lisp programmers go back to Eden.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

There is no worse tyranny than to force a man to pay for what he does not
want merely because you think it would be good for him. -- Robert Heinlein
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86ll3voumy.fsf@drjekyll.mkbuelow.net>
Ulrich Hobelmann <···········@web.de> writes:

>Like assembly language the human mind is turing complete.

I'd like to see the proof.

mkb.
From: [Invalid-From-Line]
Subject: Re: Beyond CL?
Date: 
Message-ID: <dc2rm1$5c4$1@news.sap-ag.de>
"Matthias Buelow" <···@incubus.de> wrote in message
···················@drjekyll.mkbuelow.net...
> Ulrich Hobelmann <···········@web.de> writes:
>
> >Like assembly language the human mind is turing complete.
>
> I'd like to see the proof.
>
> mkb.
I take it that is the human mind alone, without pen and paper or other help.

I think that a counterexample is very easy.

Just get the person to mentally do a singular value decomposition on a
random 7 x 7 matrix accurate to 7 decimal places.

Any Turing-complete computer can be programmed to do this. I doubt whether any
person can be taught to do this.

Also human operations tend to be somewhat 'random' whereas in a turing
machine operations are 100% reliable.

Rene.
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <3kkcrsFunnf0U1@news.dfncis.de>
··············@hotmail.com wrote:

>I take it that is the human mind alone, without pen and paper or other help.

I don't think that's a valid assumption. Just as the TM has an
infinite tape, the mind should be allowed infinite "helper" storage
as well, and aids to visualize things (it's not as if the paper were
some oracle that could independently compute things, so it doesn't
really add to the computing power of the brain).

>Just get the person to mentally do a singular value decomposition on a
>random 7 x 7 matrix accurate to 7 decimal places.

Well, obviously this is possible, otherwise no one would've come up
with this. The point is not having the same "raw computing power"
as a TM but being able in principle to solve the same kind of problems.
Otherwise, if you're using strict Turing completeness, no existing
machine would have this capacity, since any existing machine's only
got finite storage.

>Also human operations tend to be somewhat 'random' whereas in a turing
>machine operations are 100% reliable.

This could be explained by plain non-determinism (guessing). You
could also add some probability stuff that would give the machine
confidence if it computes the probability to be over a certain value
and so believes it to be true (although in truth it might be false.)

mkb.
From: [Invalid-From-Line]
Subject: Re: Beyond CL?
Date: 
Message-ID: <dc2vrd$8ar$1@news.sap-ag.de>
"Matthias Buelow" <···@incubus.de> wrote in message
···················@news.dfncis.de...
> ··············@hotmail.com wrote:
> >I take it that is the human mind alone, without pen and paper or other help.
>
> I don't think that's a valid assumption. Just as the TM has an
> infinite tape, the mind should be allowed infinite "helper" storage
> as well, and aids to visualize things (it's not as if the paper were
> some oracle that could independently compute things, so it doesn't
> really add to the computing power of the brain).

You're getting this the wrong way around.
For most people, giving them some paper increases the computing power of the
person.

>
> >Just get the person to mentally do a singular value decomposition on a
> >random 7 x 7 matrix accurate to 7 decimal places.
>
> Well, obviously this is possible, otherwise no one would've come up
What is obviously possible? Or do you mean that a person can do the above
calculation with a piece of paper?
I was claiming that they can't do it without aids.

> >Also human operations tend to be somewhat 'random' whereas in a turing
> >machine operations are 100% reliable.
>
> This could be explained by plain non-determinism (guessing). You
> could also add some probability stuff that would give the machine
> confidence if it computes the probability to be over a certain value
> and so believes it to be true (although in truth it might be false.)

I am not talking about adding indeterminism to a computer. I am talking
about increasing the reliability of a persons brain when performing
algorithmic steps.
One option would be to have 10 people performing the same calculation and
checking their results with each other.

I would put it that:

A person can emulate a computer, but only one with very little memory, very
slow and not very reliable.
A computer can emulate a brain, but only one with very few neurons, and very
slowly.

The structure of current CPUs and human brains, and how they work, are so
different that I don't think comparisons are very useful.

Also I think that the use of computer metaphors to the behavior of neural
nets/human psychology is often misplaced.

Rene.
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <3kkletFunn1vU1@news.dfncis.de>
··············@hotmail.com wrote:

>"Matthias Buelow" <···@incubus.de> wrote in message
>···················@news.dfncis.de...
>> ··············@hotmail.com wrote:
>> >I take it that is the human mind alone, without pen and paper or other help.
>>
>> I don't think that's a valid assumption. Just as the TM has an
>> infinite tape, the mind should be allowed infinite "helper" storage
>> as well, and aids to visualize things (it's not as if the paper were
>> some oracle that could independently compute things, so it doesn't
>> really add to the computing power of the brain).
>
>You're getting this the wrong way around.
>For most people, giving them some paper increases the computing power of the
>person.

No, I think it's equivalent. It's like LR(1) parsers vs. LR(k)
parsers. Both are equivalent in the set of languages they recognize
but LR(k) is "easier" (for the grammar writer). In the same way,
paper makes solving the problem easier (there are people who can
memorize huge random numbers, and, having no (life-)time constraints
in an idealized consideration, a person surely could memorize
intermediate steps that would otherwise have been written on paper),
but to make it any different the tool itself (i.e., the paper) would
have to have computational power of its own, which it doesn't.
You can't write a question on paper and, after waiting some time,
have the answer appear on the paper.

>I am not talking about adding indeterminism to a computer. I am talking
>about increasing the reliability of a person's brain when performing
>algorithmic steps.
>One option would be to have 10 people performing the same calculation and
>checking their results with each other.

That's basically what PP/BPP-machines do (probabilistic TMs), where
a majority agreement of answers is considered to be the answer of
the machine. However, PP or BPP does not add computational power
to a TM (only may make certain computations faster).

>Also I think that the use of computer metaphors to the behavior of neural
>nets/human psychology is often misplaced.

At the moment at least. Maybe we'll know more in a couple decades.

mkb.
From: Ulrich Hobelmann
Subject: Re: Beyond CL?
Date: 
Message-ID: <3kkmgoFur9jkU1@individual.net>
Matthias Buelow wrote:
> Ulrich Hobelmann <···········@web.de> writes:
> 
>> Like assembly language the human mind is turing complete.
> 
> I'd like to see the proof.

Humans can simulate a turing machine.  They can even invent one 
(well, Turing could).

-- 
XML is a prime example of retarded innovation.
	-- Erik Meijer and Peter Drayton, Microsoft Corporation
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86iryylq9n.fsf@drjekyll.mkbuelow.net>
Ulrich Hobelmann <···········@web.de> writes:

>>> Like assembly language the human mind is turing complete.
>> I'd like to see the proof.
>
>Humans can simulate a turing machine.  They can even invent one (well,
>Turing could).

Ok, that was the easy direction.
Now for the converse.
Write a TM program that emulates the human mind.

Note that "Turing completeness" states not inclusion but equivalence
(mutual inclusion).

mkb.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <eL6dncxoRKCps3jfRVn-vQ@rogers.com>
Matthias Buelow wrote:
> Ulrich Hobelmann <···········@web.de> writes:
> 
> 
>>>>Like assembly language the human mind is turing complete.
>>>
>>>I'd like to see the proof.
>>
>>Humans can simulate a turing machine.  They can even invent one (well,
>>Turing could).
> 
> 
> Ok, that was the easy direction.
> Now for the converse.
> Write a TM program that emulates the human mind.

What's so conceptually difficult about a 100 billion element neural net?

> Note that "Turing completeness" states not inclusion but equivalence
> (mutual inclusion).

Yes, and? Would you like to postulate that there's things which the 
brain can compute but which are uncomputable by Turing machines?

-- 
Cameron MacKinnon
Toronto, Canada
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86ackalofs.fsf@drjekyll.mkbuelow.net>
Cameron MacKinnon <··········@clearspot.net> writes:

>Matthias Buelow wrote:
>> Ulrich Hobelmann <···········@web.de> writes:
>>>>>Like assembly language the human mind is turing complete.
>>>>I'd like to see the proof.
>>>Humans can simulate a turing machine.  They can even invent one (well,
>> Write a TM program that emulates the human mind.
>
>What's so conceptually difficult about a 100 billion element neural net?
>Yes, and? Would you like to postulate that there's things which the
>brain can compute but which are uncomputable by Turing machines?

I don't know, do you? Maybe the brain doesn't work fully digital or
discrete? Then again, maybe it does but just saying that one believes
so isn't very convincing.
But surely you could give me the proof, and relieve Ulrich of that
burden?

mkb.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <_dKdnXVZztgeoHjfRVn-jw@rogers.com>
Matthias Buelow wrote:

> Maybe the brain doesn't work fully digital or discrete? Then again,
> maybe it does but just saying that one believes so isn't very
> convincing. But surely you could give me the proof, and relieve
> Ulrich of that burden?

Pluralitas non est ponenda sine necessitate.


-- 
Cameron MacKinnon
Toronto, Canada
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <8664uylltm.fsf@drjekyll.mkbuelow.net>
Cameron MacKinnon <··········@clearspot.net> writes:

>> Maybe the brain doesn't work fully digital or discrete? Then again,
>> maybe it does but just saying that one believes so isn't very
>> convincing. But surely you could give me the proof, and relieve
>> Ulrich of that burden?
>
>Pluralitas non est ponenda sine necessitate.

For Occam's Razor to work, you have to show that the simpler
explanation is sufficient for explaining the phenomenon.

mkb.
From: Cameron MacKinnon
Subject: Re: Beyond CL?
Date: 
Message-ID: <-JmdncYzR6zo2XjfRVn-iw@rogers.com>
Matthias Buelow wrote:
> Cameron MacKinnon <··········@clearspot.net> writes:
> 
> 
>>>Maybe the brain doesn't work fully digital or discrete? Then again,
>>>maybe it does but just saying that one believes so isn't very
>>>convincing. But surely you could give me the proof, and relieve
>>>Ulrich of that burden?
>>
>>Pluralitas non est ponenda sine necessitate.
> 
> 
> For Occam's Razor to work, you have to show that the simpler
> explanation is sufficient for explaining the phenomenon.

Right. I asked you for a counterexample, something a brain can compute 
that a Turing machine can't. Did I miss your reply, or are you expecting 
me to continue arguing against a position that you won't defend?

-- 
Cameron MacKinnon
Toronto, Canada
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <861x5mljvv.fsf@drjekyll.mkbuelow.net>
Cameron MacKinnon <··········@clearspot.net> writes:

>Right. I asked you for a counterexample, something a brain can compute
>that a Turing machine can't. Did I miss your reply, or are you
>expecting me to continue arguing against a position that you won't
>defend?

I don't have to give a counterexample, since I haven't made any
proposition. I simply don't know if the brain can compute more than a
TM, or not. However, Ulrich said that the brain was Turing-complete,
which, while certainly being a reasonable conjecture to make, would be
a groundbreaking result if it indeed would've been proven. As far as I
know, the question is however still open.

mkb.
From: Matthias Buelow
Subject: Re: Beyond CL?
Date: 
Message-ID: <86k6joji0p.fsf@drjekyll.mkbuelow.net>
"Tron3k" <······@gmail.com> writes:

>···@createprocessasuser
>···@CreateProcessAsUser

I prefer unix:unix-fork...
scnr ;-)

mkb.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87r7dwdv4n.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:
> And you tell me what looks better:
>
> ···@createprocessasuser
> ···@CreateProcessAsUser
>
> And before you ask, I am not going to change it to
> create-process-as-user. I have my reasons.

We don't question this.  We question why you need a new lisp when you
can just use one Common Lisp expression to get the result you want:

[9]> (setf (readtable-case *readtable*) :preserve)
:PRESERVE
[10]> (PRINT ····@CreateProcessAsUser)

···@CreateProcessAsUser 
···@CreateProcessAsUser


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Our enemies are innovative and resourceful, and so are we. They never
stop thinking about new ways to harm our country and our people, and
neither do we. -- Georges W. Bush
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121698797.336739.165820@g14g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:
> "Tron3k" <······@gmail.com> writes:
> > And you tell me what looks better:
> >
> > ···@createprocessasuser
> > ···@CreateProcessAsUser
> >
> > And before you ask, I am not going to change it to
> > create-process-as-user. I have my reasons.
>
> We don't question this.  We question why you need a new lisp when you
> can just use one Common Lisp expression to get the result you want:
>
> [9]> (setf (readtable-case *readtable*) :preserve)
> :PRESERVE
> [10]> (PRINT ····@CreateProcessAsUser)
>
> ···@CreateProcessAsUser
> ···@CreateProcessAsUser

Oh, I see what the objection is about, then. Yeah I know you can do
that readtable thing in Common Lisp. It's not that I need a new Lisp
because of that. Let's take it as a given that I'm going to make a new
Lisp. All I'm saying is that I will make it case-sensitive since I have
to make some choice, and that's the best choice for me.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ek9wdr3o.fsf@thalassa.informatimago.com>
"Tron3k" <······@gmail.com> writes:

> Pascal Bourguignon wrote:
>> "Tron3k" <······@gmail.com> writes:
>> > And you tell me what looks better:
>> >
>> > ···@createprocessasuser
>> > ···@CreateProcessAsUser
>> >
>> > And before you ask, I am not going to change it to
>> > create-process-as-user. I have my reasons.
>>
>> We don't question this.  We question why you need a new lisp when you
>> can just use one Common Lisp expression to get the result you want:
>>
>> [9]> (setf (readtable-case *readtable*) :preserve)
>> :PRESERVE
>> [10]> (PRINT ····@CreateProcessAsUser)
>>
>> ···@CreateProcessAsUser
>> ···@CreateProcessAsUser
>
> Oh, I see what the objection is about, then. Yeah I know you can do
> that readtable thing in Common Lisp. It's not that I need a new Lisp
> because of that. Let's take it as a given that I'm going to make a new
> Lisp. All I'm saying is that I will make it case-sensitive since I have
> to make some choice, and that's the best choice for me.

See, it's smoother like this :-)


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Kitty like plastic.
Confuses for litter box.
Don't leave tarp around.
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <QnwKe.8567$p%3.36148@typhoon.sonic.net>
Tron3k wrote:

> I am not a troll! 

Dude, if you ever have to actually say that, the very fact
makes it strongly suspect.

> I'm here because I love Lisp and I *actually am*
> making a new one! What's the problem with everyone here? This is not
> what I expected in comp.lang.lisp. I thought Lispy people *loved*
> playing around with new ideas for Lisps. 

Well, we're here.  I for one have more allegiance to Lisp than
Common Lisp.  But the main majority on the newsgroup actually
aren't interested in any lisp other than Common Lisp, and
seem to regard the idea of other lisps as threatening.

It's probably a cultural issue; the proliferation of
incompatible dialects and fragmentation of the Lisp community
in the past were blamed for a widespread rejection of Lisp
and kicked off the standardization effort that eventually
resulted in Common Lisp, so when you start talking about a
new dialect, bad feelings about rejection, fears of
fragmentation, and emotional investment in the current
compromise all indicate that you should be stopped.

You'll find more interest in actual lispy design over on
comp.lang.scheme (the less-rigid spec tends to attract folk
who have more desire to experiment).  Also, since you're
talking about a lisp-1, it'd be less of a strain for
schemers to read your pseudocode.

Right now, there isn't really a newsgroup that is devoted
to a broadbased universe of Lisp.  I'd like to see the
groups organized as comp.lang.commonlisp and
comp.lang.scheme, leaving comp.lang.lisp for actual
discussion of lispy languages in general; but that's not
going to happen.

> Personally I find it *so* fun,
> I do it all the time. On the train I carry around a clipboard to jot
> down new Lisp ideas *every single day*. What is going on here?

What is going on here is that you are a language *designer*.
And you are talking mainly to language *users* of a
different language with which you intend to compete.

> And before you ask, I am not going to change it to
> create-process-as-user. I have my reasons.

Being mysterious is detrimental to your credibility.  If
you intend to use the - character as syntax for some purpose,
just say so.

I'm also a lisp designer.  I use (!fname arg1 arg2) as a
shorthand for (not (fname arg1 arg2)).  Syntax.  Big woop.

				Bear
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <slycm5ed.fsf@ccs.neu.edu>
Kenny Tilton <·······@nyc.rr.com> writes:

> Now for the bad news: the only folks sucked into your thread are the
> same ones who fall for every troll...

I prefer to think that I'm easily entertained by trolls.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <deUCe.1569$Ow4.756862@twister.nyc.rr.com>
Joe Marshall wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Now for the bad news: the only folks sucked into your thread are the
>>same ones who fall for every troll...
> 
> 
> I prefer to think that I'm easily entertained by trolls.
> 

Too late. You have been assigned to porting Dietz's compliance test to 
TronLisp. :)

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <d5pfvb60.fsf@comcast.net>
Kenny Tilton <·······@nyc.rr.com> writes:

> Joe Marshall wrote:
>> Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>>Now for the bad news: the only folks sucked into your thread are the
>>>same ones who fall for every troll...
>> I prefer to think that I'm easily entertained by trolls.
>>
>
> Too late. You have been assigned to porting Dietz's compliance test to
> TronLisp. :)

Good Lord!  Can't I just say 3 `Our Fathers' and 5 `Hail Marys'?



-- 
~jrm
From: Paul Dietz
Subject: Re: Beyond CL?
Date: 
Message-ID: <dbj945$3p3$1@avnika.corp.mot.com>
Kenny Tilton wrote:

> Too late. You have been assigned to porting Dietz's compliance test to 
> TronLisp. :)

That should take, oh, two days, right?  :)

	Paul
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <8y02vic6.fsf@comcast.net>
Paul Dietz <············@motorola.com> writes:

> Kenny Tilton wrote:
>
>> Too late. You have been assigned to porting Dietz's compliance test
>> to TronLisp. :)
>
> That should take, oh, two days, right?  :)

It depends whether Tron lets us in on his super-secret hack that makes
this all amazingly easy.


-- 
~jrm
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121792938.400057.260890@g49g2000cwa.googlegroups.com>
Joe Marshall wrote:
> Paul Dietz <············@motorola.com> writes:
>
> > Kenny Tilton wrote:
> >
> >> Too late. You have been assigned to porting Dietz's compliance test
> >> to TronLisp. :)
> >
> > That should take, oh, two days, right?  :)
>
> It depends whether Tron lets us in on his super-secret hack that makes
> this all amazingly easy.
>
>
> --
> ~jrm

Not yet. Here's the plan:

Right now I'm making the reader. When that's done I'll make the
evaluator. Naturally, I'm doing this all in Common Lisp.

When I'm done the reader and the evaluator, then I will make a web
interface to the language. You guys will be able to evaluate anything
you want and play around with stuff. That way I can get feedback as to
which features are bad and which features need to be added.

Sounds good? If I work hard I could get a simple version up in a week,
but more likely with me being lazy it'll take a month.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <Y_bDe.1706$Ow4.832581@twister.nyc.rr.com>
Tron3k wrote:
> Joe Marshall wrote:
> 
>>Paul Dietz <············@motorola.com> writes:
>>
>>
>>>Kenny Tilton wrote:
>>>
>>>
>>>>Too late. You have been assigned to porting Dietz's compliance test
>>>>to TronLisp. :)
>>>
>>>That should take, oh, two days, right?  :)
>>
>>It depends whether Tron lets us in on his super-secret hack that makes
>>this all amazingly easy.
>>
>>
>>--
>>~jrm
> 
> 
> Not yet. Here's the plan:
> 
> Right now I'm making the reader. When that's done I'll make the
> evaluator. Naturally, I'm doing this all in Common Lisp.

Impressive for a tired old piece of crap of a language in need of so 
much work.

> 
> When I'm done the reader and the evaluator, then I will make a web
> interface to the language. You guys will be able to evaluate anything
> you want and play around with stuff. That way I can get feedback as to
> which features are bad and which features need to be added.
> 
> Sounds good? If I work hard I could get a simple version up in a week,
> but more likely with me being lazy it'll take a month.
> 

This /is/ impressive, slipping the schedule twice in one post. OK, all 
you math types, how does this sequence continue: 2, 7, 30, ?, ?..... 
what, TronCL 2010?

Seriously, tho, I am interested in hearing about Tron's breakthru. Synopsis?

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Thomas F. Burdick
Subject: Re: Beyond CL?
Date: 
Message-ID: <xcvoe8ypntx.fsf@conquest.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> Tron3k wrote:
> > When I'm done the reader and the evaluator, then I will make a web
> > interface to the language. You guys will be able to evaluate anything
> > you want and play around with stuff. That way I can get feedback as to
> > which features are bad and which features need to be added.
> > 
> > Sounds good? If I work hard I could get a simple version up in a week,
> > but more likely with me being lazy it'll take a month.
> 
> This /is/ impressive, slipping the schedule twice in one post. OK, all 
> you math types, how does this sequence continue: 2, 7, 30, ?, ?..... 
> what, TronCL 2010?

There are infinitely many, but this seems as good as any:

  (defun series (n)
    (labels ((next (s)
  	       (append s (list (* 10 (next-prime (car (last s)))))))
  	     (s (n s)
  	       (if (<= n 0)
  		   s
  		   (s (1- n) (next s)))))
      (s n '(2 7 30))))

  (series 5) => (2 7 30 310 3110 31190 311930 3119510)
  
So that's 10 months, or 8.5 years, or 8.5 decades, or ... sounds about right.
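(NEXT-PRIME isn't standard CL, by the way; a throwaway trial-division
sketch is enough to make the snippet actually run:)

  (defun primep (n)
    ;; naive trial division -- plenty for toy inputs like these
    (and (> n 1)
         (loop for d from 2 to (isqrt n)
               never (zerop (mod n d)))))

  (defun next-prime (n)
    ;; smallest prime strictly greater than N
    (loop for candidate from (1+ n)
          when (primep candidate) return candidate))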

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | Free Mumia Abu-Jamal! |
     ,--'    _,'   | Abolish the racist    |
    /       /      | death penalty!        |
   (   -.  |       `-----------------------'
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121803376.872970.179000@o13g2000cwo.googlegroups.com>
Kenny Tilton wrote:
> Tron3k wrote:
> > Joe Marshall wrote:
> >
> >>Paul Dietz <············@motorola.com> writes:
> >>
> >>
> >>>Kenny Tilton wrote:
> >>>
> >>>
> >>>>Too late. You have been assigned to porting Dietz's compliance test
> >>>>to TronLisp. :)
> >>>
> >>>That should take, oh, two days, right?  :)
> >>
> >>It depends whether Tron lets us in on his super-secret hack that makes
> >>this all amazingly easy.
> >>
> >>
> >>--
> >>~jrm
> >
> >
> > Not yet. Here's the plan:
> >
> > Right now I'm making the reader. When that's done I'll make the
> > evaluator. Naturally, I'm doing this all in Common Lisp.
>
> Impressive for a tired old piece of crap of a language in need of so
> much work.

Hmm, I distinctly remember saying that Common Lisp is the best existing
language.

> >
> > When I'm done the reader and the evaluator, then I will make a web
> > interface to the language. You guys will be able to evaluate anything
> > you want and play around with stuff. That way I can get feedback as to
> > which features are bad and which features need to be added.
> >
> > Sounds good? If I work hard I could get a simple version up in a week,
> > but more likely with me being lazy it'll take a month.
> >
>
> This /is/ impressive, slipping the schedule twice in one post. OK, all
> you math types, how does this sequence continue: 2, 7, 30, ?, ?.....
> what, TronCL 2010?
>

Hmm, the 2 days thing I said was how long I've been working on this, I
didn't ever say "something will be done in 2 days."

While you're here, I'm wondering why you thought I was confused about
Common Lisp's case sensitivity and all that. Is it because I said I'm
making my language case sensitive? Yes, I *know* Common Lisp is
case-sensitive. I think you mistakenly assumed I thought it wasn't.
Please clarify what that was all about.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <h%cDe.1711$Ow4.835094@twister.nyc.rr.com>
Tron3k wrote:

> Kenny Tilton wrote:
> 
>>Tron3k wrote:
>>
>>>Joe Marshall wrote:
>>>
>>>
>>>>Paul Dietz <············@motorola.com> writes:
>>>>
>>>>
>>>>
>>>>>Kenny Tilton wrote:
>>>>>
>>>>>
>>>>>
>>>>>>Too late. You have been assigned to porting Dietz's compliance test
>>>>>>to TronLisp. :)
>>>>>
>>>>>That should take, oh, two days, right?  :)
>>>>
>>>>It depends whether Tron lets us in on his super-secret hack that makes
>>>>this all amazingly easy.
>>>>
>>>>
>>>>--
>>>>~jrm
>>>
>>>
>>>Not yet. Here's the plan:
>>>
>>>Right now I'm making the reader. When that's done I'll make the
>>>evaluator. Naturally, I'm doing this all in Common Lisp.
>>
>>Impressive for a tired old piece of crap of a language in need of so
>>much work.
> 
> 
> Hmm, I distinctly remember saying that Common Lisp is the best existing
> language.
> 
> 
>>>When I'm done the reader and the evaluator, then I will make a web
>>>interface to the language. You guys will be able to evaluate anything
>>>you want and play around with stuff. That way I can get feedback as to
>>>which features are bad and which features need to be added.
>>>
>>>Sounds good? If I work hard I could get a simple version up in a week,
>>>but more likely with me being lazy it'll take a month.
>>>
>>
>>This /is/ impressive, slipping the schedule twice in one post. OK, all
>>you math types, how does this sequence continue: 2, 7, 30, ?, ?.....
>>what, TronCL 2010?
>>
> 
> 
> Hmm, the 2 days thing I said was how long I've been working on this, I
> didn't ever say "something will be done in 2 days."
> 
> While you're here, I'm wondering why you thought I was confused about
> Common Lisp's case sensitivity and all that. Is it because I said I'm
> making my language case sensitive? Yes, I *know* Common Lisp is
> case-sensitive. I think you mistakenly assumed I thought it wasn't.
> Please clarify what that was all about.

I did!!!! <sigh> Tell you what, tell us your super-hack and I will spell 
it out for you in any case.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Tron3k
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121804325.229493.312350@g14g2000cwa.googlegroups.com>
Kenny Tilton wrote:
> Tron3k wrote:
>
> > Kenny Tilton wrote:
> >
> >>Tron3k wrote:
> >>
> >>>Joe Marshall wrote:
> >>>
> >>>
> >>>>Paul Dietz <············@motorola.com> writes:
> >>>>
> >>>>
> >>>>
> >>>>>Kenny Tilton wrote:
> >>>>>
> >>>>>
> >>>>>
> >>>>>>Too late. You have been assigned to porting Dietz's compliance test
> >>>>>>to TronLisp. :)
> >>>>>
> >>>>>That should take, oh, two days, right?  :)
> >>>>
> >>>>It depends whether Tron lets us in on his super-secret hack that makes
> >>>>this all amazingly easy.
> >>>>
> >>>>
> >>>>--
> >>>>~jrm
> >>>
> >>>
> >>>Not yet. Here's the plan:
> >>>
> >>>Right now I'm making the reader. When that's done I'll make the
> >>>evaluator. Naturally, I'm doing this all in Common Lisp.
> >>
> >>Impressive for a tired old piece of crap of a language in need of so
> >>much work.
> >
> >
> > Hmm, I distinctly remember saying that Common Lisp is the best existing
> > language.
> >
> >
> >>>When I'm done the reader and the evaluator, then I will make a web
> >>>interface to the language. You guys will be able to evaluate anything
> >>>you want and play around with stuff. That way I can get feedback as to
> >>>which features are bad and which features need to be added.
> >>>
> >>>Sounds good? If I work hard I could get a simple version up in a week,
> >>>but more likely with me being lazy it'll take a month.
> >>>
> >>
> >>This /is/ impressive, slipping the schedule twice in one post. OK, all
> >>you math types, how does this sequence continue: 2, 7, 30, ?, ?.....
> >>what, TronCL 2010?
> >>
> >
> >
> > Hmm, the 2 days thing I said was how long I've been working on this, I
> > didn't ever say "something will be done in 2 days."
> >
> > While you're here, I'm wondering why you thought I was confused about
> > Common Lisp's case sensitivity and all that. Is it because I said I'm
> > making my language case sensitive? Yes, I *know* Common Lisp is
> > case-sensitive. I think you mistakenly assumed I thought it wasn't.
> > Please clarify what that was all about.
>
> I did!!!! <sigh>

So you already clarified this for me? If so, then I'm really missing
something. I'm almost certain you're probably a better CL programmer
than I am, so there could be something important you know and I don't.
Could you just tell me? After all, we're all Lispers here; we're on the
same side.
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <85jDe.1733$Ow4.865833@twister.nyc.rr.com>
Tron3k wrote:
> So you already clarified this for me? If so, then I'm really missing
> something. I'm almost certain you're probably a better CL programmer
> than I am, so there could be something important you know and I don't.
> Could you just tell me?

Try to keep up, will you? I am jerking your chain, not you mine.

> After all, we're all Lispers here; we're on the
> same side.

No, you are an asshole doing his best to despoil c.l.l, and I am an 
asshole doing his best to hose out your crap.

But keep it up, I love how these dweebs keep rising to your bait.

ps. I know who you are.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Harald Hanche-Olsen
Subject: Re: Beyond CL?
Date: 
Message-ID: <pcofyuaqw7s.fsf@shuttle.math.ntnu.no>
+ Kenny Tilton <·······@nyc.rr.com>:

| OK, all you math types, how does this sequence continue: 2, 7, 30, ?,

As Thomas Burdick says there are many possibilities.  One of them is

  2,7,30,127,538,2279,9654,40895,173234,733831,3108558,
  13168063,55780810,236291303,1000946022,4240075391,
  17961247586,76085065735,322301510526,1365291107839,
  5783465941882,24499154875367,103780085443350

  http://www.research.att.com/cgi-bin/access.cgi/as/njas/sequences/eisA.cgi?Anum=A097924

-- 
* Harald Hanche-Olsen     <URL:http://www.math.ntnu.no/~hanche/>
- Debating gives most of us much more psychological satisfaction
  than thinking does: but it deprives us of whatever chance there is
  of getting closer to the truth.  -- C.P. Snow
From: Peter Seibel
Subject: Re: Beyond CL?
Date: 
Message-ID: <m2wtnpfcub.fsf@gigamonkeys.com>
Pascal Costanza <··@p-cos.net> writes:

> A _good_ suggestion for improvement should cover such aspects in
> order to have some convincing power. Stuff like "switch from Lisp-n
> to Lisp-1" would have to objectively mention all the drawbacks of
> such a change. See for example the excellent discussion at
> http://www.nhplace.com/kent/Papers/Technical-Issues.html

It's been a while since I read that paper but one issue that I don't
recall it discussing which I thought of while pondering Henry Baker's
ILC talk is the issue of renaming. I was imagining a Lisp system
(probably one that used an Interlisp-style structure editor) that
supports renaming things as a simple operation. That is, if I've
defined a function FOO and have used it a bunch of places then I should
be able to tell Lisp, (rename 'function 'foo 'bar) (or maybe (rename
#'foo 'bar)). Then I started thinking about macros. The immediate
problem is when you have an expression like (my-macro foo) you don't
know how the symbol FOO will be used (if at all) in the expansion of
MY-MACRO. But that's presumably possible--just macro expand the thing
and then see if FOO is used as the name of a function in the
expansion. (Okay, so there's some hair even then, but probably
solvable.) But then imagine this macro, in a Lisp-2:

  (defmacro self-call (name) `(,name ,name))

Now if my program contains the expression (self-call foo) what the
heck is (rename 'function 'foo 'bar) supposed to do. After resisting
it for a while since I'm generally a Lisp-2 fan, I realized that in a
Lisp-1 this problem goes away. Which is not necessarily enough to sway
me to the Lisp-1 camp--maybe were I to implement such a mythical Lisp
system, I'd just signal an error at RENAME time that gives you an
option of replacing the SELF-CALL expression with its expansion (with
the function calls then appropriately renamed) or something. Anyway,
just something I was thinking about, apropos nothing, really.
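To make the bind concrete (just a sketch, since RENAME itself is
hypothetical here):

  (defmacro self-call (name) `(,name ,name))   ; as above

  (macroexpand-1 '(self-call foo))             ; => (FOO FOO)
  ;; After (rename 'function 'foo 'bar), should the source now read
  ;; (self-call foo) or (self-call bar)?  Either choice also rewrites
  ;; the argument position, because NAME is spliced in as both the
  ;; operator and its operand.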

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jvs2uFrq0dpU1@individual.net>
Peter Seibel wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>A _good_ suggestion for improvement should cover such aspects in
>>order to have some convincing power. Stuff like "switch from Lisp-n
>>to Lisp-1" would have to objectively mention all the drawbacks of
>>such a change. See for example the excellent discussion at
>>http://www.nhplace.com/kent/Papers/Technical-Issues.html
> 
> It's been a while since I read that paper but one issue that I don't
> recall it discussing which I thought of while pondering Henry Baker's
> ILC talk is the issue of renaming. I was imagining a Lisp system
> (probably one that used an Interlisp-style structure editor) that
> supports renaming things as a simple operation. That is, if I've
> defined a function FOO and have used it a bunch of places then I should
> be able to tell Lisp, (rename 'function 'foo 'bar) (or maybe (rename
> #'foo 'bar)). Then I started thinking about macros.
[...]

I was going to reply that you don't need macros to create examples that 
screw you in this regard. Indeed, (apply fun args) already has the 
problem because fun could be bound to a symbol whose symbol-function 
would be called. So you would need to replace that form by something like

(apply (case fun
         (foo 'bar)
         (t fun))
       args)

so that things wouldn't break at runtime.
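To spell out why the bare (apply fun args) escapes a purely syntactic
rename, a quick sketch:

  (defun foo (x) (* 2 x))

  (let ((fun 'foo))        ; FUN holds the *symbol* FOO, not #'FOO
    (apply fun '(21)))     ; => 42, dispatched via (symbol-function 'foo)
  ;; A source-level rename of #'FOO to #'BAR never sees this call site,
  ;; because the symbol only reaches SYMBOL-FUNCTION at run time.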

However, then I realized that a notion of symbol-function doesn't exist 
in Scheme. Even eval is restricted in Scheme such that this problem 
cannot be recreated.


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: William D Clinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121662793.953593.73270@g47g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> Another example is the switch from converting identifiers to all upper
> case by default (as in ANSI Common Lisp) vs. keeping case (as in Allegro
> Common Lisp "modern" mode and as proposed for R6RS Scheme). There was
> recently a discussion at comp.lang.scheme about this issue, and in a
> chat at ILC'05, Steve Haflich explained Franz's reasons for the modern
> mode. Essentially, keeping case avoids all kinds of problems with
> characters that don't have unambiguous mappings between lower case and
> upper case, and since the whole world has decided to adopt Unicode, a
> switch to case sensitivity would solve a whole bunch of problems at
> once....

Just a point of information:  I am told, by people who should know,
that the Unicode standard defines unambiguous mappings between lower
and upper case that cover all code points.  These mappings are probably
not satisfactory to all users, and they are not as easy to implement
as implementors would like, but they are unambiguously defined.

Before someone asks about Eszett, I understand it to be a fixed point
of both mappings, making it both lower case and upper case.  That may
be one of the things that a lot of people don't like.

Will
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k196dFrv8goU2@individual.net>
William D Clinger wrote:

> Just a point of information:  I am told, by people who should know,
> that the Unicode standard defines unambiguous mappings between lower
> and upper case that cover all code points.  These mappings are probably
> not satisfactory to all users, and they are not as easy to implement
> as implementors would like, but they are unambiguously defined.
> 
> Before someone asks about Eszett, I understand it to be a fixed point
> of both mappings, making it both lower case and upper case.  That may
> be one of the things that a lot of people don't like.

...which I think is an acceptable solution.

However, if that's the case, why the plan to switch to case sensitivity 
for R6RS?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: William D Clinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121699983.919197.288570@g43g2000cwa.googlegroups.com>
Pascal Costanza wrote:
> However, if that's the case, why the plan to switch to case sensitivity
> for R6RS?

A lot of people have been pushing for case sensitivity
for a long time.  Although the reasons they give vary,
the one I hear most often is that most other languages
(programming languages, that is, not natural languages)
are case sensitive, so that Scheme's case insensitivity
creates an unnecessary barrier to interoperation with
other languages.  That's a legitimate reason.

Interoperation with Common Lisp is no longer viewed as
a major issue.  Common Lisp's idiosyncratic package
system has been more of a barrier to compatibility at
the identifier level than case sensitivity is expected
to be.  Furthermore we want to add Unicode support to
Scheme, which is likely to create incompatibility with
Common Lisp anyway.

The Unicode notion of strings is hairy, and does not
correspond to the Lisp/Scheme notion of a string as a
sequence of characters.  The Unicode notion of a code
point is pretty simple, however, and corresponds to
the Lisp/Scheme notion of a character.  Taking strings
to be sequences of code points seems to be the most
reasonable position for Common Lisp and Scheme to take,
leaving the Unicode notion of string to libraries and
i/o that can be built on top of the sequence-of-code-points
notion.  Then upper-casing or lower-casing a Lisp/Scheme
string can be done by upper-casing or lower-casing its
constituent code points independently, which is a lot
simpler than upper-casing or lower-casing the Unicode
notion of a string.
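As a sketch of what "independently" buys you (purely illustrative, not
from any spec):

  ;; Code-point-wise casing is just an element-wise MAP.  A full,
  ;; language-sensitive upcase (e.g. Eszett -> "SS") can change the
  ;; length of the string, so it cannot be written this way.
  (defun codepoint-upcase (string)
    (map 'string #'char-upcase string))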

Will
From: Joe Marshall
Subject: Re: Beyond CL?
Date: 
Message-ID: <k6jom33k.fsf@ccs.neu.edu>
"William D Clinger" <··········@verizon.net> writes:

> Pascal Costanza wrote:
>> However, if that's the case, why the plan to switch to case sensitivity
>> for R6RS?
>
> A lot of people have been pushing for case sensitivity
> for a long time.  Although the reasons they give vary,
> the one I hear most often is that most other languages
> (programming languages, that is, not natural languages)
> are case sensitive, so that Scheme's case insensitivity
> creates an unnecessary barrier to interoperation with
> other languages.  That's a legitimate reason.

It's *almost* a legitimate reason.

Many (perhaps most) versions of Scheme, like CL, support
case-sensitive symbols.  It is usually the reader that folds the
case.  There is often a syntax for disabling the folding (usually a
syntax that allows you to use nearly any sequence of characters as a
symbol).
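For instance, in CL (and analogously with |...| escapes in many
Schemes):

  (eq 'foo 'FOO)          ; => T     (the reader folds both to FOO)
  (symbol-name '|Foo|)    ; => "Foo" (escaped, so case survives reading)
  ;; and (setf (readtable-case *readtable*) :preserve) turns the
  ;; folding off wholesale.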

> The Unicode notion of strings is hairy, and does not
> correspond to the Lisp/Scheme notion of a string as a
> sequence of characters.  The Unicode notion of a code
> point is pretty simple, however, and corresponds to
> the Lisp/Scheme notion of a character.  Taking strings
> to be sequences of code points seems to be the most
> reasonable position for Common Lisp and Scheme to take,
> leaving the Unicode notion of string to libraries and
> i/o that can be built on top of the sequence-of-code-points
> notion.  Then upper-casing or lower-casing a Lisp/Scheme
> string can be done by upper-casing or lower-casing its
> constituent code points independently, which is a lot
> simpler than upper-casing or lower-casing the Unicode
> notion of a string.

True, but the reason that Unicode has a string-oriented casing is that
code-point oriented casing doesn't work in all languages.
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3k225lFsadusU1@individual.net>
William D Clinger wrote:
> Pascal Costanza wrote:
> 
>>However, if that's the case, why the plan to switch to case sensitivity
>>for R6RS?

[...]

OK. Thanks for your response.

> Interoperation with Common Lisp is no longer viewed as
> a major issue.

I am surprised. Was that ever a goal?


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: William D Clinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121782530.746540.187400@g14g2000cwa.googlegroups.com>
Pascal Costanza quoting me:
> > Interoperation with Common Lisp is no longer viewed as
> > a major issue.
>
> I am surprised. Was that ever a goal?

We're talking about compatibility of external representations,
which was important to a number of people during the 1980s.
The most influential of these people drifted away from both
Common Lisp and Scheme during the 1990s.

Will
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <ubr4yn5de.fsf@nhplace.com>
"William D Clinger" <··········@verizon.net> writes:

> Pascal Costanza quoting me:
> > > Interoperation with Common Lisp is no longer viewed as
> > > a major issue.
> >
> > I am surprised. Was that ever a goal?
> 
> We're talking about compatibility of external representations,
> which was important to a number of people during the 1980s.
> The most influential of these people drifted away from both
> Common Lisp and Scheme during the 1990s.
  
(Is there some reason you can't say this person's name?  Perhaps he/she
 doesn't want it known that he/she is so influential?  Perhaps the drift
 is a secret? ...)

Will, I hope you're not suggesting that this loss was caused by CL's
lack of caring about these issues, or that Scheme has been the eternal
champion of such matters.  I certainly sat in lots of Scheme design
meetings where I pushed for obvious things like branch cuts, error
handling, and other "useful" things and was told that such things
would either (a) increase the size [i.e., page count] of the spec
[something some authors absolutely were not prepared to do] or (b)
make the language "too useful" [thus meaning too many people would
flock to it and it would be the end of the designers' ability to do
the things to the language that _they_ wanted to do].  

The main point I was making above is that the history of how we got to
where we are today with both languages is relatively complex, and
there are many points of view.  So perhaps we should leave history
aside and focus on what can be done today.  I almost get the sense
from your remark that you think it is a lost cause to think the CL
community would care about these matters and that's why you are being
so (over)brief.  I hope that's not so.  I think there are potentially
interesting things that both communities could gain from some back and
forth dialog about what's going on with this in the Scheme community.
I hope you will periodically share thoughts on this if you're plugged
into the ongoing process.

Here are some starter questions for you:

Symbols.  Certainly the issue of symbols is a minor irritant, but I
  haven't seen that be a stopping point for Scheme emulators written in
  CL.  I recognize that you don't like the CL package system, but why
  can't you just ignore it, make a single package you do like, and use
that?  CL specifies nothing about the layout of symbols anyway, only
  about their accessors.  Why exactly is this a problem in callout, since
  most languages other than Scheme don't have symbols, and since in Scheme
  most people just use symbols in the CL SCHEME package and ignore other
symbols?  If appropriately abstracted, isn't that enough?

Continuations and Tail-Calling.  There is a well-known issue over
  continuations and tail calling that I won't open here since I assume
  you're not saying that's an issue of datatype layout.

Numbers.  CL has numbers that are pretty much able to be type-compatible
  with IEEE standards, except that we have a representation for rationals.
  Scheme has a bunch of types that are non-standard.  What is the criticism
  of CL here?

Structs.  Both CL and Scheme have opaque types for which you are not 
  told the specific data layout.  Are you saying you're going to start
  specifying it, so that people can pass structs directly to other languages?
  If so, that's certainly interesting.  How will this interact with the GC?
  How will you manage tag bits?  If not, then how is what you do different
  than what CL does?

  Manifest types and pointer indirection are always issues, since
  building recursive structures in Lisp without re-tagging the
  recursive parts is tricky.  (In languages that are statically typed,
  you can build a tree or other recursive type where type info is
  implicit in variables and slots but not manifest in all the parts.
  This saves some space at the cost of some flexibility and is a
  relatively deep design trade-off that most of the Lisp family
  accepts.  In just a few seconds' thought, it isn't apparent how
  Scheme could do any better at the specification level, though
  certainly I could imagine some block compilation of Scheme doing
  better by special-casing.)

Characters and Strings.  You have remarked about Unicode in specific.
  I'd be interested to hear what you think is a problem about CL in
  Unicode.  I mention this specifically because very late in the CL
  design process, the Japanese subcommittee studied the issue of CL's
  character and string types specifically and gave us a report on what
  we would need in order to keep from being incompatible with
  international character standards at the time, including subtle issues
  like providing for external formats in files, etc. different from 
character coding issues for "in core" characters.  We made all of their
  recommended changes, and while we don't provide specific support for
  Unicode, my understanding is that several conforming implementations
  do provide such compatibility.  So what's your specific critique of CL
  other than that it simply needs a new layer that defines additional
  functionality?  I hear an indictment, but no specifics.

  In general, there is nothing in CL that precludes implementations providing
  additional datatypes that are compatible in specific layout with other
  languages, but neither is there anything that requires much in the way
  of specific layouts other than strings, which are problematic for all
  languages because of the issue of char[] vs string, and whether size
  is a constant time operation or not.

Anyway, I'd like to hear your critique of where CL has ignored something it
could or should be doing, or where Scheme is adding something that perhaps
CL could add, or where the languages have simply made different choices and
what the implications of those choices are.  You seem to be commenting on
some such difference, but absent more detail I cannot figure out what, and
without more exposition I am / we are incapable of learning from what you
appear to know...
From: Takehiko Abe
Subject: Re: Beyond CL?
Date: 
Message-ID: <keke-2007051320330001@192.168.1.2>
Kent M Pitman wrote:

> Characters and Strings.  You have remarked about Unicode in specific.
>   I'd be interested to hear what you think is a problem about CL in
>   Unicode. 

It is not CL's fault. The problem is that Unicode does not really
have a notion of character. For instance, combining diacritical
marks attach to a base character and form a new character. But
it has pre-comibined characters too. So if we compare strings,
we need to normalize them to a standard form first. Also, as far
as I know there's no restriction on a combination of a base
char + combining marks (that means we have an infinite number of
'characters'.)
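A tiny example of the comparison problem, assuming an implementation
whose characters are Unicode code points:

  ;; "e with acute" precomposed vs. base char + combining acute accent
  (string= (string (code-char #x00E9))
           (coerce (list (code-char #x0065) (code-char #x0301)) 'string))
  ;; => NIL, although both display as the same character -- hence the
  ;; need to normalize before comparing.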


>  I mention this specifically because very late in the CL
>   design process, the Japanese subcommittee studied the issue of CL's
>   character and string types specifically and gave us a report on what
>   we would need in order to keep from being incompatible with
>   international character standards at the time, including subtle issues
>   like providing for external formats in files, etc. different from 
>   character coding issues for "in core" characters.  We made all of their
>   recommended changes, 

I think the effort is paying off. Very impressive.
From: Kent M Pitman
Subject: Re: Beyond CL?
Date: 
Message-ID: <u4qaqcd41.fsf@nhplace.com>
····@gol.com (Takehiko Abe) writes:

> Kent M Pitman wrote:
> 
> > Characters and Strings.  You have remarked about Unicode in specific.
> >   I'd be interested to hear what you think is a problem about CL in
> >   Unicode. 
> 
> It is not CL's fault. The problem is that Unicode does not really
> have a notion of character. For instance, combining diacritical
> marks attach to a base character and form a new character. But
> it has pre-combined characters too. So if we compare strings,
> we need to normalize them to a standard form first. Also, as far
> as I know there's no restriction on a combination of a base
> char + combining marks (that means we have an infinite number of
> 'characters'.)

Is it your sense that this cannot be resolved by simply providing extra 
operators?  Or is there a better representation we need to have in order
to enable efficient comparison?
From: Takehiko Abe
Subject: Re: Beyond CL?
Date: 
Message-ID: <keke-2007051348070001@192.168.1.2>
Kent M Pitman <······@nhplace.com> wrote:

> > It is not CL's fault. The problem is that Unicode does not really
> > have a notion of character. For instance, combining diacritical
> > marks attach to a base character and form a new character. But
> > it has pre-comibined characters too. So if we compare strings,
> > we need to normalize them to a standard form first. Also, as far
> > as I know there's no restriction on a combination of a base
> > char + combining marks (that means, we have infinite number of
> > 'characters'.)
> 
> Is it your sense that this cannot be resolved by simply providing extra 
> operators?  

No. I think it can be resolved by writing new operators one
needs.

> Or is there a better representation we need to have in order
> to enable efficient comparison?

No, I don't think there is a better representation.
(meaning I can't think of any.)
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <mJ7Ne.9806$p%3.38404@typhoon.sonic.net>
Takehiko Abe wrote:
> Kent M Pitman wrote:
> 
> 
>>Characters and Strings.  You have remarked about Unicode in specific.
>>  I'd be interested to hear what you think is a problem about CL in
>>  Unicode. 
> 
> 
> It is not CL's fault. The problem is that Unicode does not really
> have a notion of character. For instance, combining diacritical
> marks attach to a base character and form a new character. But
> it has pre-comibined characters too. So if we compare strings,
> we need to normalize them to a standard form first. Also, as far
> as I know there's no restriction on a combination of a base
> char + combining marks (that means, we have infinite number of
> 'characters'.)


I guess I'm notorious as the agitator in the scheme camp who has
been arguing in favor of a character type that corresponds to
what in unicode parlance is a "grapheme," whereas the emerging
consensus seems to be for a character type that corresponds to
what in unicode parlance is a "codepoint."

My thinking is that with graphemes, you get a closer
correspondence between the programmer's and the user's conception
of a character, which should smooth out some communication
and comprehension issues.  You could also bury issues of
normalization below the level of programmer attention,
because they would never change the number of characters
in a string, and when people write characters in some
non-preferred form you could just silently read them as
normalized, sparing people the brain sweat of remembering
all the proper normalization stuff themselves, so that

(equal? #\A:macron:cedilla #\A:cedilla:macron) => #t

On the downside, the character set is then infinite,
char->integer may return bignums, characters may have to
be boxed (and if we want to preserve eq?-ness, boxed
characters have to be interned as well) and the number of
*codepoints* in a string or the specific representation
of a character as a codepoint sequence becomes undefined.
So there are tradeoffs...

A complementary change that IMO ought to be made at the
same time is the conversion of string representation from
character arrays (which presume fixed-size entities as
characters and provide constant-time access) to ropes
(which make no presumption of fixed size and provide
logarithmic-time access) which would allow operations
that change the number of codepoints (including casing,
substring replacement, and normalization operations)
to be carried out on substrings without unbounded
amounts of work if the string happens to be long.

There isn't really a body of experience on this point,
because no dialects with grapheme characters have really
been tried, with or without rope-based strings.  People
note, properly, that lots of algorithms will need attention
because lots of assumptions about character and string
representation (meaning codepoint=character and string=array
and access time to characters in strings is presumed
constant) are built into the traditional implementations
or analyses of those algorithms.  So it's a fairly drastic
step.

My conclusion is that someone needs to build an experimental
lisp or two in order to see how well these alternate
formulations work before we rush to standardization in a
major dialect, but given the amount of time involved and
the pressure for the standard, I don't think that is
going to happen in time.

				Bear
From: Rob Warnock
Subject: Re: Beyond CL?
Date: 
Message-ID: <h_adnQ-hWYT2HpjeRVn-gw@speakeasy.net>
Ray Dillinger  <····@sonic.net> wrote:
+---------------
| I guess I'm notorious as the agitator in the scheme camp who has
| been arguing in favor of a character type that corresponds to
| what in unicode parlance is a "grapheme," whereas the emerging
| consensus seems to be for a character type that corresponds to
| what in unicode parlance is a "codepoint."
...
| ...when people write characters in some
| non-preferred form you could just silently read them as
| normalized, sparing people the brain sweat of remembering
| all the proper normalization stuff themselves, so that
|   (equal? #\A:macron:cedilla #\A:cedilla:macron) => #t
| 
| On the downside, the character set is then infinite,
| char->integer may return bignums, characters may have to
| be boxed (and if we want to preserve eq?-ness, boxed
| characters have to be interned as well) and the number of
| *codepoints* in a string or the specific representation
| of a character as a codepoint sequence becomes undefined.
| So there are tradeoffs...
+---------------

On the plus side, ANSI CL has never required "EQ-ness" of characters,
so there's none to "preserve". [See CLHS 13.1.5 "Identity of Characters".]
The CL programmer is only promised EQL identity for characters and numbers.

Another plus is that CHAR-CODE-LIMIT, the "upper exclusive bound on
the value returned by the function char-code", is a non-negative
INTEGER, not a FIXNUM, so while the occasional BIGNUM is ugly, it's
not forbidden.

On the down side, you will have to provide mapping functions
from characters to CL character codes and back, so CHAR-CODE and
CODE-CHAR can work.
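One crude way to do that (purely illustrative; nothing in ANSI CL
blesses this layout) is to pack the canonical codepoint sequence into
a single integer:

  (defconstant +codepoint-radix+ (ash 1 21))   ; 2^21 > #x10FFFF

  (defun grapheme-code (codepoints)
    ;; CODEPOINTS: non-empty list of integers, in order; no NULs assumed
    (reduce (lambda (acc cp) (+ (* acc +codepoint-radix+) cp))
            codepoints :initial-value 0))

  (defun code-grapheme (code)
    ;; inverse of GRAPHEME-CODE, for codes it built
    (loop with cps = '()
          while (plusp code)
          do (multiple-value-bind (q r) (floor code +codepoint-radix+)
               (push r cps)
               (setf code q))
          finally (return cps)))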

And I'm not sure, but given the history [and confusion] about the topic,
you'd probably want there to *not* be any "implementation-defined
attributes" for characters. That is, make CHAR-CODE and CHAR-INT
the same, and make normalized graphemes EQL to "the same" unnormalized
graphemes.

Another choice might be to allow/provide some "implementation-defined
attributes", and then only make normalized graphemes EQL, but perhaps
allow unnormalized graphemes to be EQUAL. [In that case CHAR-CODE and
CHAR-INT *would* be different for at least some graphemes.]


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <q2_Ne.10067$p%3.39160@typhoon.sonic.net>
Rob Warnock wrote:

> On the plus side, ANSI CL has never required "EQ-ness" of characters,
> so there's none to "preserve". [See CLHS 13.1.5 "Identity of Characters".]

Hmm.  I hadn't said a thing about grapheme-characters as a model
for CL, but proceeding on the assumption that I had....

Right.  I think there was no reason to promise EQ in the first
place, except that it happened to be "presumed easy" because
characters were "presumed unboxed".

> The CL programmer is only promised EQL identity for characters and numbers.

That's reasonable.

> Another plus is that CHAR-CODE-LIMIT, the "upper exclusive bound on
> the value returned by the function char-code", is a non-negative
> INTEGER, not a FIXNUM, so while the occasional BIGNUM is ugly, it's
> not forbidden.

But you can't pick a finite value for char-code-limit with grapheme
characters, and there will be a bunch of integers between 0 and
char-code-limit that don't correspond to any legitimate character.

> On the down side, you will have to provide mapping functions
> from characters to CL character codes and back, so CHAR-CODE and
> CODE-CHAR can work.

Right.  That's pretty easy.

> And I'm not sure, but given the history [and confusion] about the topic,
> you'd probably want there to *not* be any "implementation-defined
> attributes" for characters. That is, make CHAR-CODE and CHAR-INT
> the same, and make normalized graphemes EQL to "the same" unnormalized
> graphemes.

I tend to agree.  Character attributes never really found
a useful role not better served by some higher-level
abstraction explicitly defined by the program using it.

				Bear
From: Rob Warnock
Subject: Re: Beyond CL?
Date: 
Message-ID: <14qdnR9y_p9u3pTeRVn-rg@speakeasy.net>
Ray Dillinger  <····@sonic.net> wrote:
+---------------
| Rob Warnock wrote:
| > Another plus is that CHAR-CODE-LIMIT, the "upper exclusive bound on
| > the value returned by the function char-code", is a non-negative
| > INTEGER, not a FIXNUM, so while the occasional BIGNUM is ugly, it's
| > not forbidden.
| 
| But you can't pick a finite value for char-code-limit with grapheme
| characters...
+---------------

Just pick something "really big", big enough to encode enough
overstrikes to turn the whole grapheme solid black, e.g., maybe
whatever CHAR-CODE-LIMIT value could accommodate a character
composed of a 256 codepoint sequence, say.

+---------------
| ...and there will be a bunch of integers between 0 and
| char-code-limit that don't correspond to any legitimate character.
+---------------

No problemo, says the CLHS:

    Function CODE-CHAR
    code-char code => char-p
    ...
    Returns a character with the code attribute given by code.
    If no such character exists and one cannot be created,
    NIL is returned.

And likewise:

    Constant Variable CHAR-CODE-LIMIT
    ...
    The value of char-code-limit might be larger than the actual
    number of characters supported by the implementation.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Beyond CL?
Date: 
Message-ID: <87psrxrhvq.fsf@qrnik.zagroda>
Ray Dillinger <····@sonic.net> writes:

> I guess I'm notorious as the agitator in the scheme camp who has
> been arguing in favor of a character type that corresponds to
> what in unicode parlance is a "grapheme," whereas the emerging
> consensus seems to be for a character type that corresponds to
> what in unicode parlance is a "codepoint."

If characters were graphemes, you would have to have a separate set of
functions which manipulate strings of code points inside a character,
mirroring those for real strings.

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Karl A. Krueger
Subject: Re: Beyond CL?
Date: 
Message-ID: <devp4f$hps$1@baldur.whoi.edu>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> wrote:
> Ray Dillinger <····@sonic.net> writes:
>> I guess I'm notorious as the agitator in the scheme camp who has
>> been arguing in favor of a character type that corresponds to
>> what in unicode parlance is a "grapheme," whereas the emerging
>> consensus seems to be for a character type that corresponds to
>> what in unicode parlance is a "codepoint."
> 
> If characters were graphemes, you would have to have a separate set of
> functions which manipulate strings of code points inside a character,
> mirroring those for real strings.

Or provide a keyword option on all the string functions to switch
between the two behaviors, much as we have various options on the
sequence functions.

-- 
Karl A. Krueger <········@example.edu> { s/example/whoi/ }
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <TtLQe.11019$p%3.43293@typhoon.sonic.net>
Marcin 'Qrczak' Kowalczyk wrote:
> Ray Dillinger <····@sonic.net> writes:
> 
> 
>>I guess I'm notorious as the agitator in the scheme camp who has
>>been arguing in favor of a character type that corresponds to
>>what in unicode parlance is a "grapheme," whereas the emerging
>>consensus seems to be for a character type that corresponds to
>>what in unicode parlance is a "codepoint."
> 
> 
> If characters were graphemes, you would have to have a separate set of
> functions which manipulate strings of code points inside a character,
> mirroring those for real strings.
> 

No I don't.  All characters are canonicalized on read,
full stop.  No character operation produces non-canonical
characters, full stop.  Character comparisons work,
character ordering is consistent, string comparison
works, etc etc etc.  You never have to look inside or
manipulate inside, because character identity is
*CHARACTER* identity, not *CODEPOINT SEQUENCE* identity.

Information like codepoint sequence is below the level
of abstraction of characters, just like word sequence is
below the level of abstraction of bignums.  It could be
represented differently on different systems anyway, right?

The only time you really have to be aware that there is a
codepoint sequence or multiple codepoints in a character
is that you have to pick a canonicalization form when you're
defining character ports or converting characters/strings
to/from blobs for binary I/O.

				Bear

				


				
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Beyond CL?
Date: 
Message-ID: <87acizntxq.fsf@qrnik.zagroda>
Ray Dillinger <····@sonic.net> writes:

>> If characters were graphemes, you would have to have a separate
>> set of functions which manipulate strings of code points inside
>> a character, mirroring those for real strings.
>
> No I don't. All characters are canonicalized on read, full stop.
> No character operation produces non-canonical characters, full stop.
> Character comparisons work, character ordering is consistent, string
> comparison works, etc etc etc. You never have to look inside or
> manipulate inside, because character identity is *CHARACTER*
> identity, not *CODEPOINT SEQUENCE* identity.

How would you create a character with given codepoint sequence?
How would you examine codepoints of the given character?

> Information like codepoint sequence is below the level of
> abstraction of characters, just like word sequence is below
> the level of abstraction of bignums.

Depends on what you are doing. How would you parse a programming language
whose lexical syntax is defined in terms of code points?

You *can* extract words of a bignum if you want.

> It could be represented differently on different systems anyway,
> right?

Unicode provides a canonical representation as a sequence of code
points. All representations in use are easily translated to and from
this representation.

> The only time you really have to be aware that there is a codepoint
> sequence or multiple codepoints in a character is that you have to
> pick a canonicalization form when you're defining character ports or
> converting characters/strings to/from blobs for binary I/O.

Not only that. Almost all Unicode algorithms are specified in terms of
code points, for example case folding and collation.

And I would not like a program to change canonicalization form without
asking. This screws up e.g. diff (we get a difference in almost every
line).

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Greg Menke
Subject: Re: Beyond CL?
Date: 
Message-ID: <m3ll2jf9co.fsf@athena.pienet>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> Ray Dillinger <····@sonic.net> writes:
> 
> >> If characters were graphemes, you would have to have a separate
> >> set of functions which manipulate strings of code points inside
> >> a character, mirroring those for real strings.
> >
> > No I don't. All characters are canonicalized on read, full stop.
> > No character operation produces non-canonical characters, full stop.
> > Character comparisons work, character ordering is consistent, string
> > comparison works, etc etc etc. You never have to look inside or
> > manipulate inside, because character identity is *CHARACTER*
> > identity, not *CODEPOINT SEQUENCE* identity.
> 
> How would you create a character with given codepoint sequence?
> How would you examine codepoints of the given character?

Well, despite being quite ignorant about Unicode et al., perhaps wrapping
the read in something similar to with-standard-io-syntax to suitably tweak
coding assumptions is the way to handle it, much like how read handles
number base.
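Something like this, maybe (a pure interface sketch; the special
variable isn't consulted by anything real here, and the names are
made up):

  (defvar *read-normalization-form* :nfc
    "Which Unicode normalization READ-like operations should apply.")

  (defmacro with-normalization ((form) &body body)
    `(let ((*read-normalization-form* ,form))
       ,@body))

  ;; by analogy with *READ-BASE*:
  ;; (with-normalization (:nfd)
  ;;   (read stream))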

Gregm
From: Ray Dillinger
Subject: Re: Beyond CL?
Date: 
Message-ID: <ahbRe.11190$p%3.45001@typhoon.sonic.net>
Marcin 'Qrczak' Kowalczyk wrote:
> Ray Dillinger <····@sonic.net> writes:

>>> If characters were graphemes, you would have to have a separate
>>> set of functions which manipulate strings of code points inside
>>> a character, mirroring those for real strings.

>> No I don't. All characters are canonicalized on read, full stop.
>> No character operation produces non-canonical characters, full stop.
>> ...You never have to look inside or
>> manipulate inside, because character identity is *CHARACTER*
>> identity, not *CODEPOINT SEQUENCE* identity.

> How would you create a character with given codepoint sequence?

Beyond creating canonical characters, that's a bad idea.  I treat

#\A:cedilla:macron

as syntax for a character constant; however, this doesn't
(and shouldn't) specify codepoint sequence; since cedilla
and macron are of different combining classes,
canonicalization on read will put them into whatever the
correct canonical order is. Thus,

(equal? #\A:cedilla:macron #\A:macron:cedilla) ==> #t

> How would you examine codepoints of the given character?

Why would you examine codepoints for a given character?
I mean, seriously: if it's a _CHARACTER_, the codepoints
don't matter, any more than whether an integer constant
was expressed in base 8 or base 10.

If you absolutely have to, you can convert the character
into a blob and read the words of the blob; but all that
will tell you is the codepoints for the character in the
canonicalization form you used when you converted the
character into a blob.

Conversely, if you are reading from some port and want to
know binary details about the transmission format, you can
read binary (a blob) instead of characters, and examine
the words of the blob before converting it into characters
or a string. But all that will tell you is the form in
which it was transmitted to that port; by converting it
into a character, you've already converted it to canonical
form.

> Depends on what you are doing. How would you parse a programming language
> whose lexical syntax is defined in terms of code points?

In the first place, most of the "codepoint" language specs
out there exclusively use singleton codepoints that are
also graphemes.  If that is the case, I can just read
characters and it will work.  If that is not the case,
then I probably will want to make an FFI package that
supports syntax and aliases for reading codepoints as
binary blobs.

> You *can* extract words of a bignum if you want.

Yah.  But when you do so, it's not because you're working
with it as a number.  When you go after the binary words
of a bignum, you want it for purposes other than math. I
maintain that when you go after the binary words of a
character, you want it for purposes other than use as a
character.

>>It could be represented differently on different systems anyway,
>>right?

>>The only time you really have to be aware that there is a codepoint
>>sequence or multiple codepoints in a character is that you have to
>>pick a canonicalization form when you're defining character ports or
>>converting characters/strings to/from blobs for binary I/O.

> Not only that. Almost all Unicode algorithms are specified in terms of
> code points, for example case folding and collation.

Right. Those get packaged as functions to call, so that
the ordinary programmer (as opposed to the lisp system
implementor) never has to worry about the internals.

> And I would not like a program to change canonicalization form without
> asking. This screws up e.g. diff (we get a difference in almost every
> line).

So use the same normalization form to write it that you
used to read it.  If you get a diff, it means the program that
produced it contains an error in its unicode implementation.
Why do you want to worry about it in between?

				Bear
From: joesb
Subject: Re: Beyond CL?
Date: 
Message-ID: <1125596926.364189.60060@o13g2000cwo.googlegroups.com>
This idea seems perfectly fine to me.

Code point and its ordering are to a character what bit-wise
representation and endianness are to a number, IMHO. Just leave it to be
solved by the system implementor, once and for all.
From: Pascal Bourguignon
Subject: Re: Beyond CL?
Date: 
Message-ID: <87ackkdq4u.fsf@thalassa.informatimago.com>
"William D Clinger" <··········@verizon.net> writes:

> Pascal Costanza wrote:
>> However, if that's the case, why the plan to switch to case sensitivity
>> for R6RS?
>
> A lot of people have been pushing for case sensitivity
> for a long time.  Although the reasons they give vary,
> the one I hear most often is that most other languages
> (programming languages, that is, not natural languages)
> are case sensitive, so that Scheme's case insensitivity
> creates an unnecessary barrier to interoperation with
> other languages.  That's a legitimate reason.
>
> Interoperation with Common Lisp is no longer viewed as
> a major issue.  Common Lisp's idiosyncratic package
> system has been more of a barrier to compatibility at
> the identifier level than case sensitivity is expected
> to be.  Furthermore we want to add Unicode support to
> Scheme, which is likely to create incompatibility with
> Common Lisp anyway.

Please be aware that there are already Common Lisp
implementations that handle Unicode properly, so please
try not to do it in Scheme in a totally incompatible way. :-)

[14]> (defun кремлин (арг) (case арг ((location) 'красная\ плаза)))
КРЕМЛИН
[15]> (КРЕМЛИН 'location)
|КРАСНАЯ ПЛАЗА|
[16]> 


> The Unicode notion of strings is hairy, and does not
> correspond to the Lisp/Scheme notion of a string as a
> sequence of characters.  The Unicode notion of a code
> point is pretty simple, however, and corresponds to
> the Lisp/Scheme notion of a character.  Taking strings
> to be sequences of code points seems to be the most
> reasonable position for Common Lisp and Scheme to take,
> leaving the Unicode notion of string to libraries and
> i/o that can be built on top of the sequence-of-code-points
> notion.  Then upper-casing or lower-casing a Lisp/Scheme
> string can be done by upper-casing or lower-casing its
> constituent code points independently, which is a lot
> simpler than upper-casing or lower-casing the Unicode
> notion of a string.

Indeed.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
Kitty like plastic.
Confuses for litter box.
Don't leave tarp around.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7uku$dvd$1@nwrdmz03.dmz.ncs.ea.ibs-infra.bt.com>
<··········@gmail.com> wrote in message:

> If speculating about the possibility of Lisps superior to CL is enough
> to get you into the Complete Fucking Morons of Programming Club, is it
> Henry Baker or John McCarthy that teaches you the secret handshake?
>

I've had a change of heart.  I want in.

Jamie
From: ·············@hotmail.com
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121419100.256194.300710@g44g2000cwa.googlegroups.com>
Jamie, I would give you free advice:
stop losing time with languages-to-be
and continue mastering CL.
Once you become a Jedi you may
start writing the language you feel it
should look like.
And BTW, s-expressions stay, or don't
count on my help.
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db81br$o15$1@nwrdmz01.dmz.ncs.ea.ibs-infra.bt.com>
<·············@hotmail.com> wrote:

> Jamie, I would give you free advice:
> stop losing time with languages-to-be
> and continue mastering CL.

I am doing this.

> Once you become a Jedi you may
> start writing the language you feel it
> should look like.

When nine hundred years old I reach, care I will not.

> And BTW, s-expressions stay, or don't
> count on my help.

I don't want help.  I am possibly the least qualified person here to try to 
_do_ something; I was just asking the question.
From: lispnubi
Subject: Re: Beyond CL?
Date: 
Message-ID: <1121438402.567540.9660@g49g2000cwa.googlegroups.com>
Kenny Tilton wrote:

...These are people who look at the best
> programming language available and worry about why it is not better. A
> necessary requirement is not being smart enough to understand how much
> better the better language is, such that in their wildest dreams they
> could not be held back by said language.
>

oh yes, programming in cobol is great!

>
> --
> Kenny
>
> Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
>
> "I've wrestled with reality for 35 years, Doctor, and I'm happy to state
> I finally won out over it."
>      Elwood P. Dowd, "Harvey", 1950
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db7ujn$qrh$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Kenny Tilton" <·······@nyc.rr.com> wrote:

Me>> Couldn't we have a CL++
>
> Congratulations, you have earned a membership in the Complete Fucking 
> Morons of Programming Club. These are people who look at the best

a) Tell Paul Graham he's a Fucking Moron.

b) So we should have stopped at bronze tools, because they were great, and 
anybody who said "these are the best tools, can they be better?" belonged to 
the Complete Fucking Morons of Hand Tools Club?

> programming language available and worry about why it is not better. A 
> necessary requirement is not being smart enough to understand how much 
> better the better language is, such that in their wildest dreams they 
> could not be held back by said language.

No.  My employer is now trying to find a good way to use cells-gtk.  I was 
trying to convey my frustration at convincing my colleagues to use CL.

>
> The good news is that the denizens of comp.lang.lisp will leap frothing at 
> this pavlovian bell and discuss the profundities of your post for weeks to 
> come.

Or perhaps they will tell me that I am a moron.

Jamie

> -- 
> Kenny
>
> Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
>
> "I've wrestled with reality for 35 years, Doctor, and I'm happy to state I 
> finally won out over it."
>     Elwood P. Dowd, "Harvey", 1950
> 
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <XdQBe.1094$Na6.459996@twister.nyc.rr.com>
Jamie Border wrote:

> "Kenny Tilton" <·······@nyc.rr.com> wrote:
> 
> Me>> Couldn't we have a CL++
> 
>>Congratulations, you have earned a membership in the Complete Fucking 
>>Morons of Programming Club. These are people who look at the best
> 
> 
> a) Tell Paul Graham he's a Fucking Moron.

Try to keep up, will you? Graham was drummed out of the CFMPC when he 
stopped active development of Arc.

Besides, Graham, unlike certain geniuses who pop up around here from 
time to time, fully mastered Lisp and wrote a lot of Actual Code before 
attempting improvements. I should have mentioned that both of those things 
disqualify someone from the CFMPC. Not to worry: you are still nicely eligible.

> 
> b) So we should have stopped at bronze tools, because they were great, and 
> anybody who said "these are the best tools, can they be better?" belonged to 
> the Complete Fucking Morons of Hand Tools Club?

I'll put you down for the "Argument By Analogy Is Like Driving Backwards 
When You Get Lost" SIG.

> 
> 
>>programming language available and worry about why it is not better. A 
>>necessary requirement is not being smart enough to understand how much 
>>better the better language is, such that in their wildest dreams they 
>>could not be held back by said language.
> 
> 
> No.  My employer is now trying to find a good way to use cells-gtk.  I was 
> trying to convey my frustration at convincing my colleagues to use CL.

You are using Cells? You should have mentioned up front that you are a 
far-sighted genius. All is forgiven.

> 
> 
>>The good news is that the denizens of comp.lang.lisp will leap frothing at 
>>this pavlovian bell and discuss the profundities of your post for weeks to 
>>come.
> 
> 
> Or perhaps they will tell me that I am a moron.

Actually, I see a few have told you to get off c.l.l and go do some 
programming. Word.


-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Jamie Border
Subject: Re: Beyond CL?
Date: 
Message-ID: <db9804$n7i$1@nwrdmz02.dmz.ncs.ea.ibs-infra.bt.com>
"Kenny Tilton" <·······@nyc.rr.com> wrote in message 
··························@twister.nyc.rr.com...
>
>
> Jamie Border wrote:
>
>> "Kenny Tilton" <·······@nyc.rr.com> wrote:
>>
>> Me>> Couldn't we have a CL++
>>
>>>Congratulations, you have earned a membership in the Complete Fucking 
>>>Morons of Programming Club. These are people who look at the best
>>
>>
>> a) Tell Paul Graham he's a Fucking Moron.
>
> Try to keep up, will you? Graham was drummed out of the CFMPC when he 
> stopped active development of Arc.
>
> Besides, Graham, unlike certain geniuses who pop up around here from time 
> to time, fully mastered Lisp and wrote a lot of Actual Code before 
> attempting improvements. I should have mentioned that both those things 
> disqualify someone from the CFMPC. Not to worry: you are still nicely eligible.

I wasn't attempting improvements.  I was attempting dialogue.

>
>>
>> b) So we should have stopped at bronze tools, because they were great, 
>> and anybody who said "these are the best tools, can they be better?" 
>> belonged to the Complete Fucking Morons of Hand Tools Club?
>
> I'll put you down for the "Argument By Analogy Is Like Driving Backwards 
> When You Get Lost" SIG.

I was trying to understand why you responded so harshly, actually.

>
>>
>>
>>>programming language available and worry about why it is not better. A 
>>>necessary requirement is not being smart enough to understand how much 
>>>better the better language is, such that in their wildest dreams they 
>>>could not be held back by said language.
>>
>>
>> No.  My employer is now trying to find a good way to use cells-gtk.  I 
>> was trying to convey my frustration at convincing my colleagues to use 
>> CL.
>
> You are using Cells? You should have mentioned up front that you are a 
> far-sighted genius. All is forgiven.

No.  I'm using it because it is good.  If I were a genius I would be 
re-writing your code.  If I were far-sighted I would have already written it.

>>
>>>The good news is that the denizens of comp.lang.lisp will leap frothing 
>>>at this pavlovian bell and discuss the profundities of your post for 
>>>weeks to come.
>>

Yeah, they did that, and I understand a lot more about the culture now.

>>
>> Or perhaps they will tell me that I am a moron.
>
> Actually, I see a few have told you to get off c.l.l and go do some 
> programming. Word.
>

Excel.

Jamie

>
> -- 
> Kenny
>
> Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film
>
> "I've wrestled with reality for 35 years, Doctor, and I'm happy to state I 
> finally won out over it."
>     Elwood P. Dowd, "Harvey", 1950
> 
From: Pascal Costanza
Subject: Re: Beyond CL?
Date: 
Message-ID: <3jpig3Fqe2phU1@individual.net>
Kenny Tilton wrote:
> 
> Jamie Border wrote:
> 
>> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and 
>> I've been thinking[2]..
>>
>> Couldn't we have a CL++
> 
> Congratulations, you have earned a membership in the Complete Fucking 
> Morons of Programming Club. These are people who look at the best 
> programming language available and worry about why it is not better. A 
> necessary requirement is not being smart enough to understand how much 
> better the better language is, such that in their wildest dreams they 
> could not be held back by said language.

Don't be so negative. It seems to me that it happens quite often that 
newbies understand that Lisp is such a malleable language that they 
start to imagine in what ways it could be "much better". It's an 
expression of the fact that they have probably already dreamt of what 
their preferred "perfect" language should look like, and that they have 
just gotten close to being able to actually realize that dream. It's an 
effect of the distinguishing characteristic of Lisp, one that they don't 
completely grasp immediately.

The next step is to understand that they can just do it: take Lisp and 
simply program in it. This seems to be a separate step in the learning 
process. At least it was like that in my case, and I am not surprised 
that it's like that for other people.
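
To make that concrete with a small, purely hypothetical sketch: a 
newcomer who wishes the language had, say, an anaphoric IF does not 
have to wait for a CL++ -- a few lines of standard Common Lisp will do. 
The names KEY, TABLE and PROCESS below are placeholders, not anybody's 
actual code.

  ;; Deliberately captures the symbol IT -- that is the point of an
  ;; anaphoric macro.
  (defmacro aif (test then &optional else)
    `(let ((it ,test))
       (if it ,then ,else)))

  ;; Usage (KEY, TABLE and PROCESS are made-up names):
  ;; (aif (gethash key table)
  ;;      (process it)
  ;;      (warn "no entry for ~a" key))

That is the whole "language extension": no new compiler, no committee, 
just a macro.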

> The good news is that the denizens of comp.lang.lisp will leap frothing 
> at this pavlovian bell and discuss the profundities of your post for 
> weeks to come.

Cool, eh? ;)


Pascal

-- 
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
From: Kenny Tilton
Subject: Re: Beyond CL?
Date: 
Message-ID: <UaTBe.1105$Na6.473821@twister.nyc.rr.com>
Pascal Costanza wrote:
> Kenny Tilton wrote:
> 
>>
>> Jamie Border wrote:
>>
>>> I've been reading about Arc[1], Paul Graham's Lisp-plus-to-be, and 
>>> I've been thinking[2]..
>>>
>>> Couldn't we have a CL++
>>
>>
>> Congratulations, you have earned a membership in the Complete Fucking 
>> Morons of Programming Club. These are people who look at the best 
>> programming language available and worry about why it is not better. A 
>> necessary requirement is not being smart enough to understand how much 
>> better the better language is, such that in their wildest dreams they 
>> could not be held back by said language.
> 
> 
> Don't be so negative. It seems to me that it happens quite often that 
> newbies understand that Lisp is such a malleable language that they 
> start to imagine in what ways it could be "much better".

No, that is the dilettante subset that does not have any actual ideas to 
code up. Anyone with an app to write discovers Lisp and Just Codes. No 
time for comp.lang.lisp, and anyway too astonished by the power of Lisp 
even to think of anything to change. They feel downright humbled by the 
language, and the last thing they would do is enter this august forum 
and start lecturing us on our problems.

-- 
Kenny

Why Lisp? http://lisp.tech.coop/RtL%20Highlight%20Film

"I've wrestled with reality for 35 years, Doctor, and I'm happy to state 
I finally won out over it."
     Elwood P. Dowd, "Harvey", 1950
From: Edi Weitz
Subject: Re: Beyond CL?
Date: 
Message-ID: <u4qavc3cs.fsf@agharta.de>
On Fri, 15 Jul 2005 18:23:48 GMT, Kenny Tilton <·······@nyc.rr.com> wrote:

> No, that is the dilettante subset that does not have any actual
> ideas to code up. Anyone with an app to write discovers Lisp and
> Just Codes. No time for comp.lang.lisp

So, why are you still here?  And what about me, BTW?

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Thomas F. Burdick
Subject: Re: Beyond CL?
Date: 
Message-ID: <xcvwtnrowds.fsf@conquest.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> No, that is the dilettante subset that does not have any actual ideas to 
> code up. Anyone with an app to write discovers Lisp and Just Codes. No 
> time for comp.lang.lisp, and anyway too astonished by the power of Lisp 
> even to think of anything to change. They feel downright humbled by the 
> language, and the last thing they would do is enter this august forum 
> and start lecturing us on our problems.

That's not a bad way of characterizing the post that started this
thread.  But there are more than two reactions one can have to
discovering Lisp.  Holy shit, if there can be something *this* much
better than what I've seen before ... maybe those unthinkable thoughts
I've had before aren't unthinkable after all -- maybe I can figure
out how to leverage Lisp to make them possible -- and maybe there are
unthinkable things in here already that I won't be able to figure out
on my own.  This *ahem* august forum can be pretty good at providing
help in these directions, between all the lectures and the bullshit.

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | Free Mumia Abu-Jamal! |
     ,--'    _,'   | Abolish the racist    |
    /       /      | death penalty!        |
   (   -.  |       `-----------------------'
   |     ) |                               
  (`-.  '--.)                              
   `. )----'