From: Erann Gat
Subject: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031111110001@k-137-79-50-101.jpl.nasa.gov>
I couldn't find a good place in the ARC thread to hang this comment so I
just decided to punt and start a new thread.

One central item in Paul's talk that I didn't see anyone mention
(apologies if someone did and I just missed it) is his ideas for how to
support typing in the core language.  Paul seems to be very enamored of
minimalism and wants to keep the core language very tiny.  In particular,
he wants to keep the number of data types in the core language as small as
possible, with the ideal being that everything, including strings and
numbers, is a list.  Graham has obviously been heavily influenced by
McCarthy's original paper on Lisp, and how you can write EVAL in terms of
CONS, CAR, CDR, COND, QUOTE, ATOM and EQ (the Big Seven).  He wants to
make the core of ARC be a language that is not much bigger than that.

The problem you immediately encounter if you represent everything as a
list is how to write PRINT, because if everything including numbers and
strings is a list there is no difference internally between "abc" and (#\a
#\b #\c).  How do you know which of those two printed representations
should be used in any particular circumstance?

Paul's answer is to add three new constructs: TYPE, REP and TAG.  (I may
get some of the details wrong here.  I didn't take notes during the
presentation, and it's possible I'm getting e.g. TYPE and TAG confused.) 
The operations TAG, TYPE and REP correspond to CONS, CAR and CDR:

(TAG <datum> <type-tag>)  --> <encapsulated datum>
(TYPE <encapsulated datum>) --> <type-tag>
(REP <encapsulated datum>) --> <datum>
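
Just to pin down the semantics, here is a rough Common Lisp sketch of
how I imagine TAG, TYPE and REP behaving (my own guess at one possible
model, not Paul's code; the cons representation and the starred name
are my inventions):

  ;; An encapsulated datum is modeled as a cons of the tag and the
  ;; underlying representation.
  (defun tag (datum type-tag) (cons type-tag datum))
  (defun type* (x) (car x))    ; "TYPE"; starred to avoid CL's TYPE
  (defun rep (x) (cdr x))

  ;; (type* (tag '(#\a #\b #\c) 'string))  =>  STRING
  ;; (rep   (tag '(#\a #\b #\c) 'string))  =>  (#\a #\b #\c)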

I think he constrains type tags to be symbols, but I could be wrong.  One
of his dictums is to not forbid anything unless logic compels you to.  His
canonical example of something that is logically required to be forbidden
is taking the CAR of a symbol.  He is apparently unaware that there have
been Lisp dialects where this was allowed.  (Drew McDermott complained
famously when he learned that taking the CAR and CDR of a fixnum was not
allowed in T.)

I don't think he actually said so, but presumably the idea is that all
operations that work on a <datum> also work on an <encapsulated datum>. 
So you can define PRINT to do the Right Thing with lists by looking at how
they have been encapsulated:

  (print (tag (quote (#\a #\b #\c)) (quote string)))

would arrange to print "abc" by having PRINT examine the result of calling
TYPE on its argument.
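
Continuing the sketch above (again my guess, not anything Paul showed),
a PRINT along those lines might look something like this:

  ;; Hypothetical PRINT dispatching on the tag.  In this toy model any
  ;; cons whose car is the symbol STRING gets printed as a string,
  ;; which is obviously too crude, but it shows the idea.
  (defun print* (x)
    (if (and (consp x) (eq (type* x) 'string))
        (format t "\"~{~a~}\"" (rep x))   ; (#\a #\b #\c) prints as "abc"
        (format t "~a" x)))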

My personal take on all this is that the quest for minimalism is a
quixotic one because it invariably leads you to the lambda calculus. 
McCarthy's original repertoire of seven operations is not minimal because
they can all be constructed out of LAMBDA.  McCarthy himself rejected the
idea that there was anything particularly special about the Big Seven by
endorsing the idea that integers and floating point numbers should be
native types if the underlying machine supports them.
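
(For anyone who hasn't seen it, the standard demonstration of that
point is Church encoding.  Here is the usual CONS/CAR/CDR-out-of-LAMBDA
trick as Common Lisp, with the names changed so it loads alongside the
real thing:)

  ;; CONS, CAR and CDR built from nothing but LAMBDA.
  (defun kons (a d) (lambda (selector) (funcall selector a d)))
  (defun kar (k) (funcall k (lambda (a d) (declare (ignore d)) a)))
  (defun kdr (k) (funcall k (lambda (a d) (declare (ignore a)) d)))

  ;; (kar (kons 1 2)) => 1
  ;; (kdr (kons 1 2)) => 2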

So far I see little in ARC that is particularly interesting
linguistically.  (But Paul is approaching the problem with a premise that
I do not accept, namely that Common Lisp and Scheme both "suck", to use
his terminology.)  Nonetheless, it may still draw people to Lisp just
because it's Paul's thing, and his charisma could easily carry the day. 
That would be a Good Thing.

E.

From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031150160001@k-137-79-50-101.jpl.nasa.gov>
In article
<·······································@k-137-79-50-101.jpl.nasa.gov>,
······················@jpl.nasa.gov (Erann Gat) wrote:

> I couldn't find a good place in the ARC thread to hang this comment so I
> just decided to punt and start a new thread.

Just thought of something else:

One of the premises of ARC's design is that speed doesn't matter because
today's machines are so fast.  I don't accept this premise.  Speed only
doesn't matter if you're not doing production work.  If you're doing
production work speed does matter because as soon as you start to process
in volume your machine costs are directly proportional to your speed.  If
you can process customer requests twice as fast you only need half as many
machines.  Yes, hardware is cheap, but the cost of the hardware is only a
tiny fraction of the TCO of a machine.  You have to find a place to put
it, pay for the air conditioning, pay a sysadmin to maintain it and
replace it every two years or so.  When you start to get into serious
volume a factor of 2 in speed can make a real difference in the bottom
line, which will, I think, be salient in ARC's initial target market of
spam filtering.  So I'm a little concerned about that aspect of ARC's
design too.  In particular, I think that Paul's focus on minimalism and
intellectual purity at the expense of speed is going to prevent ARC from
e.g. being able to do an AREF in O(1) time, which I think could end up
being a serious limitation.
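
(To spell out what I mean by that: if arrays are "really" lists
underneath, the obvious implementation of element access has to walk
the list, which is O(N), whereas a native array reference is a
constant-time indexed load.  A trivial illustration, mine, not Paul's:)

  ;; AREF built on a list representation: walks I cons cells -- O(N).
  (defun list-aref (lst i) (nth i lst))

  ;; Native AREF on a real array: a constant-time indexed load -- O(1).
  (defun array-aref (arr i) (aref arr i))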

Paul's response to this is that efficient compilation is an orthogonal
concern, and I largely agree with this.  However, Paul's method for
writing the language specification is to write what is essentially a
reference implementation.  The problem is that a reference implementation
doesn't include any information about which aspects of the behavior of the
reference implementation are essential and which ones merely
coincidental.  (John McCarthy pointed this out during Paul's
presentation.)  By the time you get done building both numbers and arrays
out of lists it may be very hard to back a compiler optimization out of
the spec that will allow O(1) arefs.  And even if you succeed you've now spent
a lot of effort solving a purely artificial problem; what the net benefit
would be is still unclear to me.  Paul seems to take it on faith that if
you keep things simple then Good Things will follow.  I don't necessarily
disagree, but the problem is that keeping things simple can result in an
"impedance mismatch" with a complicated world.

E.
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcvvfqj7h7k.fsf@hurricane.OCF.Berkeley.EDU>
······················@jpl.nasa.gov (Erann Gat) writes:

> In article
> <·······································@k-137-79-50-101.jpl.nasa.gov>,
> ······················@jpl.nasa.gov (Erann Gat) wrote:
> 
> > I couldn't find a good place in the ARC thread to hang this comment so I
> > just decided to punt and start a new thread.
> 
> Just thought of something else:
> 
> One of the premises of ARC's design is that speed doesn't matter because
> today's machines are so fast.

Did he actually say that, or did he say that speed was a lower
priority?  I ask because _On Lisp_ was the first book I read when I
decided to take Lisp seriously, and in there he brags that Lisp lets
you write code fast, and write fast code.

> Paul's response to this is that efficient compilation is an orthogonal
> concern, and I largely agree with this.  However, Paul's method for
> writing the language specification is to write what is essentially a
> reference implementation.  The problem is that a reference implementation
> doesn't include any information about which aspects of the behavior of the
> reference implementation are essential and which ones merely
> coincidental.  (John McCarthy pointed this out during Paul's
> presentation.)  By the time you get done building both numbers and arrays
> out of lists it may be very hard to back a compiler optimization out of
> the spec that will allow O(1) arefs.

I don't think it will be easy to write an ordinary Lisp compiler for
Arc, but if he keeps potential compilability in mind, it might be a
good candidate for Smalltalk-like VM technology.

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031422500001@k-137-79-50-101.jpl.nasa.gov>
In article <···············@hurricane.OCF.Berkeley.EDU>,
···@hurricane.OCF.Berkeley.EDU (Thomas F. Burdick) wrote:

> ······················@jpl.nasa.gov (Erann Gat) writes:
> 
> > In article
> > <·······································@k-137-79-50-101.jpl.nasa.gov>,
> > ······················@jpl.nasa.gov (Erann Gat) wrote:
> > 
> > > I couldn't find a good place in the ARC thread to hang this comment so I
> > > just decided to punt and start a new thread.
> > 
> > Just thought of something else:
> > 
> > One of the premises of ARC's design is that speed doesn't matter because
> > today's machines are so fast.
> 
> Did he actually say that, or did he say that speed was a lower
> priority?  I ask because _On Lisp_ was the first book I read when I
> decided to take Lisp seriously, and in there he brags that Lisp lets
> you write code fast, and write fast code.

Well, you can go straight to the horse's mouth, so to speak:

http://www.paulgraham.com/hundred.html


He does concede that "some applications will still demand speed."  But my
take on the message as a whole is that his attention is focused on
applications where speed doesn't matter.

IMHO the fallacy in his reasoning is that while CPU clock rates might keep
going up, that does not necessarily translate into a linear speedup in run
times.  Nowadays (and for quite a long time now actually) you have to
write your code just so in order to get all the speed out of your machine
that it's theoretically capable of.  Memory access and disk access speeds
tend to be the limiting factors in many applications.  There are
widespread reports, for example, of MCL applications running no faster on
a G5 than they did on a G4.

Again, I'm thinking about all this in terms of ARC's intended launch
application, spam filtering, which can involve an awful lot of crunching.

E.
From: Bruce Hoult
Subject: Re: My take on ARC
Date: 
Message-ID: <bruce-01B00E.15394321102003@copper.ipg.tsnz.net>
In article 
<·······································@k-137-79-50-101.jpl.nasa.gov>,
 ······················@jpl.nasa.gov (Erann Gat) wrote:

> Memory access and disk access speeds
> tend to be the limiting factors in many applications.  There are
> widespread reports, for example, of MCL applications running no faster on
> a G5 than they did on a G4.

That's strange.  I wonder what MCL is doing?

I haven't done extensive tests yet, but Gwydion Dylan seems to run about 
the same on a G5 on OSX as it does on a same-MHz Athlon (the actual MHz, 
not the marketing number) on Linux.

As a comparison, d2c runs a little slower on my 1 GHz G4 iMac than it 
does on my 700 Mhz Athlon. (hmm .. to be fair, it seems that d2c 
*itself* runs faster on the iMac, but the gcc it pipes into is slower...)

-- Bruce
From: Christophe Rhodes
Subject: Re: My take on ARC
Date: 
Message-ID: <squ163j9kx.fsf@lambda.jcn.srcf.net>
Bruce Hoult <·····@hoult.org> writes:

>  ······················@jpl.nasa.gov (Erann Gat) wrote:
>
>> Memory access and disk access speeds
>> tend to be the limiting factors in many applications.  There are
>> widespread reports, for example, of MCL applications running no faster on
>> a G5 than they did on a G4.
>
> That's strange.  I wonder what MCL is doing?

The word on the street seems to be that some of the instructions
commonly used in Lisp code but not so much in C or FORTRAN
(e.g. mcrxr) appear to be slow on the G5 -- slower than on a G4 by a
couple of orders of magnitude.  This is not the kind of thing that is
predicted in advance, and is probably better characterized by the
question "I wonder what Apple is doing?"

Christophe
-- 
http://www-jcsu.jesus.cam.ac.uk/~csr21/       +44 1223 510 299/+44 7729 383 757
(set-pprint-dispatch 'number (lambda (s o) (declare (special b)) (format s b)))
(defvar b "~&Just another Lisp hacker~%")    (pprint #36rJesusCollegeCambridge)
From: Adam Warner
Subject: Re: My take on ARC
Date: 
Message-ID: <pan.2003.10.21.09.31.41.410267@consulting.net.nz>
Hi Christophe Rhodes,

> Bruce Hoult <·····@hoult.org> writes:
> 
>>  ······················@jpl.nasa.gov (Erann Gat) wrote:
>>
>>> Memory access and disk access speeds
>>> tend to be the limiting factors in many applications.  There are
>>> widespread reports, for example, of MCL applications running no faster on
>>> a G5 than they did on a G4.
>>
>> That's strange.  I wonder what MCL is doing?
> 
> The word on the street seems to be that some of the instructions
> commonly used in Lisp code but not so much in C or FORTRAN
> (e.g. mcrxr) appear to be slow on the G5 -- slower than on a G4 by a
> couple of orders of magnitude.  This is not the kind of thing that is
> predicted in advance, and is probably better characterized by the
> question "I wonder what Apple is doing?"

<http://developer.apple.com/technotes/tn/tn2087.html>

In focusing upon what may be slower compared to the G4:

* The increase in the size of the execution pipeline is shocking: up to 23
stages versus 7 stages for the G4. Incorrect branch predictions will be
much more costly, just like on the Intel P4 (20 stages). Dynamic languages are
more likely to cause incorrect branch predictions.

* Data Stream Touch instructions cannot be executed speculatively.

* Loops written using existing DCBZ instructions that stride by 32 bytes
can waste up to 75% bandwidth (plus "The intent of most code that uses
DCBZ is to avoid a store miss to memory. In most cases, the G5
implementation will actually cause a store miss.")

* Tight loops can start stalling on the G5 because of higher memory
latency relative to CPU frequency.

* "Due to the increased number of execution pipeline stages and the
increased latency to memory, the time to access and acquire a lock can be
up to 2.5 times slower than on the G4. While there is little that can be
done to speed up the execution of the locks themselves, reducing the
number of locks used in your code can drastically improve its overall
performance on the G5."

* Float-to-integer and integer-to-float conversion is much more costly.

* Several instructions have been microcoded and they must be avoided to
increase performance (code may substantially increase in size).

Regards,
Adam
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <BbXkb.3303344$Bf5.452911@news.easynews.com>
Erann Gat wrote:

> In article
> <·······································@k-137-79-50-101.jpl.nasa.gov>,
> ······················@jpl.nasa.gov (Erann Gat) wrote:
> 
> 
>>I couldn't find a good place in the ARC thread to hang this comment so I
>>just decided to punt and start a new thread.
> 
> 
> Just thought of something else:
> 
> One of the premises of ARC's design is that speed doesn't matter because
> today's machines are so fast.  I don't accept this premise.  Speed only
> doesn't matter if you're not doing production work.  If you're doing
> production work speed does matter because as soon as you start to process
> in volume your machine costs are directly proportional to your speed.  If
> you can process customer requests twice as fast you only need half as many
> machines.  

Sorry, I very much disagree with this perspective.  I do a *great* 
amount of production work, and very few problems require more machine 
speed.  The largest costs of owning a machine and solving most business
problems do not relate to machine speed; they are directly tied to
programmer or operator productivity.

Processing a customer request is almost always directly proportional to 
the amount of time it takes for the operator or programmer to do their 
task.  Rarely is the computation portion the significant determining
factor.  Perhaps this is a bigger deal at NASA, but it is just not the
case in the business world.  If it were the case, why are people not 10
to 20 times more efficient now than they were 5 to 7 years ago when we
were running on 300 MHz machines?  Because computational efficiency
doesn't translate very well to overall efficiency.

> Yes, hardware is cheap, but the cost of the hardware is only a
> tiny fraction of the TCO of a machine.  You have to find a place to put
> it, pay for the air conditioning, pay a sysadmin to maintain it and
> replace it every two years or so.  

If this is true, why don't you write everything in Assembly?  Because it 
will take 50 times as long to get it done.  At the rate things are going,
in 5 or 10 years everyone will be running 50 to 100 GHz machines.  If
you are contemplating building a language that will be useful on systems
that are thousands of times faster than current machines, designing to
optimize programmer time is a better design choice than optimizing
machine time.  This is IMO why Lisp is experiencing a revival: it is
*far* more efficient for programmer time than, say, C.

> When you start to get into serious
> volume a factor of 2 in speed can make a real difference in the bottom
> line, which will, I think, be salient in ARC's initial target market of
> spam filtering.  So I'm a little concerned about that aspect of ARC's
> design too.  In particular, I think that Paul's focus on minimalism and
> intellectual purity at the expense of speed is going to prevent ARC from
> > e.g. being able to do an AREF in O(1) time, which I think could end up
> being a serious limitation.
I'm sorry, I just disagree with this conclusion.  As time advances 
machine time is becoming less and less of a factor.  Human time on the 
other hand has had very little optimization over the years.  Look at 
Visual Basic for example.  It was a shockingly simplistic language, with 
terrible optimization, yet it exploded in popularity.  The reason was 
that you could build a windowed application (albeit slow) in a very 
small fraction of the time you could build it in C.

> 
> Paul's response to this is that efficient compilation is an orthogonal
> concern, and I largely agree with this.  However, Paul's method for
> writing the language specification is to write what is essentially a
> reference implementation.  The problem is that a reference implementation
> doesn't include any information about which aspects of the behavior of the
> reference implementation are essential and which ones merely
> coincidental.  (John McCarthy pointed this out during Paul's
> presentation.)  By the time you get done building both numbers and arrays
> out of lists it may be very hard to back a compiler optimization out of
> the spec that will allow O(1) arefs.  And even if you succeed you've now spent
> a lot of effort solving a purely artificial problem; what the net benefit
> would be is still unclear to me.  Paul seems to take it on faith that if
> you keep things simple then Good Things will follow.  I don't necessarily
> disagree, but the problem is that keeping things simple can result in an
> "impedance mismatch" with a complicated world.

I think he is designing for the complicated world.  I have read a great 
many of his articles, and specifically his writing is what brought me to 
Lisp.  His main focus is on making people more productive, and allowing 
programmers to work efficiently with high level concepts.  Seriously 
compiler optimization is nice, but ultimately spending one month on a 
problem that will reduce your run time by an hour is only useful if the 
time savings will be more than the invested time over the long haul.

I think this is where technology loses frequently.  From a business 
point of view, machine optimization is really a very low cost compared 
to human time.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031352220001@k-137-79-50-101.jpl.nasa.gov>
In article <························@news.easynews.com>, Doug Tolton
<····@nospam.com> wrote:

> > One of the premises of ARC's design is that speed doesn't matter because
> > today's machines are so fast.  I don't accept this premise.  Speed only
> > doesn't matter if you're not doing production work.  If you're doing
> > production work speed does matter because as soon as you start to process
> > in volume your machine costs are directly proportional to your speed.  If
> > you can process customer requests twice as fast you only need half as many
> > machines.  
> 
> Sorry, I very much disagree with this perspective.  I do a *great* 
> amount of production work, and very few problems require more machine 
> speed.  The largest cost of owning a machine and solving most business 
> problems do not relate to machine speed, they are directly tied to 
> programmer or operator productivity.

I think this depends on what kind of business you're in.  In a marginal
business, like most Internet businesses are nowadays, this is probably
true.  But once you start to achieve real success on the scale of a Google
or an Ebay then speed starts to matter more.  Speed increases of a factor
of 2 can translate into annual savings running into many millions of
dollars.  Even by Google's standards that's real money.


> Processing a customer request is almost always directly proportional to 
> the amount of time it takes for the operator or programmer to do their 
> task.

Only if there's a human in the loop.

> > Yes, hardware is cheap, but the cost of the hardware is only a
> > tiny fraction of the TCO of a machine.  You have to find a place to put
> > it, pay for the air conditioning, pay a sysadmin to maintain it and
> > replace it every two years or so.  
> 
> If this is true, why don't you write everything in Assembly?  Because it 
> will take 50 times as long to get it done.

At some point in any optimization process you reach a point of diminishing
returns.  But that does not mean that efficiency is irrelevant.  Certainly
developer time matters also.  But it's not the only thing that matters.

> I'm sorry, I just disagree with this conclusion.  As time advances 
> machine time is becoming less and less of a factor.

Then why does Google's server farm keep growing instead of shrinking?

E.
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <2RYkb.6448849$cI2.918973@news.easynews.com>
Erann Gat wrote:

> In article <························@news.easynews.com>, Doug Tolton
> <····@nospam.com> wrote:
> 
> 
>>>One of the premises of ARC's design is that speed doesn't matter because
>>>today's machines are so fast.  I don't accept this premise.  Speed only
>>>doesn't matter if you're not doing production work.  If you're doing
>>>production work speed does matter because as soon as you start to process
>>>in volume your machine costs are directly proportional to your speed.  If
>>>you can process customer requests twice as fast you only need half as many
>>>machines.  
>>
>>Sorry, I very much disagree with this perspective.  I do a *great* 
>>amount of production work, and very few problems require more machine 
>>speed.  The largest cost of owning a machine and solving most business 
>>problems do not relate to machine speed, they are directly tied to 
>>programmer or operator productivity.
> 
> 
> I think this depends on what kind of business you're in.  In a marginal
> business, like most Internet businesses are nowadays, this is probably
> true.  But once you start to achieve real success on the scale of a Google
> or an Ebay then speed starts to matter more.  Speed increases of a factor
> of 2 can translate into annual savings running into many millions of
> dollars.  Even by Google's standards that's real money.
> 
I'll concede this.  I do think optimization is important, though
sometimes I think it is overemphasized.  In all honesty the reason I like
Lisp over say Python is because I have the best of both worlds.
Unbelievable development environment coupled with blazingly fast
natively compiled code.

I think that's part of why he is striving for minimalism though, in
order to make re-implementation on different platforms / hardware
easier.  Although this is pure speculation on my part.
> 
> 
>>Processing a customer request is almost always directly proportional to 
>>the amount of time it takes for the operator or programmer to do their 
>>task.
> 
> 
> Only if there's a human in the loop.
> 
Very true.  This consideration is far less important for something like
a web site.
> 
>>>Yes, hardware is cheap, but the cost of the hardware is only a
>>>tiny fraction of the TCO of a machine.  You have to find a place to put
>>>it, pay for the air conditioning, pay a sysadmin to maintain it and
>>>replace it every two years or so.  
>>
>>If this is true, why don't you write everything in Assembly?  Because it 
>>will take 50 times as long to get it done.
> 
> 
> At some point in any optimization process you reach a point of diminishing
> returns.  But that does not mean that efficiency is irrelevant.  Certainly
> developer time matters also.  But it's not the only thing that matters.
> 
Agreed
> 
>>I'm sorry, I just disagree with this conclusion.  As time advances 
>>machine time is becoming less and less of a factor.
> 
> 
> Then why does Google's server farm keep growing instead of shrinking?
> 
I don't know enough of Google's internals to comment on this.  Since you
worked there, I'd assume you are in a better position than me to answer
this question.  If I were to hazard a guess though, it would be that
they are doing more with their farm.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <vOXkb.14177$pT1.3430@twister.nyc.rr.com>
Doug Tolton wrote:

> Erann Gat wrote:
> 
>> In article
>> <·······································@k-137-79-50-101.jpl.nasa.gov>,
>> ······················@jpl.nasa.gov (Erann Gat) wrote:
>>
>>
>>> I couldn't find a good place in the ARC thread to hang this comment so I
>>> just decided to punt and start a new thread.
>>
>>
>>
>> Just thought of something else:
>>
>> One of the premises of ARC's design is that speed doesn't matter because
>> today's machines are so fast.  I don't accept this premise.  Speed only
>> doesn't matter if you're not doing production work.  If you're doing
>> production work speed does matter because as soon as you start to process
>> in volume your machine costs are directly proportional to your speed.  If
>> you can process customer requests twice as fast you only need half as 
>> many
>> machines.  
> 
> 
> Sorry, I very much disagree with this perspective.  I do a *great* 
> amount of production work, and very few problems require more machine 
> speed.  The largest cost of owning a machine and solving most business 
> problems do not relate to machine speed, they are directly tied to 
> programmer or operator productivity.

I know what you mean, but as you conceded later, things might be 
different for non-business apps. They are. Business apps are always 
waiting on I/O (financial number crunchers aside). I guess a lot of web 
apps are always waiting on sockets or something, on top of any I/O they 
might have to do.

But one of the things I find exciting is that no matter how incredibly 
fast these things have become relative to my 16K, 700 kHz beginnings on 
an Apple II, folks' imaginations have managed to stay one step ahead in 
thinking of ways to max out their boxes. GUIs could have groovy 3D 
widgets rendered from scratch in real time, music software could 
generate, compile and pipe out MIDI in real time, animations could be 
rendered in real time. I am guessing that working my system by voice 
could be made workable if enough crunching were possible.

Mind you, a slow Arc would still be a huge win over Java at any speed, but if 
Arc cannot in principle be made as fast as Lisp, it leaves the door open 
to a younger, faster tiger to take it out.

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Rayiner Hashem
Subject: Re: My take on ARC
Date: 
Message-ID: <bn1vnj$m5m$1@news-int.gatech.edu>
> GUIs could have groovy 3D
> widgets rendered from scratch in real time, music software could
> generate, compile and pipe out MIDI in real time, animations could be
> rendered in real time. I am guessing that working my system by voice
> could be made workable if enough crunching were possible.
To play devil's advocate:

How much programmer productivity are you willing to give up in order to make
it so that you can still code realtime GUI's in the language? 
From: Matt Curtin
Subject: Re: My take on ARC
Date: 
Message-ID: <867k2y21xy.fsf@rowlf.interhack.net>
Doug Tolton <····@nospam.com> writes:

> Sorry, I very much disagree with this perspective.  I do a *great*
> amount of production work, and very few problems require more machine
> speed.  The largest cost of owning a machine and solving most business
> problems do not relate to machine speed, they are directly tied to
> programmer or operator productivity.

As usual, we're led to an issue of the typical usage.  In my case,
speed matters because it's not just a question of programmer
productivity, but also of run-time productivity.  When performing
searches (as one would in electronic discovery or other forensic
analysis, for example) over large data sets (say, hundreds of
gigabytes), the difference between waiting a few minutes and waiting
an hour can be hundreds of dollars.

Granted, there are other things besides CPU speed in play here (most
obviously, I/O), but CPU speed does make a difference (along with
boatloads of memory) if you optimize with the understanding that I/O
is already limited.

The question is not one of absolute conditions, but of optimal
conditions.  Both development time and run time need to be "fast
enough" to satisfy the outside constraints, i.e., market conditions.
Otherwise, you lose and your code is irrelevant.

(In my case, compiled CMUCL is plenty fast.  Actually, it has been
rare that CLISP wasn't fast enough, too.)

-- 
Matt Curtin, CISSP, IAM, INTP.  Keywords: Lisp, Unix, Internet, INFOSEC.
Founder, Interhack Corporation +1 614 545 HACK http://web.interhack.com/
Author of /Developing Trust: Online Privacy and Security/ (Apress, 2001)
From: Bob Coyne
Subject: Re: My take on ARC
Date: 
Message-ID: <3F953BD1.36884641@worldnet.att.net>
Doug Tolton wrote:

> Erann Gat wrote:
>
> > One of the premises of ARC's design is that speed doesn't matter because
> > today's machines are so fast.  I don't accept this premise.  Speed only
> > doesn't matter if you're not doing production work.  If you're doing
> > production work speed does matter because as soon as you start to process
> > in volume your machine costs are directly proportional to your speed.  If
> > you can process customer requests twice as fast you only need half as many
> > machines.
>
> Sorry, I very much disagree with this perspective.  I do a *great*
> amount of production work, and very few problems require more machine
> speed.  The largest cost of owning a machine and solving most business
> problems do not relate to machine speed, they are directly tied to
> programmer or operator productivity.

It depends a lot on the type of application. Computer graphics (and numerically
intensive applications in general) require fast execution. It doesn't matter how
fast the programmer can write the software if the execution speed of the software
is slow -- no one will want to use/buy the software.
From: Michael Sullivan
Subject: Re: My take on ARC
Date: 
Message-ID: <1g36vj3.3bhrg972mdl0N%michael@bcect.com>
Doug Tolton <····@nospam.com> wrote:

> Erann Gat wrote:
> 
> > In article
> > <·······································@k-137-79-50-101.jpl.nasa.gov>,
> > ······················@jpl.nasa.gov (Erann Gat) wrote:
> > 
> > 
> >>I couldn't find a good place in the ARC thread to hang this comment so I
> >>just decided to punt and start a new thread.
> > 
> > 
> > Just thought of something else:
> > 
> > One of the premises of ARC's design is that speed doesn't matter because
> > today's machines are so fast.  I don't accept this premise.  Speed only
> > doesn't matter if you're not doing production work.  If you're doing
> > production work speed does matter because as soon as you start to process
> > in volume your machine costs are directly proportional to your speed.  If
> > you can process customer requests twice as fast you only need half as many
> > machines.  
> 
> Sorry, I very much disagree with this perspective.  I do a *great* 
> amount of production work, and very few problems require more machine
> speed.  The largest cost of owning a machine and solving most business
> problems do not relate to machine speed, they are directly tied to 
> programmer or operator productivity.
> 
> Processing a customer request is almost always directly proportional to
> the amount of time it takes for the operator or programmer to do their
> task.  Rarely is the computation portion the significant determining
> factor.  Perhaps this is a bigger deal at NASA, but it is just not the
> case in the business world.  If it were the case, why are people not 10
> to 20 times more efficient now than they were 5 to 7 years ago when we
> were running on 300 MHz machines?  Because computational efficiency
> doesn't translate very well to overall efficiency.

Wrong. 

I also do a *great* amount of production work.  The reason that office
machines are no more efficient than they were years ago is *not* that
speed gains in computation don't matter, but that they are doing *so*
much extraneous computation. 

A mindset has developed among programmers and testers of mass market
software that "X" time for certain kinds of operations is simply
acceptable because that's how much time it's always taken.  So every
time a new OS or database version comes out-- that's exactly how much
time they've given over to cruft and feature bloat nonsense.  It still
takes a half dozen ticks to get a typical file dialog box open on the
fastest mac hardware today, just as it did with the fastest mac hardware
available in 1989 running system 6.  Why?  Because the system is trying
to do 8 million things today that it didn't do then.  

Now making this be utterly instantaneous wouldn't save *that* much time
for my production quark/PS/Illy operators, but it would save some.
Maybe as much as 2-3%.  1/3 of that goes to the bottom line.  The point
is -- unless I want to keep running old versions of everything that I use, I
can't seem to get many of those speed gains.  I estimate that if
everything that I could do 10 years ago ran on my machine at the same
speed relative to the processor speed as it did 10 years ago, with no
other changes, my typesetting and graphics employees would be able
to turn out about 15% more work.  But there is no machine available that
would do this, even though if I *could* install some very old OS on a
current mac (and run current versions of software, etc.), it might.

I know that there are a pile of algorithms in current, bog-standard
software that use O(N) where they could/should be O(1), and N^2 where
they should be N, or that carry some very high constant overhead, and these
things are there because of sloppy programming and design decisions
just like the one mentioned here.

Applescript has no arrays or true hash tables.  What's the number one
feature request on the applescript list?  Hash arrays.  Why?  Because
lists are too bloody slow to search when things get big.  So there has
historically been a lot of space on the AS lists devoted to hacks that
get around this problem. 
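
(The difference in question, written out in Lisp rather than
AppleScript just to show the shape of it -- a toy sketch, not anything
from the AS implementation:)

  ;; Lookup in a list of key/value pairs: scans the list -- O(N).
  (defun alist-lookup (key alist)
    (cdr (assoc key alist :test #'equal)))

  ;; Lookup in a hash table: amortized constant time -- O(1).
  (defvar *table* (make-hash-table :test #'equal))
  (defun table-lookup (key)
    (gethash key *table*))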

Now admittedly, AS is completely interpreted, and if it were compiled
the point at which this O(N) behavior became a problem would be a lot
later.  But not so much so that you wouldn't see a lot of people trying
to hack around it.  And this is a language that is specifically designed
to be primarily job control glue for scriptable off-the-rack apps,
rather than a production general purpose language.

> > Yes, hardware is cheap, but the cost of the hardware is only a
> > tiny fraction of the TCO of a machine.  You have to find a place to put
> > it, pay for the air conditioning, pay a sysadmin to maintain it and
> > replace it every two years or so.  

> If this is true, why don't you write everything in Assembly?  Because it
> will take 50 times as long to get it done.  At the rate things are going
> in 5 or 10 years everyone will be running 50 to 100 ghz machines.  If
> you are contemplating building a language that will be useful on systems
> that are thousands of times faster than current machines, designing to
> optimize programmer time is a better design choice than optimizing 
> machine time.  This is IMO why Lisp is experiencing a revival, it is 
> *far* more efficient for programmer time than say C.

But existing lisps don't ask you to make ridiculous order of magnitude
speed tradeoffs in order to get that.  Applescript is also a lot more
efficient for programmer time than C.  But some common projects end up
being so slow, that it's worth slogging through C to write them, or
maybe it's not worth writing them at all, if doing them in C or lisp
involves a lot of complication in the communication.

> > When you start to get into serious
> > volume a factor of 2 in speed can make a real difference in the bottom
> > line, which will, I think, be salient in ARC's initial target market of
> > spam filtering.  So I'm a little concerned about that aspect of ARC's
> > design too.  In particular, I think that Paul's focus on minimalism and
> > intellectual purity at the expense of speed is going to prevent ARC from
> > e.g. being able to do an AREF in O(1) time, which I think could end up
> > being a serious limitation.

> I'm sorry, I just disagree with this conclusion.  As time advances 
> machine time is becoming less and less of a factor. 

Less and less, but it's still a factor.  Constant time optimizations
have become nearly meaningless, but order of magnitude optimizations are
still important unless you only work on fairly small data sets or simple
(computationally) problems.

if I need to write an NlogN algorithm that's dependent on getting those
aref's in O(1) time, but they are O(N) time, suddenly I'm looking at my
speed curve getting ugly a factor of SQRT(N) sooner than it otherwise
would.  If my
algorithm could have handled a million record data set in "reasonable"
time given an O(1) aref, suddenly it can only handle a 1000 record data
set in "reasonable" time given an O(N) aref.  That's a *monstrous*
problem for a production app that you might wish to call up for large
data sets.

The point Erann is making (I think) is that this kind of intellectual
purity doesn't necessarily reduce programmer time.  In fact, I think it
will *increase* it, because every time a programmer really needs that
O(1) aref in order to do something acceptably fast, s/he'll be
greenspunning left and right.  

You need to be *really sure* that such a thing will either hardly ever
come up, or that the programmer time saving is very valuable before that
kind of tradeoff makes sense (IMO).

It's hard for me to imagine that implementing arrays as lists is really
that efficient at saving programmer time.  Unless your goal is to appeal
primarily to novice or non-programmers -- can't it be assumed that the
user knows about arrays and lists and what their basic patterns of
access/structure look like?



Michael
From: Rayiner Hashem
Subject: Re: My take on ARC
Date: 
Message-ID: <bn1uv3$sml$1@news-int2.gatech.edu>
About speed:

I think that speed is no less important today than it was a decade ago.
Sure, machines have gotten faster, but the tasks they need to do have
gotten more complex at the same pace. When a new general-purpose language
comes out, people's speed concerns aren't addressed by machines getting
faster, but by the implementation of the new language getting faster to the
point where it is close to the speed of the prevalent languages of the
time.

For example:
        - Performance-conscious developers moved from ASM to C because C compilers
          got so good (and processors got so complex) that optimizing compilers
          could beat all but the most carefully-crafted ASM. 
        - For a long time, C++ had a reputation for generating slow/bloated code
          because compilers at the time did mostly code-generation rather than
          abstraction-elimination optimizations. Even today, C++ is only just now
          penetrating the numerical computing market, because advanced compilers
          have almost eliminated the performance difference between it and FORTRAN.
        - While Java still fights the "slow" label, most of that is from slow
          libraries (Swing) rather than language-level stuff. People only stopped
          complaining about the latter when highly-optimizing JIT's eliminated
          most of the performance loss from the VM.

To most people "it's fast enough" sounds like a cop-out. Lisp advocates can
shut people up about the speed of Lisp these days not because machines are
so fast (after all, even with todays fast machines, most software is too
slow!) but because Lisp has gotten very fast. If Arc is initially slow,
then it may suffer from the same problem. That might limit its scope to
web/network apps, a market where speed is perceived to be less important. 

A lot of the performance concerns come down to how different Arc really is.
If the machine model isn't all that different from CL/Scheme/Dylan, then
performance can be very good, because there is a large body of compiler
technology available for that model. So stuff like generic functions,
dynamic types, closures, etc, will be fast. However, if it goes the way of
Python in some ways (for example, being able to dynamically add fields to
objects), performance will suffer.
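
(To illustrate the kind of cliff I mean, in Common Lisp terms since
that's what's at hand -- my framing, not a claim about how Arc will
actually work:)

  ;; Fixed slots: accessors can compile down to constant-offset loads.
  (defstruct point x y)
  (defun norm2 (p)
    (+ (* (point-x p) (point-x p))
       (* (point-y p) (point-y p))))

  ;; Fields added at runtime: every access is a hash lookup that the
  ;; compiler cannot turn into an offset.
  (defun make-dyn-point (x y)
    (let ((obj (make-hash-table)))
      (setf (gethash 'x obj) x
            (gethash 'y obj) y)
      obj))
  (defun dyn-norm2 (p)
    (+ (* (gethash 'x p) (gethash 'x p))
       (* (gethash 'y p) (gethash 'y p))))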
From: Damien R. Sullivan
Subject: Re: My take on ARC
Date: 
Message-ID: <bnaf6o$oi4$1@hood.uits.indiana.edu>
······················@jpl.nasa.gov (Erann Gat) wrote:

>One of the premises of ARC's design is that speed doesn't matter because
>today's machines are so fast.  I don't accept this premise.  Speed only

Gah.  I hate that premise.  Probably because most of the programming I've done
in the last five years has been CPU intensive.  Ironically, the model building
software wasn't the key culprit; when you're competing with hand-crafted
models which take months to build taking twice as many hours doesn't matter so
much.  Fitting into memory mattered.

But then we did web services stuff, with a max turnaround of a quarter of a
second.  Then I went to grad school and got an algorithms project of
calculating a number to as many digits as possible in a minute, on a specified
machine.  Obviously the main focus was algorithms, but the compiler mattered too.
(I ended up developing in Haskell and then sticking with it because I didn't
care enough to be competitive... I'd started in ocaml, but ghc turned out to
be faster.  Probably because it was using a better bignums library.)

And now I'm in AI, which I figure can suck CPU like anything.  Even if I got
to the point where a single run didn't take much time that would just open the
door to doing lots more statistics of multiple runs, and parameter tweaking.
And we don't have lots of money to throw at hardware.  Of course I need to get
my PhD sometime too, so development time matters as well.  So Lisp vs. C++ and
Java can make sense.  But not something which didn't compile to machine code,
or which used O(N) instead of O(1)...

-xx- Damien X-) 
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9c161f.18471719@news.eircom.net>
On Mon, 20 Oct 2003 11:50:16 -0700,
······················@jpl.nasa.gov (Erann Gat) wrote:

>One of the premises of ARC's design is that speed doesn't matter because
>today's machines are so fast.  I don't accept this premise.  Speed only
>doesn't matter if you're not doing production work. If you're doing
>production work speed does matter because as soon as you start to process
>in volume your machine costs are directly proportional to your speed.

For 90% of production work, machine costs are so overwhelmingly
dominated by human costs (and rent, bandwidth etc) that they're lost
in the noise.

Users don't care about machine efficiency. Most programmers don't care
about it either, and they're right. Saving clock cycles won't put a
penny in your paycheck. Saving development time will.

Us hardcore geeks are the ones who care about machine efficiency, and
that's because we grew up having to squeeze things into 16k of RAM;
fettered limbs grow lame. It's hard to overcome this sort of mental
crippling, but let's at least recognize it for what it is: a mental
illness induced by childhood trauma.

Now, maybe you're in the 10% - you're at NASA, and I'm quite prepared
to believe they're among the few organizations who need all the CPU
power they can get and then some - in which case the above doesn't
apply to you. But no language can appeal to everyone, and the
cycle-counting crowd are catered for by zillions of languages already;
targeting the other 90% of the market is an eminently sensible thing
to do.

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Bob Coyne
Subject: Re: My take on ARC
Date: 
Message-ID: <3F9C2569.C40BE121@worldnet.att.net>
Russell Wallace wrote:

> On Mon, 20 Oct 2003 11:50:16 -0700,
> ······················@jpl.nasa.gov (Erann Gat) wrote:
>
> >One of the premises of ARC's design is that speed doesn't matter because
> >today's machines are so fast.  I don't accept this premise.  Speed only
> >doesn't matter if you're not doing production work. If you're doing
> >production work speed does matter because as soon as you start to process
> >in volume your machine costs are directly proportional to your speed.
>
> For 90% of production work, machine costs are so overwhelmingly
> dominated by human costs (and rent, bandwidth etc) that they're lost
> in the noise.
>
> Users don't care about machine efficiency. Most programmers don't care
> about it either, and they're right. Saving clock cycles won't put a
> penny in your paycheck. Saving development time will.
>
> Us hardcore geeks are the ones who care about machine efficiency, and
> that's because we grew up having to squeeze things into 16k of RAM;
> fettered limbs grow lame. It's hard to overcome this sort of mental
> crippling, but let's at least recognize it for what it is: a mental
> illness induced by childhood trauma.
>
> Now, maybe you're in the 10% - you're at NASA, and I'm quite prepared
> to believe they're among the few organizations who need all the CPU
> power they can get and then some - in which case the above doesn't
> apply to you. But no language can appeal to everyone, and the
> cycle-counting crowd are catered for by zillions of languages already;
> targeting the other 90% of the market is an eminently sensible thing
> to do.

Complex problems tend to require lots of cpu power.  Since Lisp
(and presumably ARC) is well suited to complex problems, I think
it's a mistake to assume that it fits into the 90% range (even assuming
that figure is accurate).

Also, I disagree with your general point that the need for cpu cycles
is a thing of the past.  Each version of MSoft's new operating systems
seems to require more memory and faster cpu's.  Is there any reason to
expect that will stop?  And another example: computer graphics and
video games are avid consumers of cpu power (and memory). This is a
huge and growing market.  More and more data needs to be visualized and
presented visually.  That all takes a lot of cpu power.  So I don't see the need
for processing power going away any time soon (other than for stagnant/solved
areas).  Crippling a language, by assuming performance isn't important, seems
to me to be a big mistake.

- Bob
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <UlWmb.8914$Gq.3766104@twister.nyc.rr.com>
Bob Coyne wrote:

> Russell Wallace wrote:
> 
> 
>>On Mon, 20 Oct 2003 11:50:16 -0700,
>>······················@jpl.nasa.gov (Erann Gat) wrote:
>>
>>
>>>One of the premises of ARC's design is that speed doesn't matter because
>>>today's machines are so fast.  I don't accept this premise.  Speed only
>>>doesn't matter if you're not doing production work. If you're doing
>>>production work speed does matter because as soon as you start to process
>>>in volume your machine costs are directly proportional to your speed.
>>
>>For 90% of production work, machine costs are so overwhelmingly
>>dominated by human costs (and rent, bandwidth etc) that they're lost
>>in the noise.
>>
>>Users don't care about machine efficiency. Most programmers don't care
>>about it either, and they're right. Saving clock cycles won't put a
>>penny in your paycheck. Saving development time will.
>>
>>Us hardcore geeks are the ones who care about machine efficiency, and
>>that's because we grew up having to squeeze things into 16k of RAM;
>>fettered limbs grow lame. It's hard to overcome this sort of mental
>>crippling, but let's at least recognize it for what it is: a mental
>>illness induced by childhood trauma.
>>
>>Now, maybe you're in the 10% - you're at NASA, and I'm quite prepared
>>to believe they're among the few organizations who need all the CPU
>>power they can get and then some - in which case the above doesn't
>>apply to you. But no language can appeal to everyone, and the
>>cycle-counting crowd are catered for by zillions of languages already;
>>targeting the other 90% of the market is an eminently sensible thing
>>to do.
> 
> 
> Complex problems tend to require lots of cpu power.  

Yes, such as generating in real time the synthesized audio (not MIDI) 
for the entire Philharmonic orchestra accompanied by the Mormon 
Tabernacle choir, where the third tenor from the left, second row had a 
stuffy nose.

:)

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Scott McKay
Subject: Re: My take on ARC
Date: 
Message-ID: <i8_mb.37322$Fm2.16092@attbi_s04>
"Bob Coyne" <········@worldnet.att.net> wrote in message
······················@worldnet.att.net...

> Also, I disagree with your general point that the need for cpu cycles
> is a thing of the past.  Each version of MSoft's new operating systems
> seems to require more memory and faster cpu's.  Is there any reason to
> expect that will stop?  And another example: computer graphics and
> video games are avid consumers of cpu power (and memory). This is a
> huge and growing market.  More and more data needs to be visualized and
> presented visually.  That all takes a lot of cpu power.  So I don't see
> the need for processing power going away any time soon (other than for
> stagnant/solved areas).  Crippling a language, by assuming performance
> isn't important, seems to me to be a big mistake.
>

I think your example argues compellingly that the raw
performance of the language is not the issue.  Microsoft
uses languages like C++ to implement all of this stuff,
right?  And it is still a pig in terms of performance.  The
sheer awfulness of Microsoft performance is purely a
function of poor engineering in the face of overly baroque
and complicated designs.

If Microsoft used languages that truly valued abstraction,
I bet that the overall performance of their products would
be way better, even if the languages themselves had
asymptotically poorer performance.
From: Bob Coyne
Subject: Re: My take on ARC
Date: 
Message-ID: <3F9C7567.3F890046@worldnet.att.net>
Scott McKay wrote:

> "Bob Coyne" <········@worldnet.att.net> wrote in message
> ······················@worldnet.att.net...
>
> > Also, I disagree with your general point that the need for cpu cycles
> > is a thing of the past.  Each version of MSoft's new operating systems
> > seems to require more memory and faster cpu's.  Is there any reason to
> > expect that will stop?  And another example: computer graphics and
> > video games are avid consumers of cpu power (and memory). This is a
> > huge and growing market.  More and more data needs to be visualized and
> > presented visually.  That all takes a lot of cpu power.  So I don't see
> > the need for processing power going away any time soon (other than for
> > stagnant/solved areas).  Crippling a language, by assuming performance
> > isn't important, seems to me to be a big mistake.
> >
>
> I think your example argues compelling that the raw
> performance of the language is not the issue.  Microsoft
> uses languages like C++ to implement all of this stuff,
> right?  And it is still a pig in terms of performance.  The
> sheer awfulness of Microsoft performance is purely a
> function of poor engineering in the face of overly baroque
> and complicated designs.

Raw performance at the language level is a necessary but not a
sufficient condition to achieve applications which run fast.

Microsoft can get away with poor performance in their software, but
most others can't.

> If Microsoft used languages that truly valued abstraction,
> I bet that the overall performance of their products would
> be way better, even if the languages themselves had
> asymptotically poorer performance.

Possibly, though I've seen plenty of Lisp software that runs slow even
though the language is well suited for abstraction.  So I think you want both
raw language speed and well designed software (facilitated by a language that
supports abstraction).

I'd also note that in Lisp, it's easy to create inefficient code.  This is especially
true in floating point where boxing floats (in the absence of the proper declarations
and care taken with respect to argument passing) eats up lots of cpu cycles unnecessarily.
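
(A standard illustration of the float-boxing point, with the caveat
that how much the declarations buy you is implementation-dependent:)

  ;; Without declarations, intermediate double-floats may all be boxed.
  (defun sum-squares-slow (v)
    (reduce #'+ (map 'vector #'(lambda (x) (* x x)) v)))

  ;; With type and optimize declarations, a good compiler can keep the
  ;; floats unboxed in registers.
  (defun sum-squares-fast (v)
    (declare (type (simple-array double-float (*)) v)
             (optimize (speed 3) (safety 0)))
    (let ((sum 0d0))
      (declare (type double-float sum))
      (dotimes (i (length v) sum)
        (incf sum (* (aref v i) (aref v i))))))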
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9c6c44.40527453@news.eircom.net>
On Sun, 26 Oct 2003 19:51:04 GMT, Bob Coyne
<········@worldnet.att.net> wrote:

>Complex problems tend to require lots of cpu power.

Depends what you mean by "complex". If you mean "requiring lots of
complex calculations", then yes; but there are already plenty of
languages designed to handle these. However, most of the world's
programming resources are spent on problems that could be called
complex in a different sense: not needing a lot of internal
calculation, but messy, poorly specified, with requirements that map
badly onto hardware and existing languages, and change faster than
people can implement them. If you've a better language for dealing
with problems of this type, then to most people it won't matter if it
"wastes" machine resources.

>Also, I disagree with your general point that the need for cpu cycles
>is a thing of the past.  Each version of MSoft's new operating systems
>seems to require more memory and faster cpu's.

And each version of Intel's new chips has more power to run them, so
there's no problem ^.^

>And another example: computer graphics and
>video games are avid consumers of cpu power (and memory). This is a
>huge and growing market.  More and more data needs to be visualized and
>presented visually.  That all takes a lot of cpu power.

Yep, graphics engines are in the 10%. Note however that game logic
nowadays is often written in interpreted scripting languages. If they
could be persuaded to use Arc instead of a homebrew, a few more
incidents of greenspunning would be avoided.

>Crippling a language, by assuming performance isn't important, seems
>to me to be a big mistake.

Again, a minority of us hardcore geeks are the only people who think
slow execution speed cripples a language.

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Bob Coyne
Subject: Re: My take on ARC
Date: 
Message-ID: <3F9C78C8.B6D4658E@worldnet.att.net>
Russell Wallace wrote:

> On Sun, 26 Oct 2003 19:51:04 GMT, Bob Coyne
> <········@worldnet.att.net> wrote:
>
> >Complex problems tend to require lots of cpu power.
>
> Depends what you mean by "complex". If you mean "requiring lots of
> complex calculations", then yes; but there are already plenty of
> languages designed to handle these. However, most of the world's
> programming resources are spent on problems that could be called
> complex in a different sense: not needing a lot of internal
> calculation, but messy, poorly specified, with requirements that map
> badly onto hardware and existing languages, and change faster than
> people can implement them. If you've a better language for dealing
> with problems of this type, then to most people it won't matter if it
> "wastes" machine resources.

By complex I mean tasks that require a larger body of software -- the
opposite of one-line programs.  Such systems usually require more cpu
power than smaller systems, if only because they do more, which increases
the chances that some parts of them will be performance critical.

> >Also, I disagree with your general point that the need for cpu cycles
> >is a thing of the past.  Each version of MSoft's new operating systems
> >seem to require more memory and faster cpu's.
>
> And each version of Intel's new chips has more power to run them, so
> there's no problem ^.^

But the point is that there's always a need for more cpu power.  It'll
always be gobbled up, and Microsoft (and many others) have done quite
a job of gobbling it up.  And again, graphics and other multimedia
applications can consume an effectively unlimited amount of it.  And
these techniques will likely carry over into user interfaces more
generally, so they'll grow in importance.

> >And another example: computer graphics and
> >video games are avid consumers of cpu power (and memory). This is a
> >huge an growing market.  More and more data needs to be visualized and
> >presented visually.  That all takes a lot of cpu power.
>
> Yep, graphics engines are in the 10%. Note however that game logic
> nowadays is often written in interpreted scripting languages. If they
> could be persuaded to use Arc instead of a homebrew, a few more
> incidents of greenspunning would be avoided.

And game logic will get more and more complex as the AI does more
and more.  I don't see any limits on the amount of cpu power that could
be consumed.  And this goes for user interfaces in general.  All that data
that has to be presented, accepted, and interpreted requires high performance.

> >Crippling a language, by assuming performance isn't important, seems
> >to me to be a big mistake.
>
> Again, a minority of us hardcore geeks are the only people who think
> slow execution speed cripples a language.

But I'm definitely not a hardcore geek!  I just want hardware and software
that runs fast so it can do more. :-)
From: Michael Sullivan
Subject: Re: My take on ARC
Date: 
Message-ID: <1g3hn2b.14k77wgdebbglN%michael@bcect.com>
Russell Wallace <················@eircom.net> wrote:

> On Sun, 26 Oct 2003 19:51:04 GMT, Bob Coyne
> <········@worldnet.att.net> wrote:

> >Crippling a language, by assuming performance isn't important, seems
> >to me to be a big mistake.
 
> Again, a minority of us hardcore geeks are the only people who think
> slow execution speed cripples a language.

Again -- it depends on just how slow you mean -- also what you mean by
"cripple".  IMO, for a general purpose language, doing something which
makes it absolutely infeasible to use for 10-20% of problems might
reasonably be considered crippling.

Remember, we're not talking about constant time overhead that removes
low-level concerns.  In the example it was "everything is implemented as
a list" which means that a whole lot of things are going to have their
algorithmic performance slowed down by a factor of *N*.  That puts a
limit on the size of your working datasets.  
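
To make that concrete (a toy illustration, not anything from Arc):

;; If "everything is a list", NTH has to walk from the head each time,
;; so this loop is O(n^2) over the whole sequence.
(defun sum-by-index-list (lst)
  (loop for i below (length lst)
        sum (nth i lst)))

;; With a real array, AREF is O(1) and the same loop is O(n).
(defun sum-by-index-vector (vec)
  (loop for i below (length vec)
        sum (aref vec i)))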

For a "beginner" language, that's not necessarily a horrible trade-off.
A novice programmer can have a lot of time saved by making their
language ridiculously simple to learn, and is less likely to be tackling
large problems.  But ARC's design goal is specifically to be a language
for good programmers.  Good programmers *will* tackle large problems,
and (other than for Q&D tasks) they will refuse to use a language where
a standard set of data-abstractions is not available, and at least
nominally efficient.

A small constant factor can be made up by shorter development time on
all but the most computationally intensive projects.  Adding an N in
there where it shouldn't be will matter even on computationally simple
problems as soon as your data sets get moderately large.  It seems like
a really bad move to build this into your general purpose language
unless it will make a proven *large* difference in development time, and
I can't see how this will.  It's really hard for me to believe that
someone as smart and savvy as Paul Graham doesn't realize this.

The only excuse I can imagine is that he has some implementation tricks
up his sleeve to make lists behave more like arrays, and that if they
don't pan out, arrays would get into the language.  IOW, the algorithmic
profile is a concern, but he's just not worrying about it at this very
early development stage.

That's understandable.  A long term plan to not bother with O(1) aref is
not (IMO) reasonable for a general purpose language.  That would put
*way* too many normal, not particularly cpu intensive, problems out of
its reach.



Michael
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9e9bbb.26073375@news.eircom.net>
On Mon, 27 Oct 2003 18:51:53 -0500, ·······@bcect.com (Michael
Sullivan) wrote:

>Remember, we're not talking about constant time overhead that removes
>low-level concerns.  In the example it was "everything is implemented as
>a list" which means that a whole lot of things are going to have their
>algorithmic performance slowed down by a factor of *N*.  That puts a
>limit on the size of your working datasets.  

Nobody is proposing to _implement_ arrays as linked lists. What Paul
Graham is proposing to do is define the _semantics_ of them in terms
of lists. The slowdown for doing this is likely to be a small constant
overhead, and my argument is that that sort of overhead doesn't
matter.

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <spammers_must_die-2810031021430001@k-137-79-50-101.jpl.nasa.gov>
In article <·················@news.eircom.net>,
················@eircom.net (Russell Wallace) wrote:

> On Mon, 27 Oct 2003 18:51:53 -0500, ·······@bcect.com (Michael
> Sullivan) wrote:
> 
> >Remember, we're not talking about constant time overhead that removes
> >low-level concerns.  In the example it was "everything is implemented as
> >a list" which means that a whole lot of things are going to have their
> >algorithmic performance slowed down by a factor of *N*.  That puts a
> >limit on the size of your working datasets.  
> 
> Nobody is proposing to _implement_ arrays as linked lists.

Actually, my understanding is that that's exactly what he's proposing to do.

Here's a quote from one of Paul's papers:

> Most data structures exist because of speed.  For example, many languages
> today have both strings and lists.  Semantically, strings are more or less
> a subset of lists in which the elements are characters.  So why do you
> need a separate data type? You don't, really.  Strings only exist for
> efficiency.  But it's lame to clutter up the semantics of the language
> with hacks to make programs run faster. Having strings in a language seems
> to be a case of premature optimization. 

I predict that sooner or later Paul is going to run headlong into the fact
that lists and arrays, while they appear to be superficially similar, are
actually fundamentally different.  For example, consider the following
snippets:

(defun iterate-over-string-with-car-cdr (string)
  (if (null string)
     (values)
     (progn (do-something (car string))
            (iterate-over-string-with-car-cdr (cdr string)))))

(defun iterate-over-string-with-numerical-index (string)
  (dotimes (i (length string))
    (do-something (nth i string)))
  (values))

Proving that these two pieces of code do the "same thing" is quite
challenging.  And with a minor tweak:

(defun do-something-mysterious-with-a-string (string)
  (dotimes (i (length string))
    (do-something (nth (f i) string)))
  (values))

it turns into the halting problem.  So while it might work for special
cases, it is impossible to fully automate the process of optimizing code
using CAR/CDR as primitives into equivalent code using AREF as a
primitive.  This is a reflection of a Fundamental Truth: numbers are an
incredibly rich computational abstraction (computationally complete in
fact) and being able to do an AREF in O(1) time is an incredibly powerful
optimization and cannot be discarded without paying a steep price.  No
matter how fast our machines get, the difference between O(1) and O(n)
will always be significant.

(The difference between CAR/CDR and AREF is precisely the difference
between sequential and random access memory, Turing Machines and Random
Access Machines.  There's a reason bubble memory never caught on.)

E.
From: Shiro Kawai
Subject: Re: My take on ARC
Date: 
Message-ID: <1bc2f7b2.0310281832.5495a49b@posting.google.com>
·················@jpl.nasa.gov (Erann Gat) wrote in message news:<··································@k-137-79-50-101.jpl.nasa.gov>...

> I predict that sooner or later Paul is going to run headlong into the fact
> that lists and arrays, while they appear to be superficially similar, are
> actually fundamentally different.  For example, consider the following

As far as I remember, he mentioned that you could assign a
data tag to a structure, which would give a hint to the
compiler or the runtime system for optimization.
And I also assume he doesn't force everybody to use just the
"core" primitives, but leaves other stuff to a library.

So, what I understand is that the language could have
something like sequence-ref in its bundled library:

  (sequence-ref data index)

And if the data has a tag that indicates the underlying structure
allows O(1) access, the library can take a shortcut.
So if the algorithm is better expressed by indexed access
rather than sequential access, the application programmer
will use such library functions and count on either the
implementation doing a good optimization or the runtime profiler
telling them they're doing it in a bad way.
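
A toy CL sketch of how such a SEQUENCE-REF might take that shortcut
(purely hypothetical, of course):

(defun sequence-ref (data index)
  ;; Dispatch on the representation: take the O(1) shortcut when the
  ;; underlying structure supports it, else fall back to walking.
  (typecase data
    (vector (aref data index))     ; O(1) shortcut
    (list   (nth index data))      ; O(index) fallback
    (t      (elt data index))))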

Now, such optimization would be difficult if everything is
as dynamic as he described at ILC, but I still think it is
an interesting problem to see how far he can push
in that direction.

> snippets:
> (defun iterate-over-string-with-car-cdr (string)
>   (if (null string)
>      (values)
>      (progn (do-something (car string))
>             (iterate-over-string-with-car-cdr (cdr string)))))
> 
> (defun iterate-over-string-with-numerical-index (string)
>   (dotimes (i (length string))
>     (do-something (nth i string)))
>   (values))
> Proving that these two pieces of code do the "same thing" is quite
> challenging. 

Actually, you gave an interesting example.  Yes, it is
challenging to prove the two do the same thing.  And which way
do you choose to write such code?
If a string is an O(1) array of characters, the latter behaves
better.  However, if a string is internally encoded in UTF-8,
and (cdr string) shares the string body with the original
string instead of copying, then the former runs better --
except if the string is all ASCII, in which case O(1) may win.

I hope the language's bundled library has a generic "fold",
so that I can write this and leave the choice to the
implementation:

  (fold (lambda (char _) (do-something char)) () string)
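
A rough CL sketch of the kind of fold I mean, where the library picks
the traversal to suit the representation (purely hypothetical):

(defun fold (fn seed seq)
  ;; Left fold whose traversal depends on the representation:
  ;; pointer-chasing for lists, indexed access for vectors/strings.
  (etypecase seq
    (list   (let ((acc seed))
              (dolist (x seq acc)
                (setf acc (funcall fn x acc)))))
    (vector (let ((acc seed))
              (dotimes (i (length seq) acc)
                (setf acc (funcall fn (aref seq i) acc)))))))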

> And with a minor tweak:
> 
> (defun do-something-mysterious-with-a-string (string)
>   (dotimes (i (length string))
>     (do-something (nth (f i) string)))
>   (values))

And in this case, the algorithm requires indexed access, so
I would write it that way and see what the profiler says.  In the current
world, if I need the above with a UTF-8 string I first convert
it to a vector, but it'd be nice if the runtime figured
out what is wrong and suggested a correction.

> No matter how fast our machines get, the difference between O(1) and O(n)
> will always be significant.

I agree on this.  The question is whether the language's
core semantics should include it in the spec, or can leave
it to the standard/bundled library and optimization
strategy.

(Actually, if the core semantics doesn't take the difference
into account, you can't tell complexity just by looking at
the code.  I'm not sure how much Paul has considered this matter.)

--shiro
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9eed38.46937708@news.eircom.net>
On Tue, 28 Oct 2003 10:21:43 -0800, ·················@jpl.nasa.gov
(Erann Gat) wrote:

>In article <·················@news.eircom.net>,
>················@eircom.net (Russell Wallace) wrote:
>
>> Nobody is proposing to _implement_ arrays as linked lists.
>
>Actually, my understanding is that that's exactly what he's proposing to do.
>
>Here's a quote from one of Paul's papers:
>
>> Most data structures exist because of speed.  For example, many languages
>> today have both strings and lists.  Semantically, strings are more or less
>> a subset of lists in which the elements are characters.  So why do you
>> need a separate data type? You don't, really.  Strings only exist for
>> efficiency.  But it's lame to clutter up the semantics of the language
>> with hacks to make programs run faster. Having strings in a language seems
>> to be a case of premature optimization. 

Yes, the key part of that is "it's lame to clutter up the _semantics_
of the language" - emphasis added; he's referring to language
semantics, not implementation details.

Anyway, whatever Paul's intent, that's the position I'm defending. I'm
not disputing an implementation will sooner or later need some data
structure that provides constant-time random access (otherwise if
you're handling a billion data points you'll take a billionfold
slowdown, which is too much even on today's hardware). I'm just
disputing that such optimization details need to clutter up the
_semantics_.

>I predict that sooner or later Paul is going to run headlong into the fact
>that lists and arrays, while they appear to be superficially similar, are
>actually fundamentally different.  For example, consider the following
>snippets:
>
>(defun iterate-over-string-with-car-cdr (string)
>  (if (null string)
>     (values)
>     (progn (do-something (car string))
>            (iterate-over-string-with-car-cdr (cdr string)))))
>
>(defun iterate-over-string-with-numerical-index (string)
>  (dotimes (i (length string))
>    (do-something (nth i string)))
>  (values))
>
>Proving that these two pieces of code do the "same thing" is quite
>challenging.

But recognizing them as common idioms for the same thing is
straightforward. However, there's an even simpler solution: Where the
compiler can't prove in advance what the access method will be, just
have CDR check at run time and convert the data structure into a
linked list if necessary, and NTH check at run time and convert the
data structure into an array if necessary. Then both those functions
will run in O(N) time rather than O(N^2) for all values of STRING.
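
A toy sketch of what I mean (hypothetical, and ignoring the sharing
and consing issues raised elsewhere in this thread):

;; A sequence that switches its internal representation the first time
;; it sees a given access pattern.
(defstruct aseq rep)                  ; REP is either a list or a vector

(defun aseq-nth (s i)
  (let ((r (aseq-rep s)))
    (when (listp r)                   ; random access requested:
      (setf r (coerce r 'vector)      ; convert to an array once
            (aseq-rep s) r))
    (aref r i)))

(defun aseq-cdr (s)
  (let ((r (aseq-rep s)))
    (when (vectorp r)                 ; sequential access requested:
      (setf r (coerce r 'list)        ; convert to a linked list once
            (aseq-rep s) r))
    (make-aseq :rep (cdr r))))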

>(The difference between CAR/CDR and AREF is precisely the difference
>between sequential and random access memory, Turing Machines and Random
>Access Machines.  There's a reason bubble memory never caught on.)

Bubble memory? I thought that was random access, just could never
match hard disks for bytes/$?

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <gat-2810031454160001@k-137-79-50-101.jpl.nasa.gov>
In article <·················@news.eircom.net>,
················@eircom.net (Russell Wallace) wrote:

> >(defun iterate-over-string-with-car-cdr (string)
> >  (if (null string)
> >     (values)
> >     (progn (do-something (car string))
> >            (iterate-over-string-with-car-cdr (cdr string)))))
> >
> >(defun iterate-over-string-with-numerical-index (string)
> >  (dotimes (i (length string))
> >    (do-something (nth i string)))
> >  (values))
> >
> >Proving that these two pieces of code do the "same thing" is quite
> >challenging.
> 
> But recognizing them as common idioms for the same thing is
> straightforward.

No, it isn't.  That's the whole point.  It's straightforward for a *human*
to look at *those* two pieces of code and recognize them as the same.  It
is much harder to write a program that will recognize them as the same,
and harder still (impossible actually) for either humans or programs to
recognize such equivalences in general.

> However, there's an even simpler solution: Where the
> compiler can't prove in advance what the access method will be

You're missing the point.  The compiler doesn't "prove in advance what the
access method will be".  "The access method" (whatever that means) is
*given* to the compiler.  In other words, the compiler is going to be
given code that either contains calls to CAR and CDR, or calls to
AREF/NTH, or maybe both.  I think what you're really saying is that you
will rely on the compiler to figure out that even though you've written
code in terms of CAR/CDR it can rewrite that into equivalent code that
uses AREF and change the running time from O(n) to O(1).  And what I'm
saying is that no, that is not possible, except perhaps in a few special
cases.

> Yes, the key part of that is "it's lame to clutter up the _semantics_
> of the language" - emphasis added; he's referring to language
> semantics, not implementation details.

Yes, and my point is that if you are giving an operational semantics, as
Paul seems to want to do, then it is no longer possible to make this
distinction.  Transforming an operational semantics given in terms of
CAR/CDR into an equivalent implementation in terms of AREF is AI-complete
at best, and undecidable at worst.

E.
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9f11aa.56269040@news.eircom.net>
On Tue, 28 Oct 2003 14:54:16 -0800, ···@jpl.nasa.gov (Erann Gat)
wrote:

>I think what you're really saying is that you
>will rely on the compiler to figure out that even though you've written
>code in terms of CAR/CDR it can rewrite that into equivalent code that
>uses AREF and change the running time from O(n) to O(1).

No, I'm saying the programmer should be free to use either CDR or
AREF, whichever he thinks is more appropriate to the task at hand, and
the compiler or runtime environment can note which is actually being
used, and choose the most efficient data structure accordingly.

>Yes, and my point is that if you are giving an operational semantics, as
>Paul seems to want to do, then it is no longer possible to make this
>distinction.  Transforming an operational semantics given in terms of
>CAR/CDR into an equivalent implementation in terms of AREF is AI-complete
>at best, and undecidable at worst.

Maybe. It's been awhile since I looked at the writeups on Arc, and it
still seemed to be very much a work in progress at that point. I
suppose I agree with you that if you take the approach of "let the
compiler worry about efficiency" (IMO a good idea as far as it goes),
you have to keep the semantics clean; if you expose too much of the
guts of the implementation, high-level optimization will be very hard
to do. I don't know whether Paul is doing that; if he is, then yes, it
could be a problem.

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Aurélien Campéas
Subject: Re: My take on ARC
Date: 
Message-ID: <pan.2003.10.30.22.17.12.597670@wanadoo.fr>
On Tue, 28 Oct 2003 14:54:16 -0800, Erann Gat wrote:

> I think what you're really saying is that you
> will rely on the compiler to figure out that even though you've written
> code in terms of CAR/CDR it can rewrite that into equivalent code that
> uses AREF and change the running time from O(n) to O(1).  And what I'm
> saying is that no, that is not possible, except perhaps in a few special
> cases.

And what if your lists are actually implemented as dynamically resizing
arrays, just like in Python ? There is no more hint to give to the
compiler and you can CAR/CDR and AREF as much as you want, all O(1).

Aurélien.
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <gat-3010031528250001@k-137-79-50-101.jpl.nasa.gov>
In article <······························@wanadoo.fr>,
=?iso-8859-1?q?Aur=E9lien_Camp=E9as?= <···········@wanadoo.fr> wrote:

> On Tue, 28 Oct 2003 14:54:16 -0800, Erann Gat wrote:
> 
> > I think what you're really saying is that you
> > will rely on the compiler to figure out that even though you've written
> > code in terms of CAR/CDR it can rewrite that into equivalent code that
> > uses AREF and change the running time from O(n) to O(1).  And what I'm
> > saying is that no, that is not possible, except perhaps in a few special
> > cases.
> 
> And what if your lists are actually implemented as dynamically resizing
> arrays, just like in Python ? There is no more hint to give to the
> compiler and you can CAR/CDR and AREF as much as you want, all O(1).

That doesn't work because you can't take the CDR of a dynamically resizing
array.  The best you can do is make a displaced array corresponding to all
the elements but the first, but that has two problems:

1.  It conses.

2.  It makes vector-push-extend potentially very expensive, because if you
extend an array past its memory allocation and you have to move it then
you also have to hunt down all the arrays that are displaced to it and
move them as well.  (Well, maybe not.  Maybe you can leave a forwarding
pointer, but then AREF isn't constant time any more.)
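
Concretely, about the best you can do in CL is something like this
(a sketch; VECTOR-CDR is just a name I'm making up here):

;; "CDR" of a vector via a displaced array: it shares storage with V,
;; but every call conses a fresh array header (problem 1 above), and
;; displacement interacts badly with adjustable arrays and
;; VECTOR-PUSH-EXTEND (problem 2 above).
(defun vector-cdr (v)
  (make-array (1- (length v))
              :element-type (array-element-type v)
              :displaced-to v
              :displaced-index-offset 1))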

There is no simple way to make this work out, or at least, no one knows of
one.  If you come up with one that really works you'll be famous.

E.

P.S.  That actually brings up an interesting CL question:  Is there a way
to make a displaced array share the fill pointer of the array it's
displaced to?
From: Paul Dietz
Subject: Re: My take on ARC
Date: 
Message-ID: <3FA1933B.B457F555@motorola.com>
Aurélien Campéas wrote:

> And what if your lists are actually implemented as dynamically resizing
> arrays, just like in Python ? There is no more hint to give to the
> compiler and you can CAR/CDR and AREF as much as you want, all O(1).


But then you can't CONS in O(1).

	Paul
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: My take on ARC
Date: 
Message-ID: <pan.2003.10.30.22.52.04.562549@knm.org.pl>
On Thu, 30 Oct 2003 23:17:13 +0100, Aurélien Campéas wrote:

> And what if your lists are actually implemented as dynamically resizing
> arrays, just like in Python ? There is no more hint to give to the
> compiler and you can CAR/CDR and AREF as much as you want, all O(1).

No, they don't give non-destructive CDR nor CONS in O(1).

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Aurélien Campéas
Subject: Re: My take on ARC
Date: 
Message-ID: <pan.2003.10.30.23.18.56.859424@wanadoo.fr>
On Thu, 30 Oct 2003 23:52:04 +0100, Marcin 'Qrczak' Kowalczyk wrote:

> On Thu, 30 Oct 2003 23:17:13 +0100, Aurélien Campéas wrote:
> 
>> And what if your lists are actually implemented as dynamically resizing
>> arrays, just like in Python ? There is no more hint to give to the
>> compiler and you can CAR/CDR and AREF as much as you want, all O(1).
> 
> No, they don't give non-destructive CDR nor CONS in O(1).

Adding an element at the end is O(1). You can look at your vector
backwards. 

And what about deques?

Hmmm, just checking: the C++/STL documentation at SGI claims they have
vector-like deques, i.e. constant (amortized?) time random access, whereas
Knuth speaks of them as a kind of linear list (maybe his text is a little
outdated?).

It may be more tricky to implement CDR indeed (I am thinking about it
right now ...), but would you say impossible ?

hmmm, sorry for my ignorance, but what is a non-destructive CDR anyway ?
(as opposed to a destructive one ?)...

Aurélien.
From: Paul F. Dietz
Subject: Re: My take on ARC
Date: 
Message-ID: <bJGdnXTgQMaMMzyiRVn-gg@dls.net>
Aurélien Campéas wrote:

>>No, they don't give non-destructive CDR nor CONS in O(1).
> 
> 
> Adding an element at the end is O(1). You can look at your vector
> backwards. 

Cons cells mean the lists can share structure.

(setq x (list 'x 'y 'z))
(setq y (cons 'a x))
(setq z (cons 'b x))
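;; Now Y is (A X Y Z) and Z is (B X Y Z); (CDR Y), (CDR Z), and X are
;; all EQ -- the tail is shared, and nothing was copied.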

If you do that with vectors, you have to copy a vector at some point.

	Paul
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: My take on ARC
Date: 
Message-ID: <pan.2003.10.31.00.09.00.500700@knm.org.pl>
On Fri, 31 Oct 2003 00:18:57 +0100, Aurélien Campéas wrote:

>> No, they don't give non-destructive CDR nor CONS in O(1).
> 
> Adding an element at the end is O(1). You can look at your vector
> backwards. 

You can't compute (cons x l) and (cons y l) and have both results
available at the same time (without copying the whole l).

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Ray Dillinger
Subject: Re: My take on ARC
Date: 
Message-ID: <3FA0A8D5.1ABF7B6B@sonic.net>
Russell Wallace wrote:
>  
> Users don't care about machine efficiency. Most programmers don't care
> about it either, and they're right. Saving clock cycles won't put a
> penny in your paycheck. Saving development time will.
> 
> Us hardcore geeks are the ones who care about machine efficiency, and
> that's because we grew up having to squeeze things into 16k of RAM;
> fettered limbs grow lame. It's hard to overcome this sort of mental
> crippling, but let's at least recognize it for what it is: a mental
> illness induced by childhood trauma.

It's sort of like a martial-arts thing: having the discipline and
dedication and skill and using it, you can do profoundly amazing
things, things regarded as impossible by the common soldier. The 
discipline is good for your head and your skills and may be valued,
in and of itself as a discipline to be pursued for self-development.

But in a modern war, the common soldier in a big, resource-hungry, 
expensive tank, does not need to worry too much about an opposing 
martial artist.  Especially if the opposing martial artist is outside
the tank.  Ninjas can no longer use the cover of darkness when infrared
scanners and SODAR motion-detectors scan the perimeters.

And that's where we are today; there are still those of us who care
about crafting beautiful, maximally-efficient code, more now as an 
artform than a way of earning money -- But nobody wants to train us
up any more; we have to train ourselves.  And the major markets 
have gone to mass-production by generals of capital commanding "soldier" 
type coders who are mostly interchangeable and able to use and tolerate 
common tools regardless of how crappy the end product they produce 
may be.  

At some point, you have to examine your direction and decide 
whether you are pursuing a disciplined artform for pleasure and 
self-development or a business for money.  If you decide that you 
are a code artist, cool!  Let's assault the mysteries of parsing
type-0 languages efficiently, see how much cool stuff we can get
in under 8k of machine code, or write compilers for new languages 
just because we feel like we have some cool language ideas we want 
to try.  And if you decide that you are a software soldier or 
businessman, then, well, I hope you make a lot of money and don't 
get too bored doing the same tedious generic crap day after day.

As for myself, I'm both to some extent - but as years go on, my 
code-fu becomes more important to me personally. Coding is still 
a decent business, but family, lovers, and friends are all more
important than business.  But coding as art is a passion, on the
same level as these other great focuses in my life.

				Bear
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3fa12e37.194670496@news.eircom.net>
On Thu, 30 Oct 2003 05:56:05 GMT, Ray Dillinger <····@sonic.net>
wrote:

>But in a modern war, the common soldier in a big, resource-hungry, 
>expensive tank, does not need to worry too much about an opposing 
>martial artist.  Especially if the opposing martial artist is outside
>the tank.  Ninjas can no longer use the cover of darkness when infrared
>scanners and SODAR motion-detectors and scan the perimeters.

Heh, not a bad analogy! However...

>And that's where we are today; there are still those of us who care
>about crafting beautiful, maximally-efficient code, more now as an 
>artform than a way of earning money

There is indeed a certain beauty in code that's near optimal... but
optimal by what criterion? Machine resources is one, of course, but
there are others; and these days I find the criterion I'm interested
in is the ratio of expressive power and flexibility to intellectual
complexity. And it's hard to make code both flexible and elegant if
you're continually inserting kludges to save a few bytes here and
there.

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Conrad Barski
Subject: Re: My take on ARC
Date: 
Message-ID: <a4af10cf.0310201512.686a8516@posting.google.com>
> My personal take on all this is that the quest for minimalism is a
> quixotic one because it invariably leads you to the lambda calculus. 

Well put.  I'm hoping this isn't what he is trying to do.  Essentially
what you're saying is that McCarthy's Big Seven (plus some basic
types) is simply one way to take lambda calculus and add on top of it
to find a "sweet spot" that balances expressivity and minimalism. What
I think PG is doing is just trying to find a new "sweet spot" that
takes into account the last few decades of programming developments
and simply presents a new "special formula" that leads to some humble
improvements over the original formula for common programming idioms.
I'm no language expert, but trying to find a "true minimalism" does
seem to be impossible in a case such as this.

> So far I see little in ARC that is particularly interesting
> linguistically.

That may be true.  But if he even finds ways to improve the linguistics
by 5%, and can do it in 100 different subtle ways, the net benefit may
(or may not) be worth the trouble.

> Nonetheless, it may still draw people to Lisp just
> because it's Paul's thing, and his charisma could easily carry the day. 
> That would be a Good Thing.

Although I hate to say it (because it sounds shallow), I think that the
one thing that the Lisp community could really benefit from is a
rebranding.  The simple fact is that when I talk to any programmers
here in Minneapolis, the moment the word "Lisp" crosses my mouth,
anything else I say is automatically ignored.  (At least I hope it's the
word "Lisp" :)

This is because programmers already carry subconscious prejudices
against it:
     1. "Boy, that lisp course I took in college sure sucked"
     2. "It's been around for decades and nobody uses it? It must
suck."
     3. "Lisp is slow. That sucks."

IMHO something as silly as a rebranding of Lisp could make a HUGE
difference in the adoption of Lisp, and ARC is the best candidate I
see for this right now.

> One of the premises of ARC's design is that speed doesn't matter because
> today's machines are so fast.  I don't accept this premise.  Speed only
> doesn't matter if you're not doing production work.

True. But this is why he uses the term "100 year language" in his
articles. (I'm sure you've read the article with that name on his
website.) He admits that it is horribly inefficient by design and that
his core language may often only be usable as a prototyping tool for a
long, long time. But he is trying to do at least some things to try
and shorten the wait.

> However, Paul's method for
> writing the langauge specification is to write what is essentially a
> reference implementation.

...Which I agree will raise a lot of the eloquently-stated
difficulties you mention in your post.
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031739410001@k-137-79-50-101.jpl.nasa.gov>
In article <····························@posting.google.com>,
·····················@yahoo.com (Conrad Barski) wrote:

> Although I hate to say it (because it sounds shallow) I think that the
> one thing that the Lisp community could really benefit from is a
> rebranding.  The simple fact is that when I talk to any programmers
> here in minneapolis the moment the word "Lisp" crosses my mouth
> anything else I say automatically ignored. (At least I hope it's the
> word "Lisp" :)
> 
> This is because programmers already carry subconscious predjudices
> against it:
>      1. "Boy that lisp course it took in college sure sucked"
>      2. "It's been around for decades and nobody uses it? It must
> suck."
>      3. "Lisp is slow. That sucks."
> 
> IMHO something as silly as a rebranding of Lisp could make a HUGE
> difference in the adoption of Lisp, and ARC is the best candidate I
> see for this right now.

Yes, I absolutely agree with this.

E.
From: Daniel Barlow
Subject: Re: My take on ARC
Date: 
Message-ID: <87ptgrpgzf.fsf@noetbook.telent.net>
·····················@yahoo.com (Conrad Barski) writes:

> rebranding. The simple fact is that when I talk to any programmers
> here in minneapolis the moment the word "Lisp" crosses my mouth
> anything else I say automatically ignored. (At least I hope it's the
> word "Lisp" :)
>
> This is because programmers already carry subconscious predjudices
> against it:
>      1. "Boy that lisp course it took in college sure sucked"
>      2. "It's been around for decades and nobody uses it? It must
> suck."
>      3. "Lisp is slow. That sucks."

I disagree.  A few years ago Franz (I think it was; someone will
correct me if I'm wrong) managed to remove the L word from any
prominent place on their web site and publicity material, and I don't
observe that it led to any immediate resurgence in Lisp popularity or
in the fortunes of Franz.  And Scheme isn't called Lisp either ...

I think that for every programmer who has an irrational objection to
Lisp another can be found who has it on their (equally as irrational)
"little-known => cool" or "retrocomputing => cool" list.  It's just
like the emacs vs vi war: emacs won't win over any vi users by
renaming itself.


-dan

-- 

   http://web.metacircles.com/cirCLe_CD - Free Software Lisp/Linux distro
From: Fred Gilham
Subject: Re: My take on ARC
Date: 
Message-ID: <u7ekx6y955.fsf@snapdragon.csl.sri.com>
> I disagree.  A few years ago Franz (I think it was; someone will
> correct me if I'm wrong) managed to remove the L word from any
> prominent place on their web site and publicity material, and I
> don't observe that it led to any immediate resurgence in Lisp
> popularity or in the fortunes of Franz.

Aren't you thinking of Harlequin when they became Xanalys?  I
specifically remember seeing their announcement and looking on their
web page, and complaining in this group that they didn't say anything
about lisp.  I think I made the rather caustic remark that if they
didn't want to sell lisp I didn't see why we should buy it from them.

(They certainly seem willing to sell lisp these days....)

-- 
Fred Gilham                                         ······@csl.sri.com
There are men in all ages who mean to govern well, but they mean to
govern. They promise to be good masters, but they mean to be masters.
                                                  --- Daniel Webster
From: Duane Rettig
Subject: Re: My take on ARC
Date: 
Message-ID: <4y8ves2mk.fsf@beta.franz.com>
Fred Gilham <······@snapdragon.csl.sri.com> writes:

> > I disagree.  A few years ago Franz (I think it was; someone will
> > correct me if I'm wrong) managed to remove the L word from any
> > prominent place on their web site and publicity material, and I
> > don't observe that it led to any immediate resurgence in Lisp
> > popularity or in the fortunes of Franz.
> 
> Aren't you thinking of Harlequin when they became Xanalys?  I
> specifically remember seeing their announcement and looking on their
> web page, and complaining in this group that they didn't say anything
> about lisp.  I think I made the rather caustic remark that if they
> didn't want to sell lisp I didn't see why we should buy it from them.
> 
> (They certainly seem willing to sell lisp these days....)

You may be right, in that all vendors of Lisps had to at least consider
that the "L" name was the problem.  But I believe that Daniel's remark
was based on something I have said in the past, where in the dark days
of the AI winter in mid-'90s, some executives at Franz (who, interestingly,
are no longer at Franz) insisted that "Lisp" be taken out of all of our
marketing and sales material, and other things substituted.  We tried
CLOS (which was, after all, a "new language", not containing the L word),
and we tried inventing new terms such as "Dynamic Objects", which was
promptly lifted by other parts of the industry and given different
meanings.

I would estimate that of those of us in the development group, we were
roughly divided evenly as to whether the removal of the lisp name was
a good approach - the views ranged from wholehearted support from some
(who are also, interestingly, no longer with Franz) to skepticism but
acceptance through desperation, to out-and-out seething anger at the
bastardization of the language and its name.  I fit in somewhere
between the second and third category - it disgusted me that people
were trying anything to sell a rose by any other name, but I was open
to the prospect of keeping my company afloat and having a paycheck.

In the long term, it is not clear if the attempt did anything or not.
We survived the AI winter, either because of or despite the name-hiding
we tried.  Another interesting development took place that helped us
to convince ourselves that the hiding wasn't the right approach; it
was the birth and stalling of Dylan - Dylan was poised to fill the 
hole in the market that Java eventually filled.  Dylan's purported
strengths (as opposed to Lisp) were its syntax, which of course everyone
hated, and its name, which had no connection to the "L" word at all.
But Dylan didn't succeed in that marketplace, perhaps because the
developers weren't interested in that, but also because Dylan wasn't
ready for it.  And it was Dylan's (lack of certain) features, and the
fact that it was not sold/marketed, not its name or syntax, that
finally helped us to work out where we needed to be
going with our product; we started beefing up our marketing department,
we started working on connectivity to non-lisp languages, the newly
formed world-wide-web, and we started making free lisps available
(we were the first commercial Lisp vendor to do so, and if you
recall, our first linux lisp, though not opensource, was completely
free and uninhibited).

To bring this back on-topic, I think that if Paul Graham is to succeed
in Arc, it won't be for the name only.  He'll need his language to
be robust and easy to use, but in addition, he will need a market,
and will have to be able to sell to that market, and he will need a
story which tells us all why we should use that language above our
own favorite languages.  And as simple as these criteria seem, they
are far from easy.  If Lisp's disadvantage is that non-lispers get a
bad taste in their mouths due to its name, it also at least has a
substantial market of those who don't get that bad taste, and Arc has
no such market-base yet.  In order to get the non-lisp market, like
those that Dylan was attempting to go after, he'll have to do aggressive
marketing and promotion.  In order to attract the Lisp market base,
however, he will have to walk a fine line: to attract Schemers, the base
language will have to be small and basically functional.  To attract
CL-ers, the language will have to fully allow for macro concepts that
are not bound by pure mathematical concepts, and for both it will have
to be able at some level to reason about its own code and compile to
extremely efficient binary code.  Talking about ignoring efficiency
for the long term is preposterous; any _user_ is allowed to make his
own decisions about whether to consider efficiency or not (and is
perfectly allowed to change his mind based on the requirement of the
project), but a language designer does not have that luxury - if he
decides to please some of the people none of the time, then he will
inevitably please none of the people at least some of the time, and
his language will fail.

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <OJhlb.153550$ox5.1395363@news.easynews.com>
Duane Rettig wrote:

> [Duane's post, quoted in full above, snipped.]

Wow, +5 insightful.

Oh...wait this isn't slashdot... :)

You made some *very* good points.  I'm going to save this post, it's 
probably one of the most clearly articulated posts I've read.

It's really good to hear from an actual honest to goodness language 
developer, rather than just hearing my own ill-formed, half conceived 
notions masquerading as opinions.

Thanks Duane.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Bob Riemenschneider
Subject: Re: My take on ARC
Date: 
Message-ID: <ce15782d.0310211333.d1b61a9@posting.google.com>
Fred Gilham <······@snapdragon.csl.sri.com> wrote in message news:<··············@snapdragon.csl.sri.com>...
> > I disagree.  A few years ago Franz (I think it was; someone will
> > correct me if I'm wrong) managed to remove the L word from any
> > prominent place on their web site and publicity material, ...
> 
> Aren't you thinking of Harlequin when they became Xanalys?  ...

He was probably thinking about when all the Franz promotional stuff
said "Dynamic Objects" and "CLOS" -- with _maybe_ a note that "CLOS"
is an acronym somewhere in the fine print -- rather than "Lisp".

                                                       -- rar
From: Gareth McCaughan
Subject: Re: My take on ARC
Date: 
Message-ID: <87n0bumn64.fsf@g.mccaughan.ntlworld.com>

Fred Gilham wrote:

> > I disagree.  A few years ago Franz (I think it was; someone will
> > correct me if I'm wrong) managed to remove the L word from any
> > prominent place on their web site and publicity material, and I
> > don't observe that it led to any immediate resurgence in Lisp
> > popularity or in the fortunes of Franz.
> 
> Aren't you thinking of Harlequin when they became Xanalys?

Didn't Franz go through a period where it was all "Dynamic
Objects" and occasionally CLOS (very deliberately left
unexpanded, as if CLOS were a completely different language
from Common Lisp)?

-- 
Gareth McCaughan
.sig under construc
From: Daniel Barlow
Subject: Re: My take on ARC
Date: 
Message-ID: <87ptgqnvfx.fsf@noetbook.telent.net>
Gareth McCaughan <·····@g.local> writes:

> Didn't Franz go through a period where it was all "Dynamic
> Objects" and occasionally CLOS (very deliberately left
> unexpanded, as if CLOS were a completely different language
> from Common Lisp)?

That's what I was thinking of, yes; my unreliable memory says it was
sometime around 1997 or 1998.  Either it happened or we both
hallucinated it at once.


-dan

-- 

   http://web.metacircles.com/cirCLe_CD - Free Software Lisp/Linux distro
From: Gareth McCaughan
Subject: Re: My take on ARC
Date: 
Message-ID: <87r817m0wc.fsf@g.mccaughan.ntlworld.com>
Dan Barlow wrote:

> ·····················@yahoo.com (Conrad Barski) writes:
...
> > This is because programmers already carry subconscious predjudices
> > against it:
> >      1. "Boy that lisp course it took in college sure sucked"
> >      2. "It's been around for decades and nobody uses it? It must
> >         suck."
> >      3. "Lisp is slow. That sucks."
> 
> I disagree.  A few years ago Franz (I think it was; someone will
> correct me if I'm wrong) managed to remove the L word from any
> prominent place on their web site and publicity material, and I don't
> observe that it led to any immediate resurgence in Lisp popularity or
> in the fortunes of Franz.  And Scheme isn't called Lisp either ...

It isn't the *name* "Lisp" about which people have those
prejudices. They generally remember barely enough about
the language (lots of parentheses, long function and operator
names, ...) that they can recognize Lisp code when they
see it, even if it's being tagged as "Scheme" or "Dynamic
Object Technology" or something.

So when you say the word "Lisp" that's *enough* to
activate all those prejudices, but unfortunately
there are other less eradicable things that also
activate the prejudices.

-- 
Gareth McCaughan
.sig under construc
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <iV0lb.210344$Pk.32131@news.easynews.com>
Conrad Barski wrote:

> Although I hate to say it (because it sounds shallow) I think that the
> one thing that the Lisp community could really benefit from is a
> rebranding. The simple fact is that when I talk to any programmers
> here in minneapolis the moment the word "Lisp" crosses my mouth
> anything else I say automatically ignored. (At least I hope it's the
> word "Lisp" :)
>
I've experienced this as well.  It only sounds shallow because most 
technical people think that technical merit alone is enough to carry the 
day.  However, technical people above all others should realize things 
seldom work this way.  More often than not, good enough with good 
marketing is what carries the day.  Microsoft, Unix, C, C++, 
Python... are not all of these inferior technologies more popular than 
Lisp currently is?  Good technology wasn't enough to save Lisp Machines 
from extinction.

Paul stated that he has a good marketing head while we were in 
conversation, and after listening to some of his ideas I think he is 
right.  I'm not going to reveal the ideas because I think they would be 
more effective with a good implementation behind them. :)

> This is because programmers already carry subconscious predjudices
> against it:
>      1. "Boy that lisp course it took in college sure sucked"
>      2. "It's been around for decades and nobody uses it? It must
> suck."
>      3. "Lisp is slow. That sucks."
> 
> IMHO something as silly as a rebranding of Lisp could make a HUGE
> difference in the adoption of Lisp, and ARC is the best candidate I
> see for this right now.
> 
Rebranding is generally pretty successful.  It worked well for Delphi. 
It worked well for Linux.  The power of the brand is almost 
incomprehensible to most technical people that I know.  However, if you 
look at how a significant number of purchasing decisions are made, it is 
based on brand image.

One gripe I have is with the name Lisp.  I know smart people came up 
with it, but for heaven's sake, why name a programming language after a 
speech impediment?  Seriously, I get that joke almost every time I bring 
up Lisp to someone.  If you are seriously looking at branding... well, 
Lisp isn't a good brand name.
> 
>>One of the premises of ARC's design is that speed doesn't matter because
>>today's machines are so fast.  I don't accept this premise.  Speed only
>>doesn't matter if you're not doing production work.
> 
> 
> True. But this is why he uses the term "100 year language" in his
> articles. (I'm sure you;ve read the article with that name on his
> website.) He admits that it is horribly inefficient by design and that
> his core language may often only be usable as a prototyping tool for a
> long, long time. But he is trying to do at least some things to try
> and shorten the wait.
> 
I do think Rainer brought up some excellent points about speed.  People 
really do avoid a language that is perceived to be slow.  However, if he 
is going for an incremental improvement rather than a drastic redefinition 
of core concepts, there are a lot of sources to draw from.

As I got to thinking about it, people don't compare Lisp on a 2 GHz machine 
to Lisp on a 200 MHz machine.  They compare Lisp to C on the same machine.  
When it comes to programming languages, I think being excessively slow could 
be an image killer.

Hopefully we'll get the best of all worlds:

An incremental improvement in Lisp, with a more consistent overall design.
Fast native code compilation
Good cross platform portability
Good web programming facilities
Good native environment hooks (threads, sockets, etc)
Good database integration
Incremental compilation
Highly optimized and super fast garbage collection
Typical great Lisp development environment
Good cross platform gui code (maybe like Lispworks CAPI or better)
A serious language that will endure through the ages, and can adapt like 
a Lisp.
Powerful Macro System
Good implementation of call/cc

hmm...can you think of anything else you'd want in a programming 
language/environment? :)

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Rob Warnock
Subject: Re: My take on ARC
Date: 
Message-ID: <htadnRQIPd4BUAmiXTWc-w@speakeasy.net>
Doug Tolton  <····@nospam.com> wrote:
+---------------
| One gripe I have is with the name Lisp.  I know smart people came up 
| with, but for heavens sake, why name a programming language after a 
| speech impediment?  Seriously I get that joke almost everytime I bring 
| up Lisp to someone.
+---------------

What, you think something like SEXPR would be better?!?  ;-}  ;-}


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Pascal Bourguignon
Subject: Re: My take on ARC
Date: 
Message-ID: <87n0bvdzic.fsf@thalassa.informatimago.com>
Doug Tolton <····@nospam.com> writes:
> One gripe I have is with the name Lisp.  I know smart people came up
> with, but for heavens sake, why name a programming language after a
> speech impediment?  Seriously I get that joke almost everytime I bring
> up Lisp to someone.  If you are seriously looking at branding...well
> Lisp isn't a good brand name.

Non-native English speakers are quite oblivious to this...  I don't see
that there are more Lisp programmers in non-English-speaking countries, though.

-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
Do not adjust your mind, there is a fault in reality.
Lying for having sex or lying for making war?  Trust US presidents :-(
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <AK3lb.14856$pT1.12403@twister.nyc.rr.com>
Doug Tolton wrote:
> Conrad Barski wrote:
> 
>> Although I hate to say it (because it sounds shallow) I think that the
>> one thing that the Lisp community could really benefit from is a
>> rebranding. ....

> Paul stated that he has a good marketing head while we were in 
> conversation, and after listening to some of his ideas I think he is 
> right.  I'm not going to reveal the ideas because I think they would be 
> more effective with a good implementation behind them. :)
> 
>> This is because programmers already carry subconscious predjudices
>> against it:
>>      1. "Boy that lisp course it took in college sure sucked"
>>      2. "It's been around for decades and nobody uses it? It must
>> suck."
>>      3. "Lisp is slow. That sucks."
>>
>> IMHO something as silly as a rebranding of Lisp could make a HUGE
>> difference in the adoption of Lisp, and ARC is the best candidate I
>> see for this right now.
>>
> Rebranding is generally pretty successful.  It worked well for Delphi. 
> It worked well for Linux.  The power of the brand is almost 
> incomprehensible to most technical people that I know.  However if you 
> look at how a significant amount of purchasing decisions are made it is 
> based on brand image.
> 
> One gripe I have is with the name Lisp.  I know smart people came up 
> with it, but for heaven's sake, why name a programming language after a 
> speech impediment?  Seriously, I get that joke almost every time I bring 
> up Lisp to someone.  If you are seriously looking at branding...well 
> Lisp isn't a good brand name.

Jeez! Erann, Paul, Conrad, Doug...you talk so much about branding. I do 
not think that word means what you think it means. How about "there is 
no such thing as bad publicity"? Thalidomide has been resurrected for 
cancer treatment. What do they call it now? Thalomid. Apologies for 
another sad example, but a representative of the company selling Ayds 
(the OTC whatever remedy) said the epidemic of the same name had been no 
problem for them.

Someone else has already correctly noted that the stigma is as much an 
accelerant as it is an extinguisher; when we cross a very low threshold, 
the fact that we are a dusty, forty-seven year old language will be cool 
beyond belief. Then they notice we are native-compiled and have a nice 
thick standard and a ton of vendors and free versions and... game over.

As for Peter and Erann straining mightily to escape the stigma of cons, 
I don't mean to get all grandiose here, but I cannot help but think of 
another Peter denying Christ three times before dawn. Would you all 
please get a grip and meditate for thirty days and thirty nights on how 
great Lisp is before opening your big yaps again?! Erann "Mr Negative" 
Gat to his credit had to laugh at ILC2003 when I made fun of the idea 
that what Lisp has to do is get better (this not on branding, but on 
handling package issues differently). ("Yeah, that's our problem, Lisp 
isn't good enough.")

Read my lips: the problem is not with Lisp!!!!!!! Jesus H Christ!!! This 
crap is pathetic beyond belief, the Stockholm Syndrome all over again. 
The fact is that /you/ have the problem. You do not really believe Lisp 
is better. If you did, you would shut the fuck up and get to work on 
some libraries, which is what we really need. Peter is exempted because 
of the book. :)

Damn! Even while posting fifty messages today I managed to DL and build 
Common Lisp Music, in spite of three problems with the DL. My goddamn 
laptop now plays what to my ear sounds like a pretty sweet violin note, 
from scratch!

I did this because I meet tomorrow with an honest-to-god composer who is 
ready to get serious about Symbolic Composer, along with a couple of 
lispnyks. This may cause a little Lisp breakout in the land of music, 
and if so look out.

Talk is cheap. What did you chin-pulling, Lisp-denying, cons-apologizing 
guys /do/ today? Did you help Marc with cl-typesetting? Me with OpenGL 
or RoboCup or music software? Did you work on /any/ library?

You know, according to my survey, the biggest reason for Lisp's revival 
is Paul Graham, who Won Big and then gave Lisp a lot of the credit. And 
if you go back and read what he wrote, some of you branding geniuses 
will discover that in his articles he refers to Lisp by the fucking 
secret code name Lisp!!!! AFLAC!!!!

Meanwhile I have another project possibly getting underway, someone from 
the Linux camp. I am starting to think that Linux is doing well because 
Linux folks have an interesting quality: they actually do things.

<sigh> I am /so/ tired of propping youse guys spirits up. Do me a favor, 
  switch to Python. Leave Lisp to the real Lispniks.

We return you now to your regularly-scheduled self-loathing.

kenny

ps. It /is/ funny that we all got stuck with a silly name to which they 
probably gave about two seconds of thought.

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcvad7vrqou.fsf@famine.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> I did this because I meet tomorrow with an honest-to-god composer who is 
> ready to get serious about Symbolic Composer, along with a couple of 
> lispnyks. This may cause a little Lisp breakout in the land of music, 
> and if so look out.

Yow, lemme know how it goes.  For the record, there are guerrilla units
in logic[*] and bioinformatics -- I know about Larry Hunter, but I'm
talking about the other 1/2 of bioinformatics.

> cl-typesetting?

The grandson of a lead-type setter was the first to respond to his call
for assistance; I'm proud of myself :-)

> Me with OpenGL

Are you going to kill me when I announce Medalion?  It's Cells + CLX.
It's for my own use / Hemlock's own use.  Maybe I'll wait until I port
it to QuickDraw...

> Meanwhile I have another project possibly getting underway, someone from 
> the Linux camp. I am starting to think that Linux is doing well because 
> Linux folks have an interesting quality: they actually do things.

Hmm, what's this?  And for the record, it's *unix* folks, not linux
folks.  If you have a bad impression about solaris or freebsd or
darwin/OS-X folks, it's undeserved.

> <sigh> I am /so/ tired of propping youse guys spirits up. Do me a favor, 
>   switch to Python. Leave Lisp to the real Lispniks.

I recommend leaving them alone, or killfiling them.  We don't need
another Naggum or Pitman or ... leaving the net-world.  "I'm only
retiring from usenet."  Okay, Mr. Barlow, Mr. Bradshaw, Mr. Mai, you
only come back when dragged.  You can reduce your usenet intake (viz.
Dave Bakash, Duane Rettig), but beware, "retiring" from usenet means a
general disappearance.  Not that you were saying that; it's just that usenet is
where people came pre-AOL, post-AOL, ..., and they're still coming
from their PacBell-Yahoo accounts.

> ps. It /is/ funny that we all got stuck with a silly name to which they 
> probably gave about two seconds of thought.

When I bought "ANSI Common Lisp" by Graham, the chic who sold me the
book was dying laughing, because, c'mon American National Standard
Common Lisp.  We're actually friends now.

[*] Remember McCarthy's 30 year old jab at temporal logic people?  I
confirmed that temporal logic was born 30 years ago, and got many
laughs from repeating his anecdote.

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <008lb.15597$pT1.2391@twister.nyc.rr.com>
Thomas F. Burdick wrote:


>>Me with OpenGL
> 
> 
> Are you going to kill me when I announce Medalion?  It's Cells + CLX.

Cells in the generic sense or my Cells? If the latter, a major rewrite 
is on the way that should make bleeding edge applications much more 
tractable. You'll hear it here first.

Re Hemlock, want that Cells+OpenGL object inspector? You could readily 
port to CLX, or that could be a contract to get the OpenGL working on OS 
X and MCL or Lispworks. (Not sure what the ACL IDE (if any) is like on 
OS X.)

> It's for my own use / Hemlock's own use.  Maybe I'll wait until I port
> it to QuickDraw...

You mean Aqua or some other code word with no mnemonic value, right?

> 
> 
>>Meanwhile I have another project possibly getting underway, someone from 
>>the Linux camp. I am starting to think that Linux is doing well because 
>>Linux folks have an interesting quality: they actually do things.
> 
> 
> Hmm, what's this?  And for the record, it's *unix* folks, not linux
> folks.  

I probably meant "open software folks".

kenny


-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcvznfuqepp.fsf@famine.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> Thomas F. Burdick wrote:
> 
> 
> >>Me with OpenGL
> > 
> > 
> > Are you going to kill me when I announce Medalion?  It's Cells + CLX.
> 
> Cells in the generic sense or my Cells? If the latter, a major rewrite 
> is on the way that should make bleeding edge applications much more 
> tractable. You'll hear it here first.

Cells as in your code that I grabbed from robocells.  Performance
improvements are good :)

> > It's for my own use / Hemlock's own use.  Maybe I'll wait until I port
> > it to QuickDraw...
> 
> You mean Aqua or some other code word with no mnemonic value, right?

The eventual plan is to try to get a double-clickable SBCL.app that
starts up with Hemlock.  Not wanting to throw Lisp<->ObjC into the mix
too, I'm gonna stick to the Macintosh Toolkit.  So, QuickDraw :)

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <3F9843E4.5030905@nyc.rr.com>
[also posted to c.l.l.]

Thomas F. Burdick wrote:
> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Thomas F. Burdick wrote:
>>
>>
>>
>>>>Me with OpenGL
>>>
>>>
>>>Are you going to kill me when I announce Medalion?  It's Cells + CLX.
>>
>>Cells in the generic sense or my Cells? If the latter, a major rewrite 
>>is on the way that should make bleeding edge applications much more 
>>tractable. You'll hear it here first.
> 
> 
> Cells as in your code that I grabbed from robocells.  Performance
> improvements are good :)

Delayed reaction: I am guessing this means you have observed a 
performance problem? If so, send deets, I might have some thoughts. 
Sometimes I leave behind debug stuff that can massively slow the system 
down. And depending on the application, app-level tricks can avoid 
overloading the dependency graph. Also I started Cells For Structs and 
could finish that for a Good Cause.

One interesting thing about Cells is that so much of it is the way it is 
because of performance tuning. Maybe more is possible. Oh, and don't 
forget synapses.

kenny


-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcv8ynbqi0h.fsf@famine.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> Thomas F. Burdick wrote:
>
> > Kenny Tilton <·······@nyc.rr.com> writes:
> >
> > > Cells in the generic sense or my Cells? If the latter, a major rewrite 
> > > is on the way that should make bleeding edge applications much more 
> > > tractable. You'll hear it here first.
> > 
> > Cells as in your code that I grabbed from robocells.  Performance
> > improvements are good :)
> 
> Delayed reaction: I am guessing this means you have observed a 
> performance problem?

No, I haven't pushed Cells hard enough yet to have been able to notice
either way.  Me gotz no reeding comprehnshion, apparently.  Let me try
again: tractable is good :)

I bet you can guess what my main worry is.  FWIW, though, I'm always
nervous about performance -- I *did* come to Lisp because I wanted a
high-performance language.

> One interesting thing about Cells is that so much of it is the way it is 
> because of performance tuning. Maybe more is possible. Oh, and don't 
> forget synapses.

Oh, I haven't -- I think that's the coolest feature you've got in
there (that I've seen).  Specific laziness!

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcv1xt3qh65.fsf@famine.OCF.Berkeley.EDU>
···@famine.OCF.Berkeley.EDU (Thomas F. Burdick) writes:

> I bet you can guess what my main worry is.  FWIW, though, I'm always
> nervous about performance -- I *did* come to Lisp because I wanted a
> high-performance language.

(In my own defense, although I'm always nervous, I'm wise enough not
to do anything about it unless I know I need to.  In fact, just today
I removed two pages of optimizations on a Lisp system for a client.
Before doing so, I said that I could make the code easier to
modify/maintain, at the cost of some rare cases being much slower;
when describing the cases, the response was, "it'll *work* if I tell
it to do that?!?!"  Uh, yeah, but kinda slow, now.)

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Duane Rettig
Subject: Re: My take on ARC
Date: 
Message-ID: <4n0brnigx.fsf@beta.franz.com>
···@famine.OCF.Berkeley.EDU (Thomas F. Burdick) writes:

> ···@famine.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
> 
> > I bet you can guess what my main worry is.  FWIW, though, I'm always
> > nervous about performance -- I *did* come to Lisp because I wanted a
> > high-performance language.
> 
> (In my own defense, although I'm always nervous, I'm wise enough not
> to do anything about it unless I know I need to.  In fact, just today
> I removed two pages of optimizations on a Lisp system for a client.
> Before doing so, I said that I could make the code easier to
> modify/maintain, at the cost of some rare cases being much slower;
> when describing the cases, the response was, "it'll *work* if I tell
> it to do that?!?!"  Uh, yeah, but kinda slow, now.)

Yes, _that's_ what John McCarthy meant at ILC2003 when he put forth
the word "satisficing", coined by Herbert Simon (Nobel Prize, economics) --
contrast satisficing, seeking the minimum satisfactory performance, with
optimizing, where one always seeks the optimum, whatever the cost.
I had always sought such a term, because when we preach against
premature optimization, it sounds like we're also preaching against
speed, and that's definitely not the case.

I had thought when I heard that term that it was new, but Google
places Simon's coinage of the term at 1957, a year before Lisp
was born.  How appropriate is that? ...

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <Wt6mb.35088$pT1.12768@twister.nyc.rr.com>
Thomas F. Burdick wrote:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> 
>>Thomas F. Burdick wrote:
>>
>>
>>>Kenny Tilton <·······@nyc.rr.com> writes:
>>>
>>>
>>>>Cells in the generic sense or my Cells? If the latter, a major rewrite 
>>>>is on the way that should make bleeding edge applications much more 
>>>>tractable. You'll hear it here first.
>>>
>>>Cells as in your code that I grabbed from robocells.  Performance
>>>improvements are good :)
>>
>>Delayed reaction: I am guessing this means you have observed a 
>>performance problem?
> 
> 
> No, I haven't pushed Cells hard enough yet to have been able to notice
> either way.  Me gotz no reeding comprehnshion, apparently.  Let me try
> again: tractable is good :)

That was my second guess, but I did not want to presume.

> 
> I bet you can guess what my main worry is.  FWIW, though, I'm always
> nervous about performance -- I *did* come to Lisp because I wanted a
> high-performance language.

One neat thing is that Cells, on top of making programming easier and 
more correct, also guarantee that only the minimum of work is done to 
process each update. So there is at least some payback for the overhead 
of tracking dependencies. The longer-lived the cell, the greater the 
chance that Cells come out ahead in the long run, in any case where 
hand-coding the minimum update set is, um, intractable.
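
A toy sketch of that bookkeeping -- emphatically not the real Cells API, 
just made-up names (iref, fref and friends) -- showing how recording which 
inputs a formula actually read lets a later read skip recomputation 
entirely when nothing it depends on has changed:

;; Toy dataflow cells -- NOT the real Cells API, just an illustration.
;; An input cell carries a version counter; a formula cell remembers
;; which inputs it read and at what version, so a read recomputes only
;; when something it actually depends on has changed.
(defstruct input value (tick 0))

(defvar *reads* nil)                  ; inputs read during a recompute

(defun iref (in)                      ; read an input, noting the dependency
  (pushnew in *reads*)
  (input-value in))

(defun iset (in new)                  ; write an input, bumping its version
  (incf (input-tick in))
  (setf (input-value in) new))

(defstruct formula fn cache (deps :stale))  ; deps: list of (input . tick-seen)

(defun fref (f)                       ; read a formula, recomputing if stale
  (when (or (eq (formula-deps f) :stale)
            (loop for (in . tick) in (formula-deps f)
                  thereis (/= tick (input-tick in))))
    (let ((*reads* nil))
      (setf (formula-cache f) (funcall (formula-fn f))
            (formula-deps f)  (mapcar (lambda (in) (cons in (input-tick in)))
                                      *reads*))))
  (formula-cache f))

;; (defparameter *a* (make-input :value 1))
;; (defparameter *b* (make-input :value 2))
;; (defparameter *sum* (make-formula :fn (lambda () (+ (iref *a*) (iref *b*)))))
;; (fref *sum*)   ; => 3, computed
;; (fref *sum*)   ; => 3, straight from the cache, no arithmetic done
;; (iset *a* 10)
;; (fref *sum*)   ; => 12, recomputed because *a* changed

The real thing does far more (it propagates eagerly, runs observers, has 
synapses), but even this toy shows where the payback comes from.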

> 
> 
>>One interesting thing about Cells is that so much of it is the way it is 
>>because of performance tuning. Maybe more is possible. Oh, and don't 
>>forget synapses.
> 
> 
> Oh, I haven't -- I think that's the coolest feature you've got in
> there (that I've seen).  Specific laziness!

And (potentially) dynamically self-tuning specific laziness. A 
"sensitivity" synapse can observe run-time performance and functional 
requirements and adjust its threshold to let more/less info thru.

And while you are not forgetting synapses, don't forget that they can 
also do semantic transformations on a datafeed, such as, oh, always 
returning the mean of the value over the last N samples or milliseconds.
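
For instance (not the real synapse API, just a throwaway closure to make 
the idea concrete), a filter that hands downstream the mean of the last N 
samples instead of each raw value:

;; Not the real synapse API -- just a closure modeling a semantic
;; transform on a datafeed: downstream sees the running mean of the
;; last N samples rather than each raw value.
(defun make-mean-filter (n)
  (let ((window '()))
    (lambda (sample)
      (push sample window)
      (when (> (length window) n)
        (setf window (subseq window 0 n)))   ; keep only the newest N
      (/ (reduce #'+ window) (length window)))))

;; (defparameter *smooth* (make-mean-filter 3))
;; (funcall *smooth* 10)    ; => 10
;; (funcall *smooth* 20)    ; => 15
;; (funcall *smooth* 30)    ; => 20
;; (funcall *smooth* 100)   ; => 50, the mean of 20, 30, 100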

Meanwhile I am too intimidated by the massive revision to Just Do It. 
Well, a ton of things going on with the Music SIG of lisp-nyc and 
potentially the first non-Kenny Cells application and Yankee baseball.

Hey, guess what, one of Music SIG members is threatening to help with 
Cello. What do you think of SVG+XML as a browser-friendly backend?

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Thomas F. Burdick
Subject: Re: My take on ARC
Date: 
Message-ID: <xcvy8vapl5v.fsf@famine.OCF.Berkeley.EDU>
Kenny Tilton <·······@nyc.rr.com> writes:

> And while you are not forgetting synapses, don't forget that they can 
> also do semantic transformations on a datafeed, such as, oh, always 
> returning the mean of the value over the last N samples or milliseconds.

Oh, I hadn't thought of that!

> Meanwhile I am too intimidated by the massive revision to Just Do It. 
> Well, a ton of things going on with the Music SIG of lisp-nyc and 
> potentially the first non-Kenny Cells application and Yankee baseball.

(Meanwhile, I'm trying to distract myself from Raiders football)

> Hey, guess what, one of Music SIG members is threatening to help with 
> Cello. What do you think of SVG+XML as a browser-friendly backend?

Both "SVG+XML" and "browser-friendly backend" make me want to cry; but
then, there's a reason I named my web framework "La mort dans l'âme"
(approx. "with a heavy heart").

Lemme try again -- do non-Mozilla browsers actually support SVG?  I
would probably just output to PDF, personally.  They can contain
hyperlinks; just make sure your users don't try using ghostscript or
Preview.app

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | No to Imperialist war |                        
     ,--'    _,'   | Wage class war!       |                        
    /       /      `-----------------------'                        
   (   -.  |                               
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <7Wdmb.37365$pT1.37260@twister.nyc.rr.com>
Thomas F. Burdick wrote:

> Kenny Tilton <·······@nyc.rr.com> writes:
>>Hey, guess what, one of Music SIG members is threatening to help with 
>>Cello. What do you think of SVG+XML as a browser-friendly backend?
> 
> 
> Both "SVG+XML" and "browser-friendly backend" make me want to cry; but
> then, there's a reason I named my web framework "La mort dans l'âme"
> (approx. "with a heavy heart").
> 
> Lemme try again -- do non-Mozilla browsers actually support SVG?  I
> would probably just output to PDF, personally.  

Hmmm. And PDF forms accept data. I wonder if the Adobe PDF browser 
plug-ins handle PDF forms and do something reasonable with the data entry.

Then all I need is a good cl-pdf open source library.

:)

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Marc Battyani
Subject: Re: My take on ARC
Date: 
Message-ID: <bnc3gb$q3b@library1.airnews.net>
"Kenny Tilton" <·······@nyc.rr.com> wrote

[...]
> Then all I need is a good cl-pdf open source library.

This can't exist. The CL community only talks about [lack of, licensing of]
libraries rather than writing them. Another popular subject of discussion is
how CL would be incredibly better than Python, Java, ARC, Scheme, ML, etc. if
we used it to write code.

> :)
:(

pfff, even the parentheses don't match today...

Marc
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <nYgmb.37575$pT1.16935@twister.nyc.rr.com>
Marc Battyani wrote:
> "Kenny Tilton" <·······@nyc.rr.com> wrote
> 
> [...]
> 
>>Then all I need is a good cl-pdf open source library.
> 
> 
> This can't exist. The CL community only talks about [lack of, licensing of]
> libraries rather than writing them. Another popular subject of discussion is
> how CL would be incredibly better than Python, Java, ARC, Scheme, ML, etc. if
> we used it to write code.

I see. Well the good news is that they can switch to Python and enjoy 
instant popularity, since they apparently do not intend actually to 
write any code anyway. :)

Right now the RoboCells starter kit is on a back burner as I focus on 
getting up to speed on (+ Lisp Music). Today I got Common Music working 
to MIDI. I gather it will also use CSound as a backend, which goes 
beyond MIDI to synthesized sound (a WAV, if you will, instead of a 
.MIDI). So it looks as if the Lisp-NYC music SIG will have plenty of 
toys to look at at our next meeting.

But pretty soon I want to overhaul Cells (some fundamental design flaws 
have been flushed out by recent applications, and some chats at ILC2003 
got me thinking and I see some nice improvements) so I can tear into 
Cello, the universal Common Lisp GUI.

Have you looked at PDF Forms? Will a browser process such a beast? Can 
CL-PDF then replace HTML/XML? Jes thinkin out loud.

Congrats on a nice package. Have you considered developing an example 
PDF that follows the style guidelines of a typical academic piece? I 
recall templates being available for Word and FrameMaker and probably 
TeX for the ILC2003 (I think linking back to some other site).

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Marc Battyani
Subject: Re: My take on ARC
Date: 
Message-ID: <bneui3$dav@library1.airnews.net>
"Kenny Tilton" <·······@nyc.rr.com> wrote
> Marc Battyani wrote:
> >
> > This can't exist. The CL community only talks about [lack of, licensing of]
> > libraries rather than writing them. Another popular subject of discussion is
> > how CL would be incredibly better than Python, Java, ARC, Scheme, ML, etc. if
> > we used it to write code.
>
> I see. Well the good news is that they can switch to Python and enjoy
> instant popularity, since they apparently do not intend actually to
> write any code anyway. :)

I even increased the noise myself by replying to one of these posts. :(

> Right now the RoboCells starter kit is on a back burner as I focus on
> getting up to speed on (+ Lisp Music). Today I got Common Music working
> to MIDI. I gather it will also use CSound as a backend, which goes
> beyond MIDI to synthesized sound (a WAV, if you will, instead of a
> .MIDI). So it looks as if the Lisp-NYC music SIG will have plenty of
> toys to look at at our next meeting.

Cool!

> But pretty soon I want to overhaul Cells (some fundamental design flaws
> have been flushed out by recent applications, and some chats at ILC2003
> got me thinking and I see some nice improvements) so I can tear into
> Cello, the universal Common Lisp GUI.
>
> Have you looked at PDF Forms? Will a browser process such a beast? Can
> CL-PDF then replace HTML/XML? Jes thinkin out loud.

PDF forms are not very good IMO. It's better to stick to HTML for this and to
use PDF for the final paper printing.

> Congrats on a nice package. Have you considered developing an example
> PDF that follows the style guidelines of a typical academic piece? I
> recall templates being available for Word and FrameMaker and probably
> TeX for the ILC2003 (I think linking back to some other site).

Not yet. BTW it would be in cl-typesetting not in cl-pdf.

Marc
From: Duane Rettig
Subject: Re: My take on ARC
Date: 
Message-ID: <43cdmtk5u.fsf@beta.franz.com>
···@famine.OCF.Berkeley.EDU (Thomas F. Burdick) writes:

> Kenny Tilton <·······@nyc.rr.com> writes:
> 
> > <sigh> I am /so/ tired of propping youse guys spirits up. Do me a favor, 
> >   switch to Python. Leave Lisp to the real Lispniks.
> 
> I recommend leaving them alone, or killfiling them.  We don't need
> another Naggum or Pitman or ... leaving the net-world.  "I'm only
> retiring from usenet."  Okay, Mr. Barlow, Mr. Bradshaw, Mr. Mai, you
> only come back when dragged.  You can reduce your usenet intake (viz.
> Dave Bakash, Duane Rettig), [ ... ]

No, not reduced intake; reduced output.  I always read every article
(some more carefully than others :-) and I don't have a killfile.  I
do have to bite my tongue more often than I'd like, but I have always
tried to follow my dad's favorite saying: "It is better to keep your
mouth shut and be thought a fool, than to open it and remove all doubt."
(attribution unknown)

Part of my reduced output is also due to one of my other usenet posting
philosophies; I try to post only articles that others aren't likely to
post.  If I respond to a post where the answer is obvious, then chances
are at least one or two are going to do the same thing; that contributes
to either my article or the other guy's article becoming a "me too"
response, and that does neither the other guy nor myself any good, and it
might (if it is a negative response) tend to overwhelm the previous
poster with the sense of being ganged up on.  All this to say that as
comp.lang.lisp becomes more and more populated with people who have
the ability and gumption to answer questions, there is less need for me
to do so.  [When issues are more controversial, this philosophy applies
less, but then yet another usenet posting philosophy takes over, which
is my desire not to let my time spent on c.l.l. cause me to lessen my
output on my job.]

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <MBhlb.263151$Pk.41591@news.easynews.com>
Duane Rettig wrote:

> ···@famine.OCF.Berkeley.EDU (Thomas F. Burdick) writes:
> 
> 
>>Kenny Tilton <·······@nyc.rr.com> writes:
>>
>>
>>><sigh> I am /so/ tired of propping youse guys spirits up. Do me a favor, 
>>>  switch to Python. Leave Lisp to the real Lispniks.
>>
>>I recommend leaving them alone, or killfiling them.  We don't need
>>another Naggum or Pitman or ... leaving the net-world.  "I'm only
>>retiring from usenet."  Okay, Mr. Barlow, Mr. Bradshaw, Mr. Mai, you
>>only come back when dragged.  You can reduce your usenet intake (viz.
>>Dave Bakash, Duane Rettig), [ ... ]
> 
> 
> No, not reduced intake; reduced output.  I always read every article
> (some more carefully than others :-) and I don't have a killfile.  I
> do have to bite my tongue more often than I'd like, but I have always
> tried to follow my dad's favorite saying: "It is better to keep your
> mouth shut and be thought a fool, than to open it and remove all doubt."
> (attribution unknown)
> 
> Part of my reduced output is also due to one of my other usenet posting
> philosophies; I try to post only articles that others aren't likely to
> post.  If I respond to a post where the answer is obvious, then chances
> are at least one or two are going to do the same thing; that contributes
> to either my article or the other guy's article becoming a "me too"
> response, and that does neither the other guy nor myself any good, and it
> might (if it is a negative response) tend to overwhelm the previous
> poster with the sense of being ganged up on.  All this to say that as
> comp.lang.lisp becomes more and more populated with people who have
> the ability and gumption to answer questions, there is less need for me
> to do so.  [When issues are more controversial, this philosophy applies
> less, but then yet another usenet posting philosophy takes over, which
> is my desire not to let my time spent on c.l.l. cause me to lessen my
> output on my job.]
> 
I *like* reading your posts Duane.  Your wisdom is inspiring.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2110030736590001@192.168.1.51>
In article <·····················@twister.nyc.rr.com>, Kenny Tilton
<·······@nyc.rr.com> wrote:

> Jeez! Erann, Paul, Conrad, Doug...you talk so much about branding. I do 
> not think that word means what you think it means. How about "there is 
> no such thing as bad publicity"?

Tell that to Kobe Bryant.  (I just love playing proof-by-anecdote, don't you?)

> As for Peter and Erann straining mightily to escape the stigma of cons, 
> I don't mean to get all grandiose here, but I cannot help but think of 
> another Peter denying Christ three times before dawn.

Aha!  Here I was thinking that religious wars were merely a metaphor when
it came to programming languages.  I see I was mistaken.

> Read my lips: the problem is not with Lisp!

I agree.  The problem is that it's being promoted by insufferable
self-righteous jerks with less business sense than the average chipmunk.

> The fact is that /you/ have the problem. You do not really believe Lisp 
> is better.

When I was young my parents sent me to a YMCA summer camp not quite
realizing what they were getting me into.  Being the only atheist in a sea
of Southern Baptists (they take the C in YMCA seriously in Kentucky) it
was a pretty miserable two weeks.  By the time it was over the camp
counselors had me convinced that the real reason my prayers weren't being
answered was that I wasn't believing in Jesus hard enough.

Since then I have gone on to revel in this seemingly unique power that I
have to control the course of world events simply by not believing in
something hard enough.  Whole industries have been brought down simply
because of my lack of faith.  Remember pet rocks?  Dead now.  All because
I didn't believe.  SCO is my current project.  Still working on
Microsoft.  (It's a funny thing, sometimes my power seems to work
backwards.  Gotta figure out how to control that.)

> Talk is cheap. What did you chin-pulling, Lisp-denying, cons-apologizing 
> guys /do/ today? Did you help Marc with cl-typesetting? Me with OpenGL 
> or RoboCup or music software? Did you work on /any/ library?

I do not feel the need to defend my record of contribution to the Lisp
community to the likes of a piker like you, Kenny.  It's clear that
nothing I can possibly do will ever be good enough for you.

> You know, according to my survey, the biggest reason for Lisp's revival 
> is Paul Graham, who Won Big and then gave Lisp a lot of the credit. And 
> if you go back and read what he wrote, some of you branding geniuses 
> will discover that in his articles he refers to Lisp by the fucking 
> secret code name Lisp!!!! AFLAC!!!!

Right.  That's why he wrote, "... it's not Lisp that sucks, but Common
Lisp" and is calling his new language Arc.

I think Paul understands the importance of branding better than anyone in
the Lisp community (and that's one of the reasons he's rich and we're
not).

E.
From: Paolo Amoroso
Subject: Re: My take on ARC
Date: 
Message-ID: <87k76ywn6z.fsf@plato.moon.paoloamoroso.it>
Erann Gat writes:

> When I was young my parents sent me to a YMCA summer camp not quite
> realizing what they were getting me into.  Being the only atheist in a sea
> of Southern Baptists (they take the C in YMCA seriously in Kentucky) it

What is YMCA?


Paolo
-- 
Paolo Amoroso <·······@mclink.it>
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2110031021410001@k-137-79-50-101.jpl.nasa.gov>
In article <··············@plato.moon.paoloamoroso.it>, Paolo Amoroso
<·······@mclink.it> wrote:

> Erann Gat writes:
> 
> > When I was young my parents sent me to a YMCA summer camp not quite
> > realizing what they were getting me into.  Being the only atheist in a sea
> > of Southern Baptists (they take the C in YMCA seriously in Kentucky) it
> 
> What is YMCA?

STFW!

http://www.google.com/search?q=ymca
http://dictionary.reference.com/search?q=ymca
E.
From: Ray Dillinger
Subject: Re: My take on ARC
Date: 
Message-ID: <3F957863.EEB81B91@sonic.net>
Paolo Amoroso wrote:
> 
> Erann Gat writes:
> 
> > When I was young my parents sent me to a YMCA summer camp not quite
> > realizing what they were getting me into.  Being the only atheist in a sea
> > of Southern Baptists (they take the C in YMCA seriously in Kentucky) it
> 
> What is YMCA?


"Young Men's Christian Association" -- It's an american group that tries 
to provide and organize events and activities for boys and men.  

Depending on where they are, they might be evangelicals or just a community
organization that tries to provide constructive outlets and structured time.  

There is also a YWCA, but it's a different group and has slightly different
goals - it's done a lot to combat discrimination and sexism.  

				Bear
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <ikflb.17150$pT1.1908@twister.nyc.rr.com>
Ray Dillinger wrote:

> Paolo Amoroso wrote:
> 
>>Erann Gat writes:
>>
>>
>>>When I was young my parents sent me to a YMCA summer camp not quite
>>>realizing what they were getting me into.  Being the only atheist in a sea
>>>of Southern Baptists (they take the C in YMCA seriously in Kentucky) it
>>
>>What is YMCA?
> 
> 
> 
> "Young Men's Christian Association" -- It's an american group that tries 
> to provide and organize events and activities for boys and men.  
> 
> Depending on where they are, 

For example, Greenwich Village? From the group The Village People:

"Y.M.C.A

Young man, there's no need to feel down
I said, young man, pick yourself off the ground
I said, young man, 'cause you're in a new town
There's no need to be unhappy

Young man, there's a place you can go
I said, young man, when you're short on your dough
You can stay there, and I'm sure you will find
Many ways to have a good time

It's fun to stay at the Y.M.C.A.
It's fun to stay at the Y.M.C.A.

They have everything for men to enjoy
You can hang out with all the boys

It's fun to stay at the Y.M.C.A.
It's fun to stay at the Y.M.C.A.

You can get yourself clean
You can have a good meal
You can do whatever you feel"

Quite an uplifting little ditty, btw. Surf it up if you can.

kenny


-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Gareth McCaughan
Subject: Re: My take on ARC
Date: 
Message-ID: <87ismimmzz.fsf@g.mccaughan.ntlworld.com>
Ray Dillinger <····@sonic.net> writes:

> Paolo Amoroso wrote:
> > 
> > Erann Gat writes:
> > 
> > > When I was young my parents sent me to a YMCA summer camp not
> > > quite realizing what they were getting me into.  Being the only
> > > atheist in a sea of Southern Baptists (they take the C in YMCA
> > > seriously in Kentucky) it
> > 
> > What is YMCA?
> 
> "Young Men's Christian Association" -- It's an american group that tries 
> to provide and organize events and activities for boys and men.  

Not only American. We have it in the UK too. I think it
may even have originated here.

The YMCA is of course the inspiration for Terry Pratchett's
"Young Men's Reformed Cultists of the Ichor God Bel-Shamharoth"
or whatever it's called :-).

-- 
Gareth McCaughan
.sig under construc
From: Christopher C. Stacy
Subject: YMCA [was: My take on ARC]
Date: 
Message-ID: <u3cdm5ovg.fsf_-_@dtpq.com>
>>>>> On Tue, 21 Oct 2003 18:15:57 GMT, Ray Dillinger ("Ray") writes:
 Ray> Paolo Amoroso wrote:
 >> 
 >> Erann Gat writes:
 >> 
 >> > When I was young my parents sent me to a YMCA summer camp not quite
 >> > realizing what they were getting me into.  Being the only atheist in
 >> > a sea of Southern Baptists (they take the C in YMCA seriously in
 >> > Kentucky) it
 >> 
 >> What is YMCA?
 Ray> "Young Men's Christian Association" -- It's an american group that tries 
 Ray> to provide and organize events and activities for boys and men.  

 Ray> Depending on where they are, they might be evangelicals or just a community
 Ray> organization that tries to provide constructive outlets and structured time.  

The YMCA is a higher order function:

(defun ymca (sport-rec-funs community-activity-funs)
  #'(lambda (&rest people-kids-families)
     (build-community-function-using-values
          people-kids-families
          '(caring honesty respect responsibility)
          sport-rec-funs community-activity-funs)))

Historically, the YMCA was founded during the Industrial Revolution in London,
in 1844, by George Williams, a sales assistant at a draper's shop, as a way to
get boys off the streets.  The emphasis was on Bible study and prayer.  
YMCA in the USA was started right here in Boston, Massachusetts, in 1851.

The religious aspect of the YMCA might vary from chapter to chapter, 
and it might also have been very different 20 or 30 years ago.
I don't know about all that. YMCA is not really a "boy" thing anymore, 
either -- it's people of all ages and sexes, families, and singles.

The YMCA that I work for (a few miles north of Boston) was founded in 1865.
The mission statement says it's "dedicated to putting Christian principles
into practice through programs, membership opportunities and community
services that build healthy spirit, mind and body for all."  
However, Christianity per se is actually not part of the program here:
there is no religious training, nor even any references to God, the Bible, 
or prayer, and nobody asks about your religion.  It's dedicated to the
development of all people regardless of age, gender, race, religion, 
income or ability to pay. It's a safe place where kids, adults and 
families from all different backgrounds come together to relax, recreate, 
experience leadership, personal growth and healthy lifestyles.
The motto of the YMCA is: "Building strong kids, strong families,
and strong communities."  The emphasis is on teaching children the core
character values of Caring, Honesty, Respect, and Responsibility.

Our YMCA has a family-oriented member-run community recreation and fitness
center offering swimming, basketball (a game invented and popularized at
the YMCA branch in western Massachusetts), and other sports, aerobics, gym
and Wellness Center (fancy cardio/weight training equipment, trainers, etc).  
There are child daycare programs.  There are middle-school and teen group
activities, and all kinds of child and adult special interest programs such
as dance, yoga, cooking, book clubs, nature, etc.  Like all YMCA's, there 
are community service and charity fund-raising projects, blood drives, etc.
There is an extensive summer camp program.

The YMCA chapter based in downtown Boston has a mission statement to build
the "health of spirit, mind and body based on the highest ideals of the
Judeo-Christian heritage, and to improve the quality of life for children,
individuals, families, and communities in the cities and towns of greater Boston."
They similarly offer health and fitness programs and childcare. But they also
have adult education, employment and job skills training, drug and alcohol
awareness programs, youth development programs, programs for youth at risk,
senior services, and shelter for homeless families.

Also right here, the YMCA chapter in Cambridge, Massachusetts, has similar
programs and charity activities, but they don't mention Christianity at all.
Their charter talks about the "inclusion for all ages, all incomes, and all
abilities, and widely crosses all races and religions, ethnicities and gender."  
Every YMCA I've ever seen has aquatics programs, and of course basketball. 
And in addition to their gym/weight/cardio facility, this branch also has an
indoor track, racquetball courts, climbing wall, and a sauna.  They have a
theatre and concert program as well.  Speaking of charitable community activities,
this branch is the largest provider of Single Room Occupancy housing in the
city of Cambridge, serving over 300 men each year.

--Chris
From: Christopher Browne
Subject: Re: My take on ARC
Date: 
Message-ID: <bn49ql$svl67$1@ID-125932.news.uni-berlin.de>
A long time ago, in a galaxy far, far away, Ray Dillinger <····@sonic.net> wrote:
> There is also a YWCA, but it's a different group and has slightly different
> goals - it's done a lot to combat discrimination and sexism.  

Well, there's an ongoing controversy there in the US organization
because of a rather "gay-positive" head of the organization.

This apparently doesn't sit well with those imagining there being a
"C" in "YWCA," so there's some lobbying taking place.  (Or so my
newspaper told me, today...)
-- 
If this was helpful, <http://svcs.affero.net/rm.php?r=cbbrowne> rate me
http://www.ntlug.org/~cbbrowne/internet.html
Rules of the Evil Overlord #18. "I will not have a son. Although his
laughably under-planned  attempt to usurp power would  easily fail, it
would provide a fatal distraction at a crucial point in time."
<http://www.eviloverlord.com/>
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <s%dlb.3365415$Bf5.462352@news.easynews.com>
Kenny Tilton wrote:
> 
> 
> Doug Tolton wrote:
> 
>> Conrad Barski wrote:
>>
>>> Although I hate to say it (because it sounds shallow) I think that the
>>> one thing that the Lisp community could really benefit from is a
>>> rebranding. ....
> 
> 
>> Paul stated that he has a good marketing head while we were in 
>> conversation, and after listening to some of his ideas I think he is 
>> right.  I'm not going to reveal the ideas because I think they would 
>> be more effective with a good implementation behind them. :)
>>
>>> This is because programmers already carry subconscious prejudices
>>> against it:
>>>      1. "Boy that lisp course I took in college sure sucked"
>>>      2. "It's been around for decades and nobody uses it? It must
>>> suck."
>>>      3. "Lisp is slow. That sucks."
>>>
>>> IMHO something as silly as a rebranding of Lisp could make a HUGE
>>> difference in the adoption of Lisp, and ARC is the best candidate I
>>> see for this right now.
>>>
>> Rebranding is generally pretty successful.  It worked well for Delphi. 
>> It worked well for Linux.  The power of the brand is almost 
>> incomprehensible to most technical people that I know.  However if you 
>> look at how a significant amount of purchasing decisions are made it 
>> is based on brand image.
>>
>> One gripe I have is with the name Lisp.  I know smart people came up 
>> with it, but for heaven's sake, why name a programming language after a 
>> speech impediment?  Seriously, I get that joke almost every time I bring 
>> up Lisp to someone.  If you are seriously looking at branding...well 
>> Lisp isn't a good brand name.
> 
> 
> Jeez! Erann, Paul, Conrad, Doug...you talk so much about branding. I do 
> not think that word means what you think it means. How about "there is 
> no such thing as bad publicity"? Thalidomide has been resurrected for 
> cancer treatment. What do they call it now? Thalomid. Apologies for 
> another sad example, but a representative of the company selling Ayds 
> (the OTC whatever remedy) said the epidemic of the same name had been no 
> problem for them.
> 
> Someone else has already correctly noted that the stigma is as much an 
> accelerant as it is an extinguisher; when we cross a very low threshold, 
> the fact that we are a dusty, forty-seven year old language will be cool 
> beyond belief. Then they notice we are native-compiled and have a nice 
> thick standard and a ton of vendors and free versions and... game over.
> 
> As for Peter and Erann straining mightily to escape the stigma of cons, 
> I don't mean to get all grandiose here, but I cannot help but think of 
> another Peter denying Christ three times before dawn. Would you all 
> please get a grip and meditate for thirty days and thirty nights on how 
> great Lisp is before opening your big yaps again?! Erann "Mr Negative" 
> Gat to his credit had to laugh at ILC2003 when I made fun of the idea 
> that what Lisp has to do is get better (this not on branding, but on 
> handling package issues differently). ("Yeah, that's our problem, Lisp 
> isn't good enough.")
> 
> Read my lips: the problem is not with Lisp!!!!!!! Jesus H Christ!!! This 
> crap is pathetic beyond belief, the Stockholm Syndrome all over again. 
> The fact is that /you/ have the problem. You do not really believe Lisp 
> is better. If you did, you would shut the fuck up and get to work on 
> some libraries, which is what we really need. Peter is exempted because 
> of the book. :)
> 
> Talk is cheap. What did you chin-pulling, Lisp-denying, cons-apologizing 
> guys /do/ today? Did you help Marc with cl-typesetting? Me with OpenGL 
> or RoboCup or music software? Did you work on /any/ library?
> 
> You know, according to my survey, the biggest reason for Lisp's revival 
> is Paul Graham, who Won Big and then gave Lisp a lot of the credit. And 
> if you go back and read what he wrote, some of you branding geniuses 
> will discover that in his articles he refers to Lisp by the fucking 
> secret code name Lisp!!!! AFLAC!!!!
> 
> Meanwhile I have another project possibly getting underway, someone from 
> the Linux camp. I am starting to think that Linux is doing well because 
> Linux folks have an interesting quality: they actually do things.
> 
> <sigh> I am /so/ tired of propping youse guys spirits up. Do me a favor, 
>  switch to Python. Leave Lisp to the real Lispniks.
> 
> We return you now to your regularly-scheduled self-loathing.
> 
> kenny
> 
> ps. It /is/ funny that we all got stuck with a silly name to which they 
> probably gave about two seconds of thought.

I've been busting my ass learning Lisp.  I've started using it at work 
and I've been talking others into learning and using Lisp.  I just went 
to the ILC on my own time and on my own dime.  Three people have 
started learning it because I've been talking to them about it.

But maybe you're right; maybe none of these things really count.  I guess 
I'm not a real Lisper.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <Suelb.17140$pT1.16909@twister.nyc.rr.com>
Doug Tolton wrote:
> I've been busting my ass learning Lisp.  I've started using it at work 
> and I've been talking others into learning and using Lisp.  I just went 
> to the ILC on my own time and on my own dime.  Three people have 
> started learning it because I've been talking to them about it.
> 
> But maybe you're right; maybe none of these things really count.  I guess 
> I'm not a real Lisper.
> 

You, sir, are the very future of Lisp. Your enthusiasm and advocacy are 
the wind now filling our sails. Paying your way to ILC after such a 
short time is marvelous. And being so recently from The World Outside, 
you are our best ambassador to lands and peoples we old-timers no longer 
understand (and they do not understand us).

Using Lisp at work trumps free library work, or at least matches it. Am 
I off the hook yet? :)

Thx for the cool pix, btw.

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2110031117280001@k-137-79-50-101.jpl.nasa.gov>
In article <·····················@twister.nyc.rr.com>, Kenny Tilton
<·······@nyc.rr.com> wrote:

> Am I off the hook yet? :)

No:

> Jeez! Erann, Paul, Conrad, Doug...

One down, three to go.

E.
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <yIflb.17152$pT1.7884@twister.nyc.rr.com>
Erann Gat wrote:

> In article <·····················@twister.nyc.rr.com>, Kenny Tilton
> <·······@nyc.rr.com> wrote:
> 
> 
>>Am I off the hook yet? :)
> 
> 
> No:
> 
> 
>>Jeez! Erann, Paul, Conrad, Doug...
> 
> 
> One down, three to go.

Well this is tricky because the rant started as "don't change the name" 
and ended up as "what r u doing 4 lisp".

I see Conrad is a 2003 convert (and he wrote a great Road). I have to 
clean it up to get all the cross-referencing working, then I think he is 
headed for the top ten. And I am now granting a six-month stay of the 
"where the hell is a library" rant for newbies, so he's all good.

Graham of course has our eternal gratitude for having occasioned The 
Second (third? anyone keeping count?) Coming of Lisp.

Am I off the hook yet?

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2110031309230001@k-137-79-50-101.jpl.nasa.gov>
In article <····················@twister.nyc.rr.com>, Kenny Tilton
<·······@nyc.rr.com> wrote:

> Erann Gat wrote:
> 
> > In article <·····················@twister.nyc.rr.com>, Kenny Tilton
> > <·······@nyc.rr.com> wrote:
> > 
> > 
> >>Am I off the hook yet? :)
> > 
> > 
> > No:
> > 
> > 
> >>Jeez! Erann, Paul, Conrad, Doug...
> > 
> > 
> > One down, three to go.
> 
> Well this is tricky because the rant started as "don't change the name" 

Well, the original topic was rebranding, which is related to but not the
same thing as changing the name.

> and ended up as "what r u doing 4 lisp".
> 
> I see Conrad is a 2003 convert (and he wrote a great Road). I have to 
> clean it up to get all the cross-referencing working, then I think he is 
> headed for the top ten. And I am now granting a six-month stay of the 
> "where the hell is a library" rant for newbies, so he's all good.
> 
> Graham of course has our eternal gratitude for having occasioned The 
> Second (third? anyone keeping count?) Coming of Lisp.
> 
> Am I off the hook yet?

Not quite.

How many Lisp programmers have you hired Kenny?  How much gross revenue
for Lisp vendors has been generated by your marketing efforts?  How many
open-source implementations of Common Lisp exist because of projects that
you initiated?  How many newspaper stories and magazine articles
mentioning Lisp have been generated as a result of work that you have
done?  How many papers have you published for other people to cite to
their PHBs showing evidence that Lisp can reduce development time and
uncertainty?

And finally (and this is the only question that really matters) do you
really think that asking questions like this is an effective
motivational technique?

Now you're off the hook (if you want to be).

E.
From: Kenny Tilton
Subject: (+ Lisp Music)
Date: 
Message-ID: <3F968601.3050608@nyc.rr.com>
Good session last night in the first gathering of a Lisp-NYC borne cabal 
looking into doing Something With Lisp and Music:

The players:
   Howard Elmer, composer, Symbolic Composer user
   Bob Coyne, Lispnyk, musician, AI, 3D/OpenGL, interested in including 
algorithmic sound in his current interactive graphics project
   Nick Maj, hobby composer, new to Lisp, follows algorithmic music 
developments very closely
   Kenny Tilton, Lispnyk

Yours truly played some notes using pre-built instruments shipped with 
Common Lisp Music (CLM). I reported on some build problems on win32. CLM 
generates a WAV file, not MIDI. I mention this because the group seems 
to be forming a consensus that MIDI is not the way to go since it is too 
limited. ("cheesy" was the preferred epithet.)

Nick was encouraged by my success to have another go at building Common 
Music (CM) and CLM under Linux. Nick is totally on top of what is going 
on in music software (CSound, Supercollider, Common Music, more) and has 
a hobby interest in music composition and developing a book project 
built around Lisp and music composition.

Howard Elmer then walked us through a bunch of Symbolic Composer code he 
had developed and played some of it. Wild man, but MIDI.

Bob looks like the keystone. He and I know Lisp, but only Bob knows 
anything about music. And he has a Real Live Project in mind, I just 
want to see if I can get my system to play "Y.M.C.A.".

We then discussed ways to proceed. Using Lisp was all we were sure of, 
and I guess also not using MIDI. I explained that using CSound or 
Supercollider did not mean not using Lisp, it would just mean a few days 
of unpleasant wrangling with FFI stuff. re this, however...

Supercollider could be interesting to hack into. And it would be worth 
some effort, because Nick reports it has the twin charms of being 
interactive (other tools go thru a lengthy edit-compile-play sequence 
where compilation of a simple 16-second sequence can take as long as the clip itself, or 
longer) and also having a lot of energy behind it. Good product, it seems.

My homework is to explore getting to SuperC from Lisp, but it seems you 
program it in a Smalltalk-like custom language. I gotta see if there is 
an underlying "C" API, or if it can be addressed over a socket (I love 
sockets!). But there is a possibility that the only clean interface is 
the ST app language. The source is about to be released (this is a 
project transitioning from proprietary to open). What we /could/ do is 
build in Lisp a language similar to the embedded ST language, then hit 
the Supercollider engine via the FFI.
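
(If the socket route pans out, the Lisp end is the easy part. Purely a 
sketch under assumptions: a hypothetical engine that takes one text 
command per line on a TCP port, a made-up port number and command syntax, 
and the portable usocket library for the plumbing. Whether Supercollider 
actually exposes anything like this is exactly the homework.)

;; Hypothetical: whether the engine really listens on a TCP port and
;; accepts line-oriented text commands is the open question; the port
;; and command string below are made up.  Assumes the usocket library
;; has already been loaded.
(defun send-synth-command (command &key (host "localhost") (port 7777))
  (let* ((socket (usocket:socket-connect host port))
         (stream (usocket:socket-stream socket)))
    (unwind-protect
         (progn
           (write-line command stream)
           (force-output stream))
      (usocket:socket-close socket))))

;; (send-synth-command "play sine 440 1.0")   ; made-up command syntax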

OTOH, we also would like to make use of Common Music, which is already 
in Lisp. That eliminates some work, and we'd like to support a fellow 
Lisp project.

If we decide not to use CM, there is also CSound, a very strong package 
with a C api, so getting to it from Lisp is just some FFI work.

We meet again two weeks from last night, Tues, Oct 5th. Interested folks 
(music or Lisp) should get in touch.

kenny


-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Marcus Pearce
Subject: Re: (+ Lisp Music)
Date: 
Message-ID: <Pine.GSO.4.58.0310230946460.14378@vega.soi.city.ac.uk>
On Wed, 22 Oct 2003, Kenny Tilton wrote:

> Good session last night in the first gathering of a Lisp-NYC borne cabal
> looking into doing Something With Lisp and Music:
>
> generates a WAV file, not MIDI. I mention this because the group seems
> to be forming a consensus that MIDI is not the way to go since it is too
> limited. ("cheesy" was the preferred epithet.)

It wouldn't be cheesy if you used a decent synth (hardware or software)
rather than the midi synth on a soundcard (I'm making an assumption here).
Depending on the style, recording a non-quantised human performance to
midi might also help (another assumption). The point is that midi is just
a set of instructions which tell an instrument what to do. It is not
cheesy in and of itself. For both human and midi renditions, if you use
low quality instruments or the performance is bad, the music will sound
bad.

The real issue is to find the right level of representation for what you
want to do with music. Midi works fine for passing messages between bits
of musical kit in electronic music performance and recording. However,
since its basic unit of structure is the note it is too high level for
something like sound synthesis. On the other hand, it is generally not
expressive enough for music analysis where we might be interested in
(e.g.,) the spelling of notes, higher level structures such as chords and
phrases, other aspects of the written score and so on.
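
(To make the level-of-representation point concrete -- the structure
definitions below are made up on the spot, not from any existing library --
compare what a MIDI note event carries with what an analysis-level note
would want to carry:)

;; Made-up structures, purely to contrast levels of representation.
;; A MIDI note event: key number 61 cannot say whether the composer
;; meant C-sharp or D-flat, and knows nothing of chords or phrases.
(defstruct midi-note
  channel key velocity onset-ticks duration-ticks)

;; An analysis-level note keeps the spelling and its structural context.
(defstruct score-note
  letter accidental octave         ; pitch *spelling*: C, :sharp, 4
  onset duration                   ; in beats rather than clock ticks
  chord phrase)                    ; links to higher-level groupings

;; (make-midi-note :channel 1 :key 61 :velocity 80
;;                 :onset-ticks 0 :duration-ticks 480)
;; (make-score-note :letter 'c :accidental :sharp :octave 4
;;                  :onset 0 :duration 1 :chord 'a-major :phrase 'antecedent)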

Cheers,
Marcus
From: Kenny Tilton
Subject: Re: (+ Lisp Music)
Date: 
Message-ID: <O5Qlb.28153$pT1.6039@twister.nyc.rr.com>
Marcus Pearce wrote:
> On Wed, 22 Oct 2003, Kenny Tilton wrote:
> 
> 
>>Good session last night in the first gathering of a Lisp-NYC borne cabal
>>looking into doing Something With Lisp and Music:
>>
>>generates a WAV file, not MIDI. I mention this because the group seems
>>to be forming a consensus that MIDI is not the way to go since it is too
>>limited. ("cheesy" was the preferred epithet.)
> 
> 
> It wouldn't be cheesy if you used a decent synth (hardware or software)
> rather than the midi synth on a soundcard (I'm making an assumption here).

I don't know. I was reporting someone else's experience. I did ask if a 
nice synth would help, was told not completely. At the same time, I do 
not know what was meant by "cheesy". I'll keep all this in mind, tho, 
when we get together in a couple of weeks to compare notes again. Thx 
for the input.

> The real issue is to find the right level of representation for what you
> want to do with music. Midi works fine for passing messages between bits
> of musical kit in electronic music performance and recording. However,
> since its basic unit of structure is the note it is too high level for
> something like sound synthesis. On the other hand, it is generally not
> expressive enough for music analysis where we might be interested in
> (e.g.,) the spelling of notes, higher level structures such as chords and
> phrases, other aspects of the written score and so on.

We just got started, but I suspect diff members of the SIG will go in 
different directions. At least one of our number I think will be 
interested in greater expressiveness than MIDI, another sounds like he 
will continue on SCOM, which methinks is MIDI only.

I actually know nothing about this stuff, I am just offering help 
learning/mastering Lisp to facilitate the musicians.

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: ···@afghan.dogpound
Subject: Re: (+ Lisp Music)
Date: 
Message-ID: <87wuav9qbc.fsf@afghan.dogpound>
Kenny Tilton <·······@nyc.rr.com> writes:

> We meet again two weeks from last night, Tues, Oct 5th. Interested
> folks (music or Lisp) should get in touch.

I guess I consider myself a hobby composer (though somewhat out of
practice), and a bit of a Lisper. I'm in your area, so let me
know the specifics of your next meeting.
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <PAzlb.22734$pT1.15383@twister.nyc.rr.com>
Erann Gat wrote:
>>>>Am I off the hook yet? :)
>>>
>>>
>>>No:
>>>
>>>
>>>
>>>>Jeez! Erann, Paul, Conrad, Doug...
>>>
>>>
>>>One down, three to go.
>>
>>Well this is tricky because the rant started as "don't change the name" 
> 
> 
> Well, the original topic was rebranding, which is related to but not the
> same thing as changing the name.
> 
> 
>>and ended up as "what r u doing 4 lisp".
>>
>>I see Conrad is a 2003 convert (and he wrote a great Road). I have to 
>>clean it up to get all the cross-referencing working, then I think he is 
>>headed for the top ten. And I am now granting a six-month stay of the 
>>"where the hell is a library" rant for newbies, so he's all good.
>>
>>Graham of course has our eternal gratitude for having occasioned The 
>>Second (third? anyone keeping count?) Coming of Lisp.
>>
>>Am I off the hook yet?
> 
> 
> Not quite.
> 
> How many Lisp programmers have you hired Kenny? 

Nine, thanks for asking, but the subject is "what are you doing for 
Lisp". Present tense. I can see why you want to change the subject to 
the past, since you are doing nothing now*, but you are too big a man 
for sophistry.

* It occurs to me that Ciel would get you off the Useless Hook, but I 
gather you have stopped work on that?

Finally:

> Now you're off the hook (if you want to be).

You seem confused. I am not Erik.

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2210031212500001@k-137-79-50-101.jpl.nasa.gov>
In article <·····················@twister.nyc.rr.com>, Kenny Tilton
<·······@nyc.rr.com> wrote:

> > How many Lisp programmers have you hired Kenny? 
> 
> Nine,

Wow.  I'm impressed.  Were these all full-time employees, or are you
counting letting a contract as "hiring" someone?

>  thanks for asking,

You're welcome.  I notice you didn't answer any of the other questions I
posed, including the only one that I said was actually important.

> but the subject is "what are you doing for Lisp". Present tense.

Funny, I thought the subject was Arc.  That's what it says in the subject
line of this thread.  (And wasn't it you who not so long ago brought up
events of four years ago?  And wasn't it you who during ILC dismissed the
answer I gave you to what I was doing for Lisp that day?  It would be a
lot easier to play this game by your rules if you stopped changing them
all the time.  I feel like I'm being asked to play Calvinball (STFW!).)

> * It occurs to me that Ciel would get you off the Useless Hook, but I 
> gather you have stopped work on that?

You gather wrong.  You're batting 1000 in that regard.

> > Now you're off the hook (if you want to be).
> 
> You seem confused. I am not Erik.

Erik?  Who's Erik?  I don't see any Erik here.  Wasn't it you who said
that the topic at hand was the present, not the past?

Make up your mind what you want to talk about and we can have a rational
conversation.  Until then I bid you adieu.  Best of luck with Cells.  It
looks like a really cool thing.

E.
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <oABlb.22744$pT1.19362@twister.nyc.rr.com>
Erann Gat wrote:
> In article <·····················@twister.nyc.rr.com>, Kenny Tilton
> <·······@nyc.rr.com> wrote:
> 
> 
>>>How many Lisp programmers have you hired Kenny? 
>>
>>Nine,
> 
> 
> Wow.  I'm impressed.  Were these all full-time employees, or are you
> counting letting a contract as "hiring" someone?

Damn! U caught me. I paid Burdick carfare to hack callbacks from C into 
CMUCL. Everyone else full-time, first at my ed software company, then 
CliniSys where I hope to do it again.

>>* It occurs to me that Ciel would get you off the Useless Hook, but I 
>>gather you have stopped work on that?
> 
> 
> You gather wrong. 

When I can DL it, you're off the hook.

>   Best of luck with Cells.  It
> looks like a really cool thing.

Thx. I'm gonna try to get PG to build them into Arc. :)

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2210031359410001@k-137-79-50-101.jpl.nasa.gov>
In article <·····················@twister.nyc.rr.com>, Kenny Tilton
<·······@nyc.rr.com> wrote:

> >>* It occurs to me that Ciel would get you off the Useless Hook, but I 
> >>gather you have stopped work on that?
> > 
> > 
> > You gather wrong. 
> 
> When I can DL it, you're off the hook.

What a relief.

When you get down off your high horse you're off the hook too.

E.
From: Pascal Costanza
Subject: Re: My take on ARC
Date: 
Message-ID: <bn768o$p9m$1@newsreader2.netcologne.de>
Kenny Tilton wrote:
> 
> 
> Erann Gat wrote:

>> How many Lisp programmers have you hired Kenny? 
> 
> 
> Nine, thanks for asking, but the subject is "what are you doing for 
> Lisp". Present tense. I can see why you want to change the subject to 
> the past, since you are doing nothing now*, but you are too big a man 
> for sophistry.

Erann was one of the major influences that got me into seriously 
considering Lisp, via his paper on the Lisp vs. Java study.

Now, I seem to be able to attract some people to take a look at Lisp via 
my DSF paper.

Does this count?


Pascal
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <XHelb.254401$Pk.33744@news.easynews.com>
Kenny Tilton wrote:

> 
> 
> Doug Tolton wrote:
> 
>> I've been busting my ass learning Lisp.  I've started using it at work 
>> and I've been talking others into learning and using Lisp.  I just 
>> went to the ILC out of my own time and on my own dime.  Three people 
>> have started learning it because I've been talking to them about it.
>>
>> But maybe your right, maybe none of these things really count.  I 
>> guess I'm not a real Lisper.
>>
> 
> You, sir, are the very future of Lisp. Your enthusiasm and advocacy are 
> the wind now filling our sails. Paying your way to ILC after such a 
> short time is marvelous. And being so recently from The World Outside, 
> you are our best ambassador to lands and peoples we old-timers no longer 
> understand (and they do not understand us).
> 
> Using Lisp at work trumps free library work, or at least matches it. Am 
> I off the hook yet? :)
> 
> Thx for the cool pix, btw.
> 

LOL, ok I suppose you are off the hook. :)

NP on the pix, sometime this week I'll get around to resizing them.  I 
have to call my ISP too, they seem to have capped the hell out of my 
upstream speed.  It used to be 640k upstream, but it's a good deal 
slower than that now.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <udflb.17148$pT1.31@twister.nyc.rr.com>
Doug Tolton wrote:
> NP on the pix, sometime this week I'll get around to resizing them.  

Good. Make the ones of me bigger, OK? I mean, McCarthy /did/ use my 
laptop and Graham /has/ responded to my survey.

kenny "big-head" sophist

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Nils Goesche
Subject: Re: My take on ARC
Date: 
Message-ID: <87oewa10wm.fsf@darkstar.cartan>
Kenny Tilton <·······@nyc.rr.com> writes:

> Read my lips: the problem is not with Lisp!!!!!!! Jesus H
> Christ!!!  This crap is pathetic beyond belief, the Stockholm
> Syndrome all over again. The fact is that /you/ have the
> problem. You do not really believe Lisp is better. If you did,
> you would shut the fuck up and get to work on some libraries,
> which is what we really need. Peter is exempted because of the
> book. :)

[snip]

> <sigh> I am /so/ tired of propping youse guys spirits up. Do me
> a favor, switch to Python. Leave Lisp to the real Lispniks.
> 
> We return you now to your regularly-scheduled self-loathing.

I think this is worthy of being repeated many times, over and
over again.

Can't we somehow arrange for a vacation from any Python, Scheme
or ARC and any whining posts for a while?  *sigh*

Regards,
-- 
Nils Gösche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID #xD26EF2A0
From: Ray Dillinger
Subject: Re: My take on ARC
Date: 
Message-ID: <3F94D034.1F6BC62@sonic.net>
Doug Tolton wrote:
> 
> One gripe I have is with the name Lisp.  I know smart people came up
> with, but for heavens sake, why name a programming language after a
> speech impediment?  Seriously I get that joke almost everytime I bring
> up Lisp to someone.  If you are seriously looking at branding...well
> Lisp isn't a good brand name.

I work from time to time on my own lisp dialect. It's basically 
Scheme, plus first class syntax transformers (yes, it confounds 
the compile time/runtime separation, with a compile-time warning 
about inefficiency) and lots of bit-fiddling and introspection 
power. For introspection, I have a function that can take a "compiled" 
routine and get back the list that it was compiled from. Actually 
the list is kept when the routine is compiled to machine code - 
there's a pointer to the saved list in the machine code itself, and 
the function just follows the pointer to get the list back. Once 
you've got it, you can then codewalk it as a list, or feed to a 
transformer or whatever.  Right now it's horrifically slow and 
a memory hog and buggy, but it's very cool when it works. 
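
In Common Lisp terms you could fake the retained-source trick with 
something like this (a rough sketch of the idea only -- my dialect keeps 
a pointer to the source list inside the compiled code itself, which a 
portable CL program can't do):

   ;; keep the source list in a table keyed by the compiled function
   (defvar *source-table* (make-hash-table :test #'eq))

   (defmacro defun-with-source (name args &body body)
     `(progn
        (defun ,name ,args ,@body)
        (setf (gethash (fdefinition ',name) *source-table*)
              '(defun ,name ,args ,@body))
        ',name))

   (defun source-of (fn)
     "Return the list FN was compiled from, or NIL if we don't have it."
     (gethash fn *source-table*))

   ;; (defun-with-source add1 (x) (+ x 1))
   ;; (source-of #'add1)  =>  (DEFUN ADD1 (X) (+ X 1))

Once you have the list back you can codewalk it, feed it to a 
transformer, or recompile it, just as with the real thing.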

I have never very seriously considered releasing the implementation 
if it should ever be finished. It's a private exercise, kind of 
like doing katas for my code-fu. 

If pressed for a name, though, I usually call it "Lithp."

I suppose that's not a good name either.  :-)

> As I got thinking about it, people don't compare Lisp on a 2 ghz to Lisp
> on 200 Mhz.  They compare Lisp to C on the same machine.  When it comes
> to programming languages, I think being excessively slow could be an
> image killer.

It hasn't stopped Java. 


> hmm...can you think of anything else you'd want in a programming
> language/environment? :)
> 

I think Graham is absolutely right about demanding that implementors 
provide easy-to-use profiling and profile-based optimization.  We 
don't think of those as language issues now, but they are damned 
important when trying to provide efficient code, especially code 
in a language that has constructs that aren't easily analyzed, like 
ARC or Lithp.  Such languages really and truly do need them. 
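
For comparison, here's the kind of thing a CL implementation already 
gives you today (sketch only; the SB-PROFILE calls are SBCL-specific):

   (defun busy (n)
     (loop repeat n sum (random 100)))

   (time (busy 1000000))          ; standard: run time plus bytes consed

   #+sbcl
   (progn
     (sb-profile:profile busy)    ; instrument the function
     (busy 1000000)
     (sb-profile:report))         ; per-function counts and timings

Profile-based optimization is the part that, as far as I know, 
implementations still don't give you.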

				Bear
From: Steven E. Harris
Subject: Re: My take on ARC
Date: 
Message-ID: <q67k76ytr5r.fsf@raytheon.com>
Doug Tolton <····@nospam.com> writes:

> People really do avoid a language that is perceived to be slow.

Slowness piques my interest: What are they doing that makes it slow?
What extra "expensive" features are in there that are missing from
other languages? What are the trade-offs for the slowness?

For speed, we have assembly. Anything beyond that trades speed for
abstraction. Heavy, deliberate trades like that can be interesting.

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Stefan Scholl
Subject: Re: My take on ARC
Date: 
Message-ID: <i85q1ht5gbzn$.dlg@parsec.no-spoon.de>
On 2003-10-21 04:32:46, Doug Tolton wrote:
> Good web programming facilities
> Good database integration
> Good cross platform gui code (maybe like Lispworks CAPI or better)

Do we need all this in the next 100 years? :-)
From: Dave Benjamin
Subject: Re: My take on ARC
Date: 
Message-ID: <slrnbpdnji.16o.ramen@lackingtalent.com>
In article <·················@parsec.no-spoon.de>, Stefan Scholl wrote:
> On 2003-10-21 04:32:46, Doug Tolton wrote:
>> Good web programming facilities
>> Good database integration
>> Good cross platform gui code (maybe like Lispworks CAPI or better)
> 
> Do we need all this in the next 100 years? :-)

Hey, someone's gotta maintain all of these legacy web systems. =)

-- 
.:[ dave benjamin (ramenboy) -:- www.ramenfest.com -:- www.3dex.com ]:.
: d r i n k i n g   l i f e   o u t   o f   t h e   c o n t a i n e r :
From: Henrik Motakef
Subject: Re: My take on ARC
Date: 
Message-ID: <86k76xjazv.fsf@pokey.internal.henrik-motakef.de>
Stefan Scholl <······@no-spoon.de> writes:
>> Good web programming facilities
>> Good database integration
>> Good cross platform gui code (maybe like Lispworks CAPI or better)
> 
> Do we need all this in the next 100 years? :-)

We need all of this now, if we want anybody to remember Lisp in 100
years as anything but a language that died over 100 years ago.

Remember that being the best programming language doesn't mean
anything. It is about the best set of tools to solve a given problem,
and in practice library shopping beats reimplementing these libraries
even with the most powerful language every day. It is less fun,
though.

Fortunately, these things are being addressed. Even if I don't think
this (or anything else) will make Lisp popular ever again, it
certainly makes things easier for those who use it nevertheless.
From: Pascal Bourguignon
Subject: Re: My take on ARC
Date: 
Message-ID: <87r817dznq.fsf@thalassa.informatimago.com>
·····················@yahoo.com (Conrad Barski) writes:
> Although I hate to say it (because it sounds shallow) I think that the
> one thing that the Lisp community could really benefit from is a
> rebranding. The simple fact is that when I talk to any programmers
> here in minneapolis the moment the word "Lisp" crosses my mouth
> anything else I say automatically ignored. (At least I hope it's the
> word "Lisp" :)
> 
> This is because programmers already carry subconscious predjudices
> against it:
>      1. "Boy that lisp course it took in college sure sucked"
>      2. "It's been around for decades and nobody uses it? It must
> suck."
>      3. "Lisp is slow. That sucks."
> 
> IMHO something as silly as a rebranding of Lisp could make a HUGE
> difference in the adoption of Lisp, and ARC is the best candidate I
> see for this right now.

That's been tried already:  Scheme.

-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
Do not adjust your mind, there is a fault in reality.
Lying for having sex or lying for making war?  Trust US presidents :-(
From: Espen Vestre
Subject: Re: My take on ARC
Date: 
Message-ID: <kwismjuj3h.fsf@merced.netfonds.no>
·····················@yahoo.com (Conrad Barski) writes:

> This is because programmers already carry subconscious predjudices
> against it:
>      1. "Boy that lisp course it took in college sure sucked"
>      2. "It's been around for decades and nobody uses it? It must
> suck."
>      3. "Lisp is slow. That sucks."

I think this problem is about to disappear.

Young hackers think lisp is 1337, they don't have those prejudices any
more. If lisp has a public relations problem today, it's rather that
young aspiring PHBs haven't even _heard_ of it. On the other hand,
that means they don't carry a suitcase full of prejudices from the AI
winter, and may be willing to listen if you tell them that lisp is the
missing link they didn't know existed, the ultimate blend of fast
coding and high performance.
-- 
  (espen)
From: Bob Coyne
Subject: Re: My take on ARC
Date: 
Message-ID: <3F953A08.92C45787@worldnet.att.net>
Espen Vestre wrote:

> ·····················@yahoo.com (Conrad Barski) writes:
>
> > This is because programmers already carry subconscious predjudices
> > against it:
> >      1. "Boy that lisp course it took in college sure sucked"
> >      2. "It's been around for decades and nobody uses it? It must
> > suck."
> >      3. "Lisp is slow. That sucks."
>
> I think this problem is about to disappear.
>
> Young hackers think lisp is 1337, they don't have those prejudices any
> more. If lisp has a public relations problem today, it's rather that
> young aspiring PHBs haven't even _heard_ of it. On the other hand,
> that means they don't carry a suitcase full of prejudices from the AI
> winter, and may be willing to listen if you tell them that lisp is the
> missing link they didn't know existed, the ultimate blend of fast
> coding and high performance.
>

This is exactly right!  The future of Lisp is not in the oldsters who
championed it during the 80's.  Many (not all) of those people are bitter
and/or defensive about it.  "You mean you're still using Lisp!" they say
incredulously.  The future (as always) is with the youth who see it
with fresh eyes and genuine excitement.  The worst thing the "old"
lisp community could do would be to infect the younger generation
with their own disillusionment.
From: Drew McDermott
Subject: Re: My take on ARC
Date: 
Message-ID: <bn1ji3$lse$1@news.wss.yale.edu>
Erann Gat wrote:
   One
> of his dictums is to not forbid anything unless logic compels you to.

This seems like a vacuous statement to me; logic compels you to do 
almost nothing.  If I were doing the compelling, I would decree that 
logic compels the CAR of NIL to be undefined; but apparently it didn't 
compel the designers of CL.

 >(Drew McDermott complained
> famously when he learned that taking the CAR and CDR of a fixnum was not
> allowed in T.)

I don't remember complaining that.  I think I was regretting the demise 
of the facility in many Lisps that let you coerce a fixnum to a pointer 
and back again (just like C!).  It was at a talk given by Steele about 
what Common Lisp was shaping up to be like, and I said that one nice 
thing about pre-Common Lisps was that "you could feel the bits between 
your toes" when you programmed in them.  I now think I was wrong.

> So far I see little in ARC that is particularly interesting
> linguistically.  

I don't either, but (see above) I've been wrong before.

Lisp is, as McCarthy once said, a local maximum in the space of 
programming languages.  The space has a lot of warps, so it's not that 
easy to find an uphill direction.  For instance, many people have 
suggested that Lisp would be terrific if it just had a more traditional 
syntax.  And, of course, it's not hard at all to create a Lisp with that 
feature.  It's been done several times.  (You can do it as a front end 
to an ordinary Lisp with a few man-hours of time.)  But those dialects 
haven't caught on.  It turned out that _wasn't_ the big improvement some 
people expected.

Another example is the way symbols work in Lisp.  Someone raised the 
issue here a couple of weeks ago why you need symbols, when they're 
nothing but uniquified strings.  There were various good answers, but I 
think the most important reason is the way symbols and 'quote' interact. 
  It's hard to put it into words, but it occurs amazingly often (at 
least, in the domains I tend to work in) that you need to implement a 
language (predicate calculus, say), and S-expressions just happen to be 
nearly perfect for the job.  It's not just that they save you from 
defining another abstract datatype.  What's really useful is that when 
you need to refer to an object in the language, you can just write 
(quote symbol) or (quote big-expression).  The datatype used in the 
object language is exactly the same as the datatype used in the 
metalanguage, and you can use 'quote' to sneak things through from one 
layer to another!  If you try the same thing in Java, you can define the 
datatype "expressions in language L," and you can make the leaves of 
these expression trees be uniquified strings, but what you can't do is 
write

     if (the operator of this expression is NOT) ...

unless you allocate a static variable that contains the value of (the 
symbol representing) NOT.  This gets to be very tedious.
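
To make that concrete, here is roughly what the Lisp side looks like 
(an off-the-cuff sketch, not from any particular system):

   ;; predicate-calculus formulas are just S-expressions
   (defparameter *formula* '(not (not (p a))))

   (defun negation-p (expr)
     ;; the test is EQ against a quoted symbol -- no static variable
     ;; holding "the symbol representing NOT" has to be declared anywhere
     (and (consp expr) (eq (first expr) 'not)))

   (defun simplify (expr)
     "Cancel double negations: (not (not x)) => x."
     (if (and (negation-p expr) (negation-p (second expr)))
         (simplify (second (second expr)))
         expr))

   ;; (simplify *formula*)  =>  (P A)

The object language and the metalanguage share one datatype, and 'quote' 
is the trapdoor between them.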

Anyway, the point is that it's unpredictable how a change to a language 
is going to break it or enhance it.  A major change like ARC has 
practically zero chance of being an improvement.

Then again, one might have predicted that Scheme would have zero chance 
of surviving, and been dead wrong.  It is still the case, however, that 
one could not have predicted the differences in the Scheme community and 
the CL community given a list of differences between the two languages. 
  Why would merging function and variable name spaces cause people to 
demand hygienic macros?  One of life's little mysteries.

       -- Drew
                                              -- Drew McDermott
                                                 Yale University CS Dept.
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031546360001@k-137-79-50-101.jpl.nasa.gov>
In article <············@news.wss.yale.edu>, Drew McDermott
<··················@at.yale.dot.edu> wrote:

> Erann Gat wrote:
>    One
> > of his dictums is to not forbid anything unless logic compels you to.
> 
> This seems like a vacuous statement to me; logic compels you to do 
> almost nothing.  If I were doing the compelling, I would decree that 
> logic compels the CAR of NIL to be undefined; but apparently it didn't 
> compel the designers of CL.
> 
>  >(Drew McDermott complained
> > famously when he learned that taking the CAR and CDR of a fixnum was not
> > allowed in T.)
> 
> I don't remember complaining that.

Kent Pitman does :-)

> Incidentally, Drew McDermott sent a bug report soon after we rolled out T
> (Yale Scheme) complaining that we had made it impossible to rplacd
> fixnums in T.  [This fact is why I had a suspicion he was going to be more
> "neutral" (or "Machiavellian"? :-) on the matter of identity.  I knew
> from history that Drew had a keen seat-of-the-pants understanding of object
> identity if he'd gotten that far along.  rplacd'ing fixnums was not something
> most users knew about and it was not something you were "supposed" to do.
> But life was simpler back then and we worked with what we had.] 

I see I got the details wrong (it was RPLACD, not CAR and CDR) but I think
I didn't misrepresent the gist.

In any case I think you and I agree on the main point which is that logic
is not a reliable guide for figuring out what ought not be allowed in a
programming language.

> > So far I see little in ARC that is particularly interesting
> > linguistically.  
> 
> I don't either, but (see above) I've been wrong before.

Same here.


> Anyway, the point is that it's unpredictable how a change to a language 
> is going to break it or enhance it.  A major change like ARC has 
> practically zero chance of being an improvement.

But that's just the thing, to me ARC does not appear to be a major change
(yet).  This is not surprising.  Nor is it necessarily bad.  (It does make
me wonder why he's bothering.  He was asked several times at ILC why he
didn't just use Scheme and he never gave a straight answer.)

But Paul is really bright and -- what may be more important --
charismatic, and notwithstanding my critique I wouldn't bet against ARC
succeeding.

E.
From: Gareth McCaughan
Subject: Re: My take on ARC
Date: 
Message-ID: <87llrfo4qt.fsf@g.mccaughan.ntlworld.com>
Drew McDermott wrote:

> Then again, one might have predicted that Scheme would have zero
> chance of surviving, and been dead wrong.  It is still the case,
> however, that one could not have predicted the differences in the
> Scheme community and the CL community given a list of differences
> between the two languages. Why would merging function and variable
> name spaces cause people to demand hygienic macros?  One of life's
> little mysteries.

Surely that's one of the less mysterious ones? If you
have only one namespace, then there's more contention
for it, so you get more collisions, so some sort of
automatic hygiene system is more useful.
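
A throwaway Common Lisp sketch of what I mean (in a Lisp-2 the clash 
simply never happens, which is the point):

   (defmacro pair-up (a b)
     `(list ,a ,b))                  ; the expansion calls the function LIST

   (defun demo ()
     (let ((list '(1 2 3)))          ; a local *variable* named LIST
       (pair-up (car list) (cdr list))))

   ;; (demo) => (1 (2 3)) in CL, because LIST-the-variable and
   ;; LIST-the-function live in separate namespaces.  In a one-namespace
   ;; Lisp the same expansion would try to call the local binding as a
   ;; function, so you want hygiene to resolve LIST to the binding in
   ;; force where the macro was defined.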

-- 
Gareth McCaughan
.sig under construc
From: ·············@comcast.net
Subject: Re: My take on ARC
Date: 
Message-ID: <vfqjwg5m.fsf@comcast.net>
Drew McDermott <··················@at.yale.dot.edu> writes:

> Anyway, the point is that it's unpredictable how a change to a
> language is going to break it or enhance it.  A major change like ARC
> has practically zero chance of being an improvement.

It doesn't seem that technical merit has much to do with acceptability.
From: Olivier Drolet
Subject: Re: My take on ARC
Date: 
Message-ID: <599a6555.0310201652.73cb080@posting.google.com>
······················@jpl.nasa.gov (Erann Gat) wrote in message news:<·······································@k-137-79-50-101.jpl.nasa.gov>...
(...)
> he wants to keep the number of data types in the core language as small as
> possible, with the ideal being that everything, including strings and
> numbers, is a list.  

In a discussion with someone at ILC2003 (Matthias Radestock, I
believe), I learned that some in the Scheme community are considering
implementing Scheme with numbers typed as functions! Curious...

Olivier
From: Espen Vestre
Subject: Re: My take on ARC
Date: 
Message-ID: <aSWkb.32424$os2.470426@news2.e.nsc.no>
······················@jpl.nasa.gov (Erann Gat) writes:

> My personal take on all this is that the quest for minimalism is a
> quixotic one because it invariably leads you to the lambda calculus. 

I'm not sure it does. Actually the stuff he is doing reminded me
faintly of the first-order-logic approach to type theory that some
semanticists have worked on. This type stuff is the part of the arc
talk that really caught my interest, but I have to think it through
harder before I can give any further comments on the parallel that I
thought I glimpsed.
-- 
  (espen)
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031327580001@k-137-79-50-101.jpl.nasa.gov>
In article <······················@news2.e.nsc.no>, Espen Vestre
<·····@*do-not-spam-me*.vestre.net> wrote:

> ······················@jpl.nasa.gov (Erann Gat) writes:
> 
> > My personal take on all this is that the quest for minimalism is a
> > quixotic one because it invariably leads you to the lambda calculus. 
> 
> I'm not sure it does.

Well, not necessarily the lambda calculus, but it leads you to one of a
myriad of equivalent formulations of what little you need to be
Turing-complete.  The lambda calculus is one such formulation, and
arguably the most likely one to end up with if you start with a Lispy
mindset.

Actually, if you really pursue minimalism with determination what you get
is unlambda.  Fortunately for all concerned, the designers of unlambda
don't seem to take themselves all that seriously.

The point is that minimalism in the core is not a good thing.  What Paul
really wants (I'm guessing) is to find the sweet spot where by some metric
the combination of the complexity of the core and the complexity of the
stuff you build on top of it is minimal.  (For example, most people would
agree that adding "define" to the lambda calculus is a net win.)  But the
problem is that to some extent judging complexity is subjective and
therefore imprecise, and it is not clear even among such two apparently
diverse choices as Scheme and Common Lisp which one really wins in that
regard.  I predict that Paul's quest to find "the sweet spot" will fail
because there is no one sweet spot.  There's a family of sweet spots --
and they're all called Lisp :-)
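
For what it's worth, here is the sort of bottom you hit if you keep 
digging -- a throwaway sketch playing lambda calculus with CL closures:

   ;; Church numerals: the number n is "apply f n times"
   (defparameter *zero* (lambda (f) (declare (ignore f)) (lambda (x) x)))

   (defun church-succ (n)
     (lambda (f) (lambda (x) (funcall f (funcall (funcall n f) x)))))

   (defun church->integer (n)
     (funcall (funcall n #'1+) 0))

   ;; (church->integer (church-succ (church-succ *zero*)))  =>  2

Numbers, booleans, pairs -- everything can be ground down to LAMBDA like 
this, which is exactly why "minimal" by itself is not the interesting 
criterion.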

E.
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <xVYkb.195662$Pk.29562@news.easynews.com>
Erann Gat wrote:

> In article <······················@news2.e.nsc.no>, Espen Vestre
> <·····@*do-not-spam-me*.vestre.net> wrote:
> 
> 
>>······················@jpl.nasa.gov (Erann Gat) writes:
>>
>>
>>>My personal take on all this is that the quest for minimalism is a
>>>quixotic one because it invariably leads you to the lambda calculus. 
>>
>>I'm not sure it does.
> 
> 
> Well, not necessraily the lambda calculus, but it leads you one of a
> myriad equivalent formulations of what little you need to be
> Turing-complete.  The lambda calculus is one such formulation, and
> arguably the most likely one to end up with if you start with a Lispy
> mindset.
> 
> Actually, if you really pursue minimalism with determination what you get
> is unlambda.  Fortunately for all concerned, the designers of unlambda
> don't seem to take themselves all that seriously.
> 
> The point is that minimalism in the core is not a good thing.  What Paul
> really wants (I'm guessing) is to find the sweet spot where by some metric
> the combination of the complexity of the core and the complexity of the
> stuff you build on top of it is minimal.  (For example, most people would
> agree that adding "define" to the lambda calculus is a net win.)  But the
> problem is that to some extent judging complexity is subjective and
> therefore imprecise, and it is not clear even among such two apparently
> diverse choices as Scheme and Common Lisp which one really wins in that
> regard.  I predict that Paul's quest to find "the sweet spot" will fail
> because there is no one sweet spot.  There's a family of sweet spots --
> and they're all called Lisp :-)
> 
Is this prediction at all based on the fact that you have your own 
competing language in mind? :)

I actually hope it works out; I really like his ideas and I would be 
willing to help him implement them.  I also don't think he's saying the 
language itself is going to be simplistic, rather that it will be 
written mostly in terms of itself.  I find this to be very appealing 
personally.  I guess only time will tell though.


-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031557340001@k-137-79-50-101.jpl.nasa.gov>
In article <·····················@news.easynews.com>, Doug Tolton
<····@nospam.com> wrote:

> Erann Gat wrote:
> 
> > There's a family of sweet spots -- and they're all called Lisp :-)
> > 
> Is this prediction at all based on the fact that you have your own 
> competing language in mind? :)

Of course not.  Ciel is not a Lisp.  It is a revolutionary new language,
completely unlike anything else that has come before.

Or it will be as soon as I finish designing it.  :-)

E.

P.S.  Seriously, (the design of) Ciel is (currently) based on abstract
associative maps, not CONS cells, which is probably enough to convince
most people that it is not a Lisp.  Alas, the more I pursue this design
the more it turns out that CONS cells are in fact a Really Good Idea (TM),
and that one abandons them at one's peril.  Alas, CONS cells are out of
style in my part of the world so until the winds of fashion shift I'm
going to poke at AAMs and see where they lead.
From: Peter Seibel
Subject: Re: My take on ARC
Date: 
Message-ID: <m3wuaz49x2.fsf@javamonkey.com>
······················@jpl.nasa.gov (Erann Gat) writes:

> P.S. Seriously, (the design of) Ciel is (currently) based on
> abstract associative maps, not CONS cells, which is probably enough
> to convince most people that it is not a Lisp. Alas, the more I
> pursue this design the more it turns out that CONS cells are in fact
> a Really Good Idea (TM), and that one abandons them at one's peril.
> Alas, CONS cells are out of style in my part of the world so until
> the winds of fashion shift I'm going to poke at AAMs and see where
> they lead.

If that gets too painful, you could always "reinvent" cons cells as a
specific case of tuples. (I.e. a cons cell is just a two-element
tuple.) Then it'll be just like Python. ;-) And it might be
interesting to have one element and three element tuples available as
first class data types. Not that that would enable anything you can't
already do in Lisp with structs and cons cells but it might make
things superficially different enough to sneak under the radar.
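
Something like this toy sketch is all I mean (names invented on the 
spot, of course):

   ;; a "tuple" is just a simple vector; a cons is the two-element case
   (defun tuple (&rest elements)
     (coerce elements 'simple-vector))

   (defun kons (a d) (tuple a d))
   (defun kar (tup) (svref tup 0))
   (defun kdr (tup) (svref tup 1))

   ;; (kar (kons 1 2)) => 1, and (tuple 1 2 3) gives you the
   ;; three-element case for free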

-Peter

-- 
Peter Seibel                                      ·····@javamonkey.com

         Lisp is the red pill. -- John Fraser, comp.lang.lisp
From: Sebastian Stern
Subject: Re: My take on ARC
Date: 
Message-ID: <ad7d32de.0310210126.127086ad@posting.google.com>
Erann Gat wrote:
> P.S.  Seriously, (the design of) Ciel is (currently) based on abstract
> associative maps, not CONS cells, which is probably enough to convince
> most people that it is not a Lisp.  Alas, the more I pursue this design
> the more it turns out that CONS cells are in fact a Really Good Idea (TM),
> and that one abandons them at one's peril.  

Could you please elaborate on this point? I just read your Ciel
Manifesto and I completely agree with it. An abstract interface for
code as data is a good thing. Yet you seem to be contradicting
yourself here. What is it about CONS cells that you find better than
abstract maps, and why does your Ciel manifesto say otherwise?

Sebastian
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2110030752130001@192.168.1.51>
In article <····························@posting.google.com>,
········@yahoo.com (Sebastian Stern) wrote:

> Erann Gat wrote:
> > P.S.  Seriously, (the design of) Ciel is (currently) based on abstract
> > associative maps, not CONS cells, which is probably enough to convince
> > most people that it is not a Lisp.  Alas, the more I pursue this design
> > the more it turns out that CONS cells are in fact a Really Good Idea (TM),
> > and that one abandons them at one's peril.  
> 
> Could you please elaborate on this point?

Look up the thread entitled "The trouble with cons cells" in comp.lang.scheme.

> What is it about CONS cells that you find better that
> abstract maps, and why does your Ciel manifesto say otherwise?

The Manifesto says otherwise because I wrote it before attempting to work
out all the details.  But the issue is not resolved.  I may yet change my
mind again.  (One other point on which Paul Graham and I agree is that
language design takes a lot of time if it's done right.)  If anyone reads
up on the discussion over in c.l.s. and has new ideas I'd love to hear
them, preferably by email.

E.
From: Paolo Amoroso
Subject: Re: My take on ARC
Date: 
Message-ID: <87ekx6aalb.fsf@plato.moon.paoloamoroso.it>
Erann Gat writes:

> The point is that minimalism in the core is not a good thing.  What Paul
> really wants (I'm guessing) is to find the sweet spot where by some metric
> the combination of the complexity of the core and the complexity of the
> stuff you build on top of it is minimal.  (For example, most people would

From what I understand, Paul wants to find the sweet spot where by
some metric the combination of the complexity of the core and the
complexity of the stuff you build on top of it appeals to hackers.


Paolo
-- 
Paolo Amoroso <·······@mclink.it>
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <uZWkb.6442761$cI2.918188@news.easynews.com>
Erann Gat wrote:
> I couldn't find a good place in the ARC thread to hang this comment so I
> just decided to punt and start a new thread.
> 
> One central item in Paul's talk that I didn't see anyone mention
> (apologies if someone did and I just missed it) is his ideas for how to
> support typing in the core language.  Paul seems to be very enamored of
> minimalism and wants to keep the core language very tiny.  In particular,
> he wants to keep the number of data types in the core language as small as
> possible, with the ideal being that everything, including strings and
> numbers, is a list.  Graham has obviously been heavily influenced by
> McCarthy's original paper on Lisp, and how you can write EVAL in terms of
> CONS, CAR, CDR, COND, QUOTE, ATOM and EQ (the Big Seven).  He wants to
> make the core of ARC be a language that is not much bigger than that.
> 
> The problem you immediately encounter if you represent everything as a
> list is how to write PRINT, because if everything including numbers and
> strings is a list there is no difference internally between "abc" and (#\a
> #\b #\c).  How do you know which of those two printed representations
> should be used in any particular circumstance?
> 
> Paul's answer is to add three new constructs: TYPE, REP and TAG.  (I may
> get some of the details wrong here.  I didn't take notes during the
> presentation, and it's possible I'm getting e.g. TYPE and TAG confused.) 
> The operation of TAG, TYPE and REP correspond to CONS, CAR and CDR:
> 
> (TAG <datum> <type-tag>)  --> <encapsulated datum>
> (TYPE <encapulated datum>) --> <type-tag>
> (REP <encapsulated datum>) --> <datum>
> 
> I think he constrains type tags to be symbols, but I could be wrong.  One
> of his dictums is to not forbid anything unless logic compels you to.  His
> canonical example of something that is logically required to be forbidden
> is taking the CAR of a symbol.  He is apparently unaware that there have
> been Lisp dialects where this was allowed.  (Drew McDermott complained
> famously when he learned that taking the CAR and CDR of a fixnum was not
> allowed in T.)
> 
> I don't think he actually said so, but presumably the idea is that all
> operations that work on a <datum> also work on an <encapsulated datum>. 
> So you can define PRINT to do the Right Thing with lists by looking at how
> they have been encapsulated:
> 
>   (print (tag (quote (#\a #\b #\c)) (quote string))
> 
> would arrange to print "abc" by having PRINT examine the result of calling
> TAG on its argument.

From what I recall, your summary to this point is correct.

> 
> My personal take on all this is that the quest for minimalism is a
> quixotic one because it invariably leads you to the lambda calculus. 
> McCarthy's original repertoire of seven operations is not minimal because
> they can all be constructed out of LAMBDA.  McCarthy himself rejected the
> idea that there was anything particularly special about the Big Seven by
> endorsing the idea that integers and floating point numbers should be
> native types if the underlying machine supports them.

The main premise is that the language should be written in itself as 
much as possible, meaning that you would implement as few things as 
possible in some external language such as C or assembly.
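
A trivial example of the flavor (my own sketch, not something Paul 
showed): derived operators defined entirely in terms of a tiny core, 
with no C in sight:

   ;; built from car/cdr/cons/null/if plus defun and funcall
   (defun my-second (lst)
     (car (cdr lst)))

   (defun my-mapcar (fn lst)
     (if (null lst)
         nil
         (cons (funcall fn (car lst)) (my-mapcar fn (cdr lst)))))

   ;; (my-mapcar #'1+ '(1 2 3)) => (2 3 4)

The more of the language that can be bootstrapped this way, the smaller 
the part that has to be ported by hand.
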
> 
> So far I see little in ARC that is particularly interesting
> linguistically.  (But Paul is approaching the problem with a premise that
> I do not accept, namely that Common Lisp and Scheme both "suck", to use
> his terminology.)  Nonetheless, it may still draw people to Lisp just
> because it's Paul's thing, and his charisma could easily carry the day. 
> That would be a Good Thing.

Paul is approaching this from a different angle than I have seen most 
language designers do.  He is not merely writing the language of today; 
rather, he is trying to build a language that will stand the test of 
time.  In fact if you look on his site he has an article about the 
hundred year language.

In essence he is contemplating the possibility of designing a language 
today that could still be in use 100 years from now.  When you start 
looking at it from this point of view you have to question some of the 
basic concepts in use today.  For instance, if all your numbers are 
implemented in a highly machine dependent way, it makes it more 
difficult to port the language to different architectures, not to 
mention architectures we haven't even thought of yet.

I think he is pursuing a worthwhile goal.  The languages I use today 
have primarily been designed to solve the problems of yesterday, many of 
them aren't that great for the problems of today (except Lisp of course 
:) ).  To take a different approach and try to look towards the future 
is a good change of perspective IMO.
From: Erann Gat
Subject: Re: My take on ARC
Date: 
Message-ID: <myfirstname.mylastname-2010031310350001@k-137-79-50-101.jpl.nasa.gov>
In article <························@news.easynews.com>, Doug Tolton
<····@nospam.com> wrote:

> In essence he is contemplating the possibility of designing a language 
> today that could still be in use 100 years from now.

Lisp is almost half way there.  Scheme is about a quarter of the way
there.  Common Lisp is about 1/5 of the way there.

> When you start 
> looking at it from this point of view you have to question some of the 
> basic concepts in use today.  For instance, if all your numbers are 
> implemented in a highly machine dependent way, it makes it more 
> difficult to port the language to different architectures, not to 
> mention architectures we haven't even though of yet.

Not everything changes that fast.  The theory of numbers, for example, has
been pretty much worked out.  The IEEE Floating Point standard was
developed by people who pretty much knew what they were doing and I think
it's likely to stand the test of time.  Likewise for the Common Lisp
numerical library.  I see no need to reinvent these wheels even to support
the 100-year-language goal.  100 years is not that long in the grand and
glorious scheme of things.
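
Just as a reminder of what we would be throwing away, a couple of 
one-liners from the existing Common Lisp numeric tower:

   (expt 2 100)   ; => 1267650600228229401496703205376  (exact bignum)
   (/ 1 3)        ; => 1/3                               (exact rational)
   (+ 1/3 2/3)    ; => 1
   (sqrt -1)      ; => #C(0.0 1.0)                       (complex result)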

> I think he is pursuing a worthwhile goal.  The languages I use today 
> have primarily been designed to solve the problems of yesterday, many of 
> them aren't that great for the problems of today (except Lisp of course 
> :) ).  To take a different approach and try to look towards the future 
> is a good change of perspective IMO.

I agree, except that I don't think ARC is taking a different approach, at
least not technically.  From what I've seen ARC is so far just retreading
well-worn paths of language design.  Politically ARC is new in that it is
the first Lisp dialect developed by a BDFL (STFW!), but technically it
seems to me all but indistinguishable from Scheme (sans call/cc) at this
point.

E.
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <LhZkb.6450271$cI2.919179@news.easynews.com>
Erann Gat wrote:

> In article <························@news.easynews.com>, Doug Tolton
> <····@nospam.com> wrote:
> 
> 
>>In essence he is contemplating the possibility of designing a language 
>>today that could still be in use 100 years from now.
> 
> 
> Lisp is almost half way there.  Scheme is about a quarter of the way
> there.  Common Lisp is about 1/5 of the way there.
> 
> 
>>When you start 
>>looking at it from this point of view you have to question some of the 
>>basic concepts in use today.  For instance, if all your numbers are 
>>implemented in a highly machine dependent way, it makes it more 
>>difficult to port the language to different architectures, not to 
>>mention architectures we haven't even though of yet.
> 
> 
> Not everything changes that fast.  The theory of numbers, for example, has
> been pretty much worked out.  The IEEE Floating Point standard was
> developed by people who pretty much knew what they were doing and I think
> it's likely to stand the test of time.  Likewise for the Common Lisp
> numerical library.  I see no need to reinvent these wheels even to support
> the 100-year-langage goal.  100 years is not that long in the grand and
> glorious scheme of things.
> 
> 
>>I think he is pursuing a worthwhile goal.  The languages I use today 
>>have primarily been designed to solve the problems of yesterday, many of 
>>them aren't that great for the problems of today (except Lisp of course 
>>:) ).  To take a different approach and try to look towards the future 
>>is a good change of perspective IMO.
> 
> 
> I agree, except that I don't think ARC is taking a different approach, at
> least not technically.  From what I've seen ARC is so far just retreading
> well-worn paths of language design.  Politically ARC is new in that it is
> the first Lisp dialect developed by a BDFL (STFW!), but technically it
> seems to me all but indistinguishable from Scheme (sans call/cc) at this
> point.
> 
> E.

I think he mentioned it is going to have call/cc. :)

He did mention in his talk that he didn't think he was doing anything 
particularly groundbreaking.  I tend to like the BDFL model though; 
take Python, for instance.  I don't really like all of the design choices 
made, but it is incredibly consistent in its design.  Once you get the 
Python mindset into your head it makes it easier to pick up new concepts 
in the language.

Maybe there isn't anything in it that isn't already in Scheme / Common 
Lisp; I don't really know.  What I do know is that Paul is an incredibly 
gifted writer, and from my perspective he has a really good grasp on the 
technology.  He also has a fairly large group of people who agree with 
his mindset.  Those things right there put him in a position to build a 
fairly successful community.  Whether or not he pulls it off remains to 
be seen, but I for one am hoping for success rather than failure.

-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Ray Blaak
Subject: Re: My take on ARC
Date: 
Message-ID: <ur817u0ij.fsf@STRIPCAPStelus.net>
Doug Tolton <····@nospam.com> writes:
> Paul is approaching this from a different angle than I have seen most language
> designers do.  He is necessarily writing the language of today, rather he is
> trying to build a language that will stand the test of time.  In fact if you
> look on his site he has an article about the hundred year language.

As long as there is the instinctive tendency to control things with global
switches (e.g. a "not found" value for hash lookups), to be too enamoured
with "writing simplicity" (are implicit var declarations still in there?), or
to forget about threading (threads were not in the language when I last looked
a year or so ago), then this language won't even get off the ground.
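
For contrast, this is how CL avoids blessing one global "not found" 
value: a per-call default plus a second return value (quick sketch):

   (defvar *table* (make-hash-table :test #'equal))

   (setf (gethash "x" *table*) nil)       ; NIL stored as a legitimate value

   (multiple-value-bind (value present-p)
       (gethash "x" *table*)
     (list value present-p))              ; => (NIL T): found, value is NIL

   (gethash "missing" *table* :not-found) ; => :NOT-FOUND, NIL

No global switch, and no value is forbidden from living in the table.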

> I think he is pursuing a worthwhile goal.  The languages I use today have
> primarily been designed to solve the problems of yesterday, many of them
> aren't that great for the problems of today (except Lisp of course :) ).  To
> take a different approach and try to look towards the future is a good
> change of perspective IMO.

The goal is excellent. Is he actually moving toward it at all?

Caveat: my knowledge and thus my criticisms may be obsolete.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <mFXkb.190970$Pk.28841@news.easynews.com>
Ray Blaak wrote:

> Doug Tolton <····@nospam.com> writes:
> 
>>Paul is approaching this from a different angle than I have seen most language
>>designers do.  He is necessarily writing the language of today, rather he is
>>trying to build a language that will stand the test of time.  In fact if you
>>look on his site he has an article about the hundred year language.
> 
> 
> As long as there is the instinctive tendency to control things with global
> switches (e.g. a "not found" value for hash lookups), or to be too enamoured
> with "writing simplicity" (are implicit var declarations still in there?), or
> forgetting about threading (they were not in the language when I last looked 1
> year or so ago), then this language won't even get off the ground.
>
One of his design goals was to have better interoperability with the 
functionality already in operating systems.  I haven't seen anything 
about threads though.  Keep in mind he hasn't really ever published a 
complete language spec, so much of this is pure speculation.

> 
>>I think he is pursuing a worthwhile goal.  The languages I use today have
>>primarily been designed to solve the problems of yesterday, many of them
>>aren't that great for the problems of today (except Lisp of course :) ).  To
>>take a different approach and try to look towards the future is a good
>>change of perspective IMO.
> 
> 
> The goal is excellent. Is he actually moving to it at all?
> 
> Caveat: my knowledge and thus my criticisms may be obsolete.
> 

Yes, he's moving on it; see the other ARC thread for a discussion of what 
he's done and where he currently is on the project.


-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Kenny Tilton
Subject: Re: My take on ARC
Date: 
Message-ID: <1DXkb.14176$pT1.3915@twister.nyc.rr.com>
Doug Tolton wrote:
> In essence he is contemplating the possibility of designing a language 
> today that could still be in use 100 years from now.

Given that Lisp is approaching fifty, and is still more modern than any 
other language, and that these fifty years were also the /first/ fifty 
years, during which the steepest part of the learning curve was 
climbed... game over.

But I hope he keeps working on Arc, especially if he does it via a fast 
VM which really is portable and practical for web applets.

At worst he is building yet another bridge to CL.

kenny

-- 
http://tilton-technology.com
What?! You are a newbie and you haven't answered my:
  http://alu.cliki.net/The%20Road%20to%20Lisp%20Survey
From: Doug Tolton
Subject: Re: My take on ARC
Date: 
Message-ID: <FXYkb.3309332$Bf5.453841@news.easynews.com>
Kenny Tilton wrote:

> 
> 
> Doug Tolton wrote:
> 
>> In essence he is contemplating the possibility of designing a language 
>> today that could still be in use 100 years from now.
> 
> 
> Given that Lisp is approaching fifty, and is still more modern than any 
> other language, and that these fifty years were also the /first/ fifty 
> years, during which the steepest part of the learning curve was 
> climbed... game over.
> 
> But I hope he keeps working on Arc, especially if he does it via a fast 
> VM which really is portable and practical for web applets.
> 
> At worst he is building yet another bridge to CL.
> 

He did say he wants it to be the premier web application development 
language in the world.

Personally, I would just like to see a relatively successful Lispy 
language, such that I can be a "Lisp" or "Arc" programmer and not have 
to worry about keeping cruft like Java, C# and C++ on my resume.  I can 
code in them, but if you don't have to, who would want to?


-- 
Doug Tolton
(format t ···@~a~a.~a" "dtolton" "ya" "hoo" "com")
From: Christian Lynbech
Subject: Re: My take on ARC
Date: 
Message-ID: <ofn0bvoux5.fsf@situla.ted.dk.eu.ericsson.se>
>>>>> "Erann" == Erann Gat <······················@jpl.nasa.gov> writes:

Erann> In particular, he wants to keep the number of data types in the
Erann> core language as small as possible, with the ideal being that
Erann> everything, including strings and numbers, is a list.

Isn't this pretty much what Ousterhout was trying to do with TCL, only
using strings as the underlying representation of everything?

I believe that was abandoned after TCL 7 because it was too slow; from
TCL 8 it was back to the normal strategy of having numbers be numbers
and strings be strings.


------------------------+-----------------------------------------------------
Christian Lynbech       | christian ··@ defun #\. dk
------------------------+-----------------------------------------------------
Hit the philistines three times over the head with the Elisp reference manual.
                                        - ·······@hal.com (Michael A. Petonic)
From: Russell Wallace
Subject: Re: My take on ARC
Date: 
Message-ID: <3f9c7585.42897239@news.eircom.net>
On Tue, 21 Oct 2003 10:29:42 +0200, Christian Lynbech
<·················@ericsson.com> wrote:

>Isn't this pretty much what Ousterhout was trying to do with TCL, only
>using strings as the underlying representation of everything?
>
>I believe that was abandoned after TCL 7, it was too slow, and from
>TCL 8 it was back to the normal strategy of having numbers be numbers
>and strings be strings.

I think the idea with Arc is that as far as the _language semantics_
are concerned, everything can be a list of characters, but the
_implementation_ is free to implement numbers in binary for
efficiency. This would seem to give the best of both worlds.
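
A toy way to picture it (my own sketch, not anything Graham has 
specified): the observable behaviour is "a number is its digits", while 
the thing underneath stays a machine integer:

   (defun number->chars (n)
     (coerce (princ-to-string n) 'list))

   (defun chars->number (chars)
     (parse-integer (coerce chars 'string)))

   ;; (number->chars 42)          => (#\4 #\2)
   ;; (chars->number '(#\4 #\2))  => 42

As long as the coercions are cheap and automatic, the programmer gets 
the list-of-characters semantics without paying for it on every addition.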

-- 
"Sore wa himitsu desu."
To reply by email, remove
the small snack from address.
http://www.esatclear.ie/~rwallace
From: Wade Humeniuk
Subject: Re: My take on ARC
Date: 
Message-ID: <FNflb.23841$i92.18327@clgrps13>
Sitting outside at a cafe having my morning coffee I gave this some
thought.  First, I wish I had been at the conference to participate in the
Arc discussion with Paul Graham.  Since I wasn't, I will have to give my
personal thoughts.

One of the things I really like is my bicycle: it's elegant, within my
very human scale, only needs a limited toolkit to fix, and gets me from
point A to B while serving the multiple purposes of transportation,
exercise and sightseeing.  This is opposed to my car, which is pretty
well outside my scale to modify or repair unless I make a large
investment in tools and skills. I'm much more insulated from the world
in my car.  There is little feedback about what is happening, no wind and
no weather. I'm a passenger in a car; I'm many more things on my bike.

 From what I have read of Viaweb, Paul picked Clisp, FreeBSD and simple
files for the DB.  Yet he succeeded using smaller scale tools (though
not too small) on a reliable and tried-and-true OS.  A bicycle by the
standards of some of today's large computer languages, hardware and
software.  In a way I can strongly relate this with my need to have
the technology within my scale, to feel comfortable with it.  I can
strip my bike down into its component pieces, reassemble it and make
some customizations.  If my bike does not live up to its task, I can
get a new/different one that meets my needs for very little cost.  My
bike is enough technology that it does not enslave me.  From reading
Paul's essays I get the impression that he thinks Common Lisp and even
Scheme might be too much technology, technology meaning restrictive
and not freeing.  The technology being bigger than the programmer.

Common Lisp being restrictive in that the core standard imposes a
barrier to easily modifying it within a reasonable human scale.
Scheme being restrictive in terms of principles and peer pressure in
enforcing a certain mindset.  The old "You can have any colour, as
long as it is black".  The analogy is that Common Lisp is like a high
performance car: it's great when you are an experienced mechanic and
driver, but it is just frustrating when you are a novice in both.
Scheme is like an old Peugeot PX10 Road Bike.  No modifications
for riding on those mountain bike trails allowed.

I really wish Paul well in his effort to create Arc and maybe his
dreams mesh a little with my humanistic needs when it comes to
technology.  Lisp is a great platform for stripping down a programming
language.

Wade