From: Ben Prew
Subject: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146200929.128239.294360@u72g2000cwu.googlegroups.com>
Recently, there was an (apparently cyclic) thread on comp.lang.lisp
about how lisp sucks and why new developers aren't flocking to bask in
its glory.

And, while it started out with a few broad points, the thread quickly
moved on to several smaller points, and the various merits of those
points.

One of them, in an example used by Ron Garret, discussed the merits of
deprecating nth in favor of elt, since elt is a superset of nth.

However, when I think about the existence of nth and elt more, I don't
think that it really matters to newbies whether or not both nth and elt
exist, since either:

   1. They don't know about the existence of both
   2. They know about the existence of both, but they don't really care
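
For reference, the overlap is easy to see at the REPL. ELT handles any
sequence while NTH is list-only (note they also take their arguments
in opposite orders):

```lisp
;; NTH takes (index list); ELT takes (sequence index) and works
;; on any sequence type, not just lists.
(nth 2 '(a b c d))      ; => C
(elt '(a b c d) 2)      ; => C
(elt #(a b c d) 2)      ; => C   (vectors work too)
(elt "abcd" 2)          ; => #\c (so do strings)
;; (nth 2 #(a b c d))   ; signals an error -- a vector is not a list
```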

And, as I thought about it more, I came to the conclusion that it is
actually better to have both nth and elt in common lisp. This follows
my thinking that the biggest barriers to entry when learning a new
language come down to a few main things:

   1. New syntax
   2. New libraries and their various available functions/methods
   3. Various new concepts presented within the language (C: pointers,
ruby: blocks, lisp: macros, functional style, Java: mostly-OO)

Note: not meant to be a complete list of new concepts from various
languages, just a few examples.

There are many reasons why some languages do well and others don't, but
I think one of the main reasons a language does well is that it has
similarities to the current collection of popular languages.

New concepts can be difficult to learn, but look at Ruby, Perl and
Python: all are slower than lisp, yet all have concepts not commonly
used in C/C++/Java that are similar to lisp (blocks, functional style).
They tend to do better, IMHO, because they have strong syntax and
library similarities to C/C++/Java.

I think that the difference in the 3 main points up above constitutes
what I call a language's "standard deviation" from the current most
popular languages. The corollary is: a language that offers a low
standard deviation will have higher appeal than a similar language
with a higher standard deviation. (Note: I'm doing some
pseudo-statistics at work; well, the statistics are real, but I'm not
a real statistician, hence the "pseudo" part.)

For example, why did Matz write Ruby? It has a lot of concepts that
lisp employs, but it's significantly slower. Rather than inventing
ruby, why didn't Matz just use lisp, or smalltalk?

I think one of the big reasons (perhaps unconsciously) is that Matz
recognized the "standard deviation" of both lisp and smalltalk,
and set about designing a language that was closer to most programmers'
current expectations of a language.

I think that lisp would gain broader appeal as well by reducing its
"standard deviation". Since I like the current syntax of lisp, and I
suspect a lot of other people do, that only leaves standard
functions/libraries and concepts.

Also, since I happen to think that lisp's concepts (macros) are some of
the best ever invented, I'll nix that idea as well.

This leaves:

   1. standard functions / libraries / objects

I think lisp would do well to add extra library functions that are very
similar to the existing favorite languages. Not only would this not
break any existing code, but it would help ease the learning curve that
new programmers face when learning a new language, especially one with
a much different syntax, such as lisp.

I'll readily accept that whatever is currently popular may not be the
best way to do something, but by not giving programmers a sense of
familiarity, you force them to basically start from scratch. Once
people give lisp a chance, they'll come to understand the power that it
conveys, but most people already don't have enough time to spend
learning new concepts they can use in their existing language, much
less spend time learning new concepts, a radically different syntax,
and a whole new set of libraries/functions!

I think that we (the lisp community) should imitate some functions of
the more popular languages to increase membership. As an example, I was
thinking it would be fairly easy to add things like "while", "for",
"foreach", "var", etc.

They could even be interned into a new package (maybe 'new-lisp-user'),
so someone could just import that package, and be greeted with
functions that more closely mirrored their expectations.

Here's some sample code:

(defmacro var (bindings &body body)
  `(let ,bindings ,@body))

See, that's all I'm talking about. Just creating new macros that look
an awful lot like existing lisp constructs, but with behavior and names
that are more familiar to new programmers. Now, as a new programmer, I
can focus on learning the different syntax, those funky macro things,
etc, all while having the familiarity of my favorite language ;)

Note, I think my arguments are implicitly supported by SteveY's blog
post about why Lisp is an unacceptable Lisp. It's not that the points
brought up were technically sound, but rather that they represent the
kind of problem I'm talking about; things behaved differently than he
had come to expect from most "mainstream" languages.  I also think this
is what Ron was trying to explain, but that he took a different
approach than I did.

It's all about minimizing standard deviation.

[1] http://ruby-talk.org/cgi-bin/scat.rb/ruby/ruby-talk/179642
[2]
http://steve-yegge.blogspot.com/2006/04/lisp-is-not-acceptable-lisp.html

--
Ben Prew
Note: a copy of this can be found on my blog
(http://mosaitek.blogspot.com/)

From: Rainer Joswig
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <joswig-6CBE9A.07242428042006@news-europe.giganews.com>
In article <························@u72g2000cwu.googlegroups.com>,
 "Ben Prew" <········@gmail.com> wrote:

> Recently, there was an (apparently cyclic) thread on comp.lang.lisp the
> other day about how lisp sucks and why new developers aren't flocking
> to bask in its glory.

Why should that be the case? Why should Common Lisp suddenly
be the target for script kiddies, instead of developers?

> Also, since I happen to think that lisp's concepts (macros) are some of

Actually I don't think macros are that great. They are mostly
oversold and other more useful concepts are overlooked.

> I think that we (the lisp community) should imitate some functions of
> the more popular languages to increase membership. As an example, I was
> thinking it would be fairly easy to add things like "while", "for",
> "foreach", "var", etc.

Unfortunately nothing will happen from a Usenet post.
Everything begins with working source code.

-- 
http://lispm.dyndns.org/
From: Ben Prew
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146204262.383431.6620@j73g2000cwa.googlegroups.com>
Rainer Joswig wrote:
> In article <························@u72g2000cwu.googlegroups.com>,
>  "Ben Prew" <········@gmail.com> wrote:
>
> Why should that be the case? Why should Common Lisp suddenly
> be the target for script kiddies, instead of developers?

Are you saying that the entire concept of language "standard deviation"
is flawed, or just that it needs some refining?

I don't remember mentioning anything about "script-kiddies", and I
would argue that most of the mainstream languages available now are the
target for developers.  I would also argue that there are some
amazingly intelligent programmers using Ruby, Perl and Java.  Why not
make it easier for them to learn lisp as well?

I'm not looking to hamstring lisp's power, just looking for ways to
make it more "digestible" to incoming programmers who may have
experience in at least one, if not two or three, of the mainstream
languages.  Why not make it easier for them to be immediately
productive?

Lisp is an incredibly beautiful language, and I much prefer it to most
of the mainstream languages now (I use Perl professionally), so why not
extend an invitation to other programmers to experience that beauty?

>
> > Also, since I happen to think that lisp's concepts (macros) are some of
>
> Actually I don't think macros are that great. They are mostly
> oversold and other more useful concepts are overlooked.

What other concepts do you feel are getting overlooked?  I chose macros
as an example because, from my (limited) experience and from the lisp
people I've talked to, macros are one of the few things that really
can't be done in other languages.

I am certainly willing to accept that there are additional useful
concepts that lisp provides, but if you can't get more people to use
the language, then it doesn't really benefit them.

>
> > I think that we (the lisp community) should imitate some functions of
> > the more popular languages to increase membership. As an example, I was
> > thinking it would be fairly easy to add things like "while", "for",
> > "foreach", "var", etc.
>
> Unfortunately nothing will happen from a Usenet post.
> Everything begins with working source code.

I had sample code :P

Seriously though, at this point, I'm feeling out what the community
wants to do.  I can write all the code I want, but as Pascal said, if
the community doesn't approve of it, it doesn't get added.

And, I'll be honest, I don't see myself as having a lot of vision, so I
can work on what I see, but I also know that sometimes a good idea from
one person can spark great ideas in others.

> 
> -- 
> http://lispm.dyndns.org/

--
Ben
From: Rainer Joswig
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <joswig-B2273C.08223828042006@news-europe.giganews.com>
In article <······················@j73g2000cwa.googlegroups.com>,
 "Ben Prew" <········@gmail.com> wrote:

I may sound negative to you, but I have seen these discussions
come and go all the time.

> Rainer Joswig wrote:
> > In article <························@u72g2000cwu.googlegroups.com>,
> >  "Ben Prew" <········@gmail.com> wrote:
> >
> > Why should that be the case? Why should Common Lisp suddenly
> > be the target for script kiddies, instead of developers?
> 
> Are you saying that the entire concept of language "standard deviation"
> is flawed, or just that it needs some refining?

What standard? What other languages and libraries are popular?
Java and J2EE? C# and .net? Objective-C and Cocoa?
Where is the standard? I can't see one. 'Mainstream'
stuff is also Lingo, AutoLisp, ABAP, ...

> I don't remember mentioning anything about "script-kiddies", and I
> would argue that most of the mainstream languages available now are the
> target for developers.

Sure, Java, C, C# are mainstream programming languages and they
are for developers. But who uses Ruby beyond the hype? And
after the hype is gone? More than a few websites and some scripts?

>  I would also argue that there are some
> amazingly intelligent programmers using Ruby, Perl and Java.  Why not
> make it easier for them to learn lisp as well?

Why should they need to learn Lisp? They would learn Lisp if
it were (more) useful for them - not because a random
function looks similar.

> Lisp is an incredibly beautiful language,

Lisp is neither a language (it is a family of languages) nor beautiful.
If you think of Common Lisp, it is quite ugly in parts (which
I don't have a real problem with).

> What other concepts do you feel are getting overlooked?  I chose macros
> as an example because, from my (limited) experience and from the lisp
> people I've talked to, macros are one of the few things that really
> can't be done in other languages.

But you don't seem to have any first-hand knowledge about it,
and you are not looking at it from a software engineering standpoint.

All kinds of people with LIMITED experience are giving advice
about things they don't have a clue about. I could go to
a PERL newsgroup and give them the advice to make PERL
more like OpenMCL, since that would attract Lisp developers.
I don't have any clue about PERL and my experience with
PERL is not even 'limited', it is zero. I don't even have any
interest in PERL, since I don't use it. Nobody would care.
Why should I?

> I am certainly willing to accept that there are additional useful
> concepts that lisp provides, but if you can't get more people to use
> the language, then it doesn't really benefit them.

This is all too vague for me...

...

-- 
http://lispm.dyndns.org/
From: Stefan Scholl
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <0T33v2rhI5mtNv8%stesch@parsec.no-spoon.de>
Ben Prew <········@gmail.com> wrote:
> target for developers.  I would also argue that there are some
> amazingly intelligent programmers using Ruby, Perl and Java.  Why not
> make it easier for them to learn lisp as well?

Why do you think Lisp is hard to learn for them?

Lisp is for the mediocre programmer, too.
From: Ben Prew
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146245158.014630.298440@g10g2000cwb.googlegroups.com>
Stefan Scholl wrote:
> Why do you think Lisp is hard to learn for them?

I think that if lisp was one of the first languages a programmer
learned, then no, lisp would not be hard to learn.

However, I feel that there are a lot of similarities between the
mainstream languages, and so when someone comes across lisp as their
3rd or 4th language, it's difficult to overcome the relatively ingrained
ideas of what "OO" programming is all about, and what "functions" are
available in string libraries.

Given these ingrained ideas, I think that programmers new to lisp look
at it and say, "why didn't they do X like Y, every other language I
know does it like Y, lisp must be broken".

> Lisp is for the mediocre programmer, too.

I don't doubt that, as I have been able to learn it so far ;).  But,
then again, I learned Perl, so coming from that, I'm willing to accept
a lot of oddities in other languages as "quirky" :).
From: Ken Tilton
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1Gq4g.29$lu7.4@fe12.lga>
Ben Prew wrote:
> Rainer Joswig wrote:
> 
>>In article <························@u72g2000cwu.googlegroups.com>,
>> "Ben Prew" <········@gmail.com> wrote:
>>
>>Why should that be the case? Why should Common Lisp suddenly
>>be the target for script kiddies, instead of developers?
> 
> 
> Are you saying that the entire concept of language "standard deviation"
> is flawed, ...

I will (say the entire concept is flawed). With a qualification. I agree 
that Python and Java benefitted from being similar to C. The flaw I see 
is... so what happened to Dylan?

I think one needs to look closer. Look at Python in its first five (I 
made that up) years when no one knew about it. Same with Perl and Ruby. 
O'Reilly cannot publish enough of those books now, but what was going on 
before they became The Latest New Thing?

Then maybe we can figure out why Dylan did not happen. Did it skip a 
grassroots phase during which it quietly spread by word of mouth? Java 
had the luxury of a shortcut: funding from Sun.

This grassroots thing, if I am right, is what Ron and Jeff do not get. 
Ron does not want to work on a portable sockets library, nor does Jeff. 
They want to /first/ get the entire community to agree on a new program 
to Fix Lisp, then form a committee, hold elections, create by-laws, put 
up a web site... you know, CLRFI.

:)

Meanwhile Ken and Frank are taking what Peter did and trying to add 
OpenGL and Cells so we can have a drop-dead portable CL GUI to use. If 
it works, Peter and Thomas might like it. And so it goes...

ken
From: Ben Prew
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146247615.613145.188650@u72g2000cwu.googlegroups.com>
Ken Tilton wrote:

> Ben Prew wrote:
> >
> > Are you saying that the entire concept of language "standard deviation"
> > is flawed, ...
>
> I will (say the entire concept is flawed). With a qualification. I agree
> that Python and Java benefitted from being similar to C. The flaw I see
> is... so what happened to Dylan?

I would also argue that Perl benefited from having traits similar
enough to both C and sh/bash that both programmers and sysadmins were
able to pick up Perl with relatively little pain.

Having spent a little time looking at the syntax and style of Dylan, I
would agree that it lowers the "standard deviation" relative to lisp,
and you're right, I don't know what happened to Dylan.

Perhaps having similar syntax isn't enough.  Unfortunately, I don't
know much about Dylan, its community, its libraries, etc, so I can't
speak to its possible shortcomings.

>
> I think one needs to look closer. Look at Python in its first five (I
> made that up) years when no one knew about it. Same with Perl and Ruby.
> O'Reilly cannot publish enough of those books now, but what was going on
> before they became The Latest New Thing?

I had a brief discussion with some of my co-workers about it, and I
think a good indicator of relative popularity is the number of articles
published on the language, or the lines of code in the Python and Ruby
equivalents of CPAN.

I also looked up some relatively recent information about language
popularity [1], [2] and [3], but as expected, it's hard to quantify
popularity.

Most of the people I talked to agree that rails offered a big boost to
Ruby, but I can't really think of a "killer app" for Python or Perl.
Perl has its whole
I'm-a-scripting-language-I-can-automate-crappy-tasks-and-I-sorta-integrate-with-C-thing
going, and for a while it was synonymous with CGI, but I'm not really
sure it holds that distinction anymore, so it will be interesting to
see if it can maintain its popularity in the long run.

The Lisp family of languages has certainly been around for a long time,
so there's definitely a point to be made about not chasing every
current popular trend, and there are definitely some code-rot problems
with creating a 'new-lisp-user' package, as what's generally accepted
now (while, for, etc) may not be the norm 10-20 years from now.

> Then maybe we can figure out why Dylan did not happen. Did it skip a
> grassroots phase during which it quietly spread by word of mouth? Java
> had the luxury of a shortcut: funding from Sun.

Yeah, I really don't know much about Dylan, but now that you've
mentioned it, I'm certainly interested in looking at it, if only from
an anthropological standpoint.

And I totally agree about Java's shortcut: the funding and "support"
from Sun went a long way to help "legitimize" Java, and I think they
did an excellent job of positioning it as the next big "enterprise
class" language, right down to the large body of standards and
extensive bureaucracy :)

> This grassroots thing, if I am right, is what Ron and Jeff do not get.
> Ron does not want to work on a portable sockets library, nor does jeff.
> They want to /first/ get the entire community to agree on a new program
> to Fix Lisp, then form a committee, hold elections, create by-laws, put
> up a web site... you know, CLRFI.
>
> :)
>
> Meanwhile Ken and Frank are taking what Peter did and trying to add
> OpenGL and Cells so we can have a drop-dead portable CL GUI to use. If
> it works, Peter and Thomas might like it. And so it goes...

That sounds like it could be much more beneficial than my half-baked
ideas :).

Also, and this may be better as another topic, but is there a library
store, similar to CPAN, for lisp?  I thought that's what asdf did, but
when I looked into it more, it seemed to have more in common with make
than with CPAN.

I think something like that would be very useful, since CL has been
around for such a long time and, from what I'm hearing, there are
mountains of code out there that one just has to find.  Why not make it
easier to locate it?

Note: By CPAN, I mean a "generally accepted storage location/website
listing of lisp libraries, uploadable by anyone willing to create an
account"

[1]
http://radar.oreilly.com/archives/2005/12/ruby_book_sales_surpass_python.html
[2] http://csharpcomputing.com/Reviews/languagesByJobNumbers.htm
[3] http://www.dedasys.com/articles/language_popularity.html

--
Ben

> 
> ken
From: Ari Johnson
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m2odyl4l38.fsf@hermes.theari.com>
"Ben Prew" <········@gmail.com> writes:

> Ken Tilton wrote:
>
>> Ben Prew wrote:
>> >
>> > Are you saying that the entire concept of language "standard deviation"
>> > is flawed, ...
>>
>> I will (say the entire concept is flawed). With a qualification. I agree
>> that Python and Java benefitted from being similar to C. The flaw I see
>> is... so what happened to Dylan?
>
> I would also argue that Perl benefited from having traits similar
> enough to both C and sh/bash that both programmers and sysadmins were
> able to pick up Perl with relatively little pain.

One of the major strengths of Perl came up when I was discussing the
language with a friend who has a love-hate relationship with it,
sufficient to prompt him to develop his own language (whose compiler
is presently a Perl program, until enough of the standard library is
implemented to make it self-hosting). It is this:

C is a fabulous language for writing code that is very close to the
machine.  There is arguably no better language for writing code that
interoperates with the hardware so closely.  However, C becomes
painful the minute you back away from the hardware any number of
layers.

Perl, on the other hand, seems to minimize the pain of gluing C code
into a higher-level language.  It is more than passable for dealing
with things distant from the hardware, and anytime you want to get
closer to the hardware you can just glue in some C code.  The most
powerful part of this is that the Perl side of the glue is minimal.
The C side is painful, but less painful than actually writing the
higher-level parts of your program in C would be.

There are many claims that Perl is more popular because of the number
and variety of modules for it.  However, those modules didn't all
exist from the start - the language had to become popular enough to
make them worth writing.  I think that the ease of gluing C code into
Perl modules to make the lower-level parts of the system more
accessible to higher-level code played a major role in this.  The
syntax that makes it easy to write text-processing code is another
major factor.
From: Ken Tilton
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <VOt4g.13$fM.2@fe08.lga>
Ben Prew wrote:
> Ken Tilton wrote:
> 
> 
>>Ben Prew wrote:
>>
>>>Are you saying that the entire concept of language "standard deviation"
>>>is flawed, ...
>>
>>I will (say the entire concept is flawed). With a qualification. I agree
>>that Python and Java benefitted from being similar to C. The flaw I see
>>is... so what happened to Dylan?
> 
...

> Having spent a little time looking at the syntax and style of Dylan, I
> would agree that it lowers the "standard deviation" relative to lisp,
> and you're right, I don't know what happened to Dylan.
> 
> Perhaps having similar syntax isn't enough.
....

> The Lisp family of languages has certainly been around for a long time,
> so there's definitely a point to be made about not chasing every
> current popular trend,...

...nor chasing syntax as Dylan did and as you are suggesting.

What matters more, and what has kept Lisp alive and now has brought it 
back to the mainstream is the core correctness of the ideas of Lisp. 
Paul Graham did not hurt, either.

In fact, if you like Deep Historical Perspective, meditate on this: the 
scripting languages merely eliminated an obstacle to adoption when they 
aped C syntax. That is not what made them catch fire. What made them 
catch fire was adding to C: interactive development (which is often 
misunderstood to mean "interpreted"), no static typing of variables, 
garbage collection... see where I am going? Those languages do not know 
it, but they are chasing Lisp.

And if you read enough "the making of XXX" stories, you find their 
inventors at least /do/ know. Lisp informed much of their thinking.

> I think something like that would be very useful, as since CL has been
> around for such a long time, and from what I'm hearing, there are
> mountains of code out there that one just has to find.

Nah, it's all bit-rotted crap. Probably good solid core ideas deep in 
there somewhere, but requiring serious effort to distill out and 
repackage. In the end, rewriting from scratch is much faster and has 
many advantages.

ken

-- 
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
    Attorney for Mary Winkler, confessed killer of her
    minister husband, when asked if the couple had
    marital problems.
From: Thomas Atkins
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146266436.531260.130220@j73g2000cwa.googlegroups.com>
Ben Prew asked "Also, and this may be better as another topic, but is
there a library store, similar to CPAN, for lisp?" Over at
www.cliki.net there is a collection of libraries. There's even a tool
to download a library and its dependencies, available at
www.cliki.net/asdf-install
From: Paolo Amoroso
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <8764kqqhz8.fsf@plato.moon.paoloamoroso.it>
"Ben Prew" <········@gmail.com> writes:

> I think something like that would be very useful, as since CL has been
> around for such a long time, and from what I'm hearing, there are
> mountains of code out there that one just has to find.  Why not make it
> easier to locate it?

You may have a look at:

  Common Lisp Directory
  http://www.cl-user.net


Paolo
-- 
Why Lisp? http://wiki.alu.org/RtL%20Highlight%20Film
The Common Lisp Directory: http://www.cl-user.net
From: Holger Schauer
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <yxz3bfruol6.fsf@gmx.de>
On 4622 September 1993, Ben Prew wrote:
> I would also argue that Perl benefited from having traits similar
> enough to both C and sh/bash that both programmers and sysadmins were
> able to pick up Perl with relatively little pain.

Actually, I don't think that Perl's success is related to C.

> Having spent a little time looking at the syntax and style of Dylan, I
> would agree that it lowers the "standard deviation" relative to lisp,
> and you're right, I don't know what happened to Dylan.

Well, I wasn't a Dylan user at the time but I really liked the DDJ
articles about it. I think it mainly suffered from Apple's modest
treatment of it, or from Apple's lack of success in the nineties in
general. I've also heard that the available compilers weren't that
good (i.e. the old "Lisp is slow" strawman was similarly applied to
Dylan), but I can't comment on that.

> Most of the people I talked to agree that rails offered a big boost
> to Ruby, but I can't really think of a "killer app" for Python or
> Perl.  Perl has its whole I'm-a-scripting-language-I-can-automate-
> crappy-tasks-and-I-sorta-integrate-with-C-thing
> going, and for a while it was synonymous with CGI, but I'm not really
> sure it holds that distinction anymore, and so it will be interesting
> if it can maintain its popularity, in the long run.

I think Perl's success in the 90s surely had a lot to do with the
advent of the net and CGI. I don't think the integration-with-C thing
mattered that much; instead (from my POV) it was more that it combined
sh, awk and sed into a single language while at the same time
providing powerful facilities directly at the command-line interface.

> The Lisp family of languages has certainly been around for a long
> time,

Perhaps that's one of its biggest obstacles. It's not new, and hence
not shiny or hot (not in my opinion, but in the opinion of a lot of
outside observers). Also, it didn't apply well to the things that were
hot in the 90s, when a lot of languages popped up that were
specifically designed for dealing with them.

Perhaps all that needs to be done is to provide positive answers to
the following two questions: What is /now/ the (current or next) big
thing, and is somebody using CL to tackle it? And tackle it well? If
any of you can answer these two questions positively, publish some
successful articles about it; that should be enough to ./ c.l.l. (or
cliki or whatever).

Holger

-- 
---          http://www.coling.uni-freiburg.de/~schauer/            ---
"You can take your foot out of the faux-pas bowl now.
 Others want to step in it, too."
                  -- Anselm Lingnau in de.comp.os.unix.discussion
From: Raffael Cavallaro
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <2006042818342843658-raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2006-04-28 11:38:37 -0400, Ken Tilton <·········@gmail.com> said:

> Then maybe we can figure out why Dylan did not happen. Did it skip a 
> grassroots phase during which it quietly spread by word of mouth?

To a certain extent this did happen - Apple was very secretive about 
Dylan - they trotted it out at WWDC demos and some MacWorld Expo shows, 
but to actually *use* Apple Dylan you had to be a seed site, and there 
were precious few of those. Given what we now know about the rise of 
popularity of new programming languages, Apple would have been far 
better off (or at least Apple Cambridge would have been far better off) 
to release Dylan as a public alpha very early in the game so that 
legions of programmers tired of C++ could use it in their spare time 
and make it grassroots popular before it ever reached release quality. 
As it was, it was a little-known project that had yet to produce a
release-quality product, with a mere handful of users, so it was an
easy target for downsizing.
From: Cameron MacKinnon
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <4452360e$0$15789$14726298@news.sunsite.dk>
Rainer Joswig wrote:
> Unfortunately nothing will happen from a Usenet post.
> Everything begins with working source code.

I don't think so. The Lisp community has managed to lose or misplace the 
source code to more functionality than has ever been written in most 
other languages save C. Every few years a new scripting language comes 
along and takes the programming world by storm, becoming vastly more 
popular than Lisp ever was, and it's a tautology that, in the beginning, 
they each had less functionality than currently exists in the Lisp 
world. How many languages have ever been used to build complete 
operating systems including assemblers/compilers, slick IDEs, network 
stack, device drivers and windowing systems?

If writing great code were enough, Lisp would have won a long time ago.
From: Pascal Costanza
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <4beua7FvljdiU1@individual.net>
Cameron MacKinnon wrote:
> Rainer Joswig wrote:
>> Unfortunately nothing will happen from a Usenet post.
>> Everything begins with working source code.
> 
> I don't think so. The Lisp community has managed to lose or misplace the 
> source code to more functionality than has ever been written in most 
> other languages save C. Every few years a new scripting language comes 
> along and takes the programming world by storm, becoming vastly more 
> popular than Lisp ever was, and it's a tautology that, in the beginning, 
> they each had less functionality than currently exists in the Lisp 
> world. How many languages have ever been used to build complete 
> operating systems including assemblers/compilers, slick IDEs, network 
> stack, device drivers and windowing systems?
> 
> If writing great code were enough, Lisp would have won a long time ago.

I think the scripting languages "take the programming world by storm" 
because they all explicitly or implicitly promise that "from now on" 
everything will be simple. They actually succeed in convincing people 
that that's the case by showing examples in which they are indeed 
simple. However, I am not convinced that they scale wrt a broad 
range of domains. Whenever a language tries to be truly general-purpose, 
it will reach a certain complexity that will make other more specialized 
languages look much simpler (and much more preferable). For example, 
consider why some people prefer Ruby over Python.

It seems to me that Smalltalk is the only example of a language that 
didn't increase wrt complexity over the last few decades. However, I 
guess that the complexity goes into the tools / development environment. 
(You know, the complexity has to go somewhere, it cannot just disappear.)

Maybe all that the scripting languages prove is that domain-specific 
languages are indeed a good idea. The actual language framework for 
implementing them is typically C, which results in serious disadvantages 
wrt efficiency because the only sane way to implement a domain-specific 
language in C is by writing interpreters. (Notice that one popular way 
to extend those scripting languages with efficient libraries is by 
implementing those libraries in C, not in the scripting language itself.)

What actually puzzles me is that those scripting languages, which tend 
to be dynamic / dynamically typed, are almost always implemented in 
static / statically typed languages (i.e., C). If dynamic typing is a 
good idea, they should, IMHO, actually be implemented in dynamic 
languages themselves. However, it seems to me that those who implement 
scripting languages subconsciously don't believe in dynamic approaches 
themselves.

What's also wrong with implementing domain-specific languages in a very 
different language is that the barrier between the two languages is too 
deep. It seems to me much more preferable that:

- The implemented language is closer in spirit to the host language. 
(For example, they should both be dynamically typed.)
- The implemented language is much better integrated with the host 
language, such that you don't need to cross a certain barrier to provide 
extensions.
- As a result, the different domain-specific abstractions can be mixed 
and matched in the same framework (because you may need different 
approaches for different parts of your system).

I think that Lisp is especially good at this kind of integration.
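To make that concrete, here is a deliberately tiny sketch of a
domain-specific abstraction embedded directly in the host language.
The names DEFUNIT and IN-METERS are invented for illustration, not
taken from any library:

```lisp
;; A toy unit-of-measure DSL. Each DSL form expands into ordinary
;; Lisp arithmetic, so DSL and host code mix without any barrier.
(defvar *units* (make-hash-table))      ; unit symbol -> factor in meters

(defmacro defunit (name factor)
  `(setf (gethash ',name *units*) ,factor))

(defunit m  1)
(defunit km 1000)
(defunit cm 1/100)

(defmacro in-meters (quantity unit)
  `(* ,quantity (gethash ',unit *units*)))

;; Plain Lisp forms and DSL forms interleave freely:
;; (+ (in-meters 3 km) (in-meters 50 cm))  => 6001/2
```

Because the DSL compiles down to host-language expressions rather than
going through a foreign interpreter, there is no efficiency or
integration barrier to cross when extending it.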

I am skeptical whether "we" can do both at the same time, make Lisp more 
accessible to a broader spectrum of programmers, including "newbies" and 
"average" programmers, and keep the expressive power of Lisp that makes 
it such a flexible and at the same time efficient language. For example, 
take a look at all the different "new" Lisp dialects: My impression is 
that they also explicitly or implicitly try to simplify Lisp, so the 
fact that they provide less than Common Lisp is not just an accident. I 
rather think that the loss of expressiveness is intentional.

It's easy to suggest cosmetic and aesthetic changes to Common Lisp, 
including those that would "accidentally" break existing code. However, 
it's hard to suggest substantial improvements, especially improvements 
that are not already available for Common Lisp in one way or the other, 
for example as vendor-specific extensions.

There are enough languages out there that aim to simplify things and 
make it easier for occasional programmers to write programs, including 
in the Lisp world (cf. ISLISP or Scheme). There isn't a substantial need 
for another one, but there must be room for expert languages, IMHO.

Just a few random thoughts.


Pascal

-- 
3rd European Lisp Workshop
July 3-4 - Nantes, France - co-located with ECOOP 2006
http://lisp-ecoop06.bknr.net/
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <sGu4g.4534$TT.3412@twister.nyroc.rr.com>
Pascal Costanza wrote:
> Cameron MacKinnon wrote:
> 
>> Rainer Joswig wrote:
>>
>>> Unfortunately nothing will happen from a Usenet post.
>>> Everything begins with working source code.
>>
>>
>> I don't think so. The Lisp community has managed to lose or misplace 
>> the source code to more functionality than has ever been written in 
>> most other languages save C. Every few years a new scripting language 
>> comes along and takes the programming world by storm, becoming vastly 
>> more popular than Lisp ever was, and it's a tautology that, in the 
>> beginning, they each had less functionality than currently exists in 
>> the Lisp world. How many languages have ever been used to build 
>> complete operating systems including assemblers/compilers, slick IDEs, 
>> network stack, device drivers and windowing systems?
>>
>> If writing great code were enough, Lisp would have won a long time ago.
> 
> 
> I think the scripting languages "take the programming world by storm" 
> because they all explicitly or implicitly promise that "from now on" 
> everything will be simple. They actually succeed in convincing people 
> that that's the case by showing examples in which they are actually 
> indeed simple. However, I am not convinced that they scale wrt a broad 
> range of domains. Whenever a language tries to be truly general-purpose, 
> it will reach a certain complexity that will make other more specialized 
> languages look much simpler (and much more preferable). For example, 
> check out some people prefer Ruby over Python.
> 
> It seems to me that Smalltalk is the only example of a language that 
> didn't increase wrt complexity over the last few decades. However, I 
> guess that the complexity goes into the tools / development environment. 
> (You know, the complexity has to go somewhere, it cannot just disappear.)
> 
> Maybe all that the scripting languages prove is that domain-specific 
> languages are indeed a good idea. The actual language framework for 
> implementing them is typically C, which results in serious disadvantages 
> wrt efficiency because the only sane way to implement a domain-specific 
> language in C is by writing interpreters. (Notice that one popular way 
> to extend those scripting language with efficient libraries is by 
> implementing those libraries in C, not in the scripting language itself.)
> 
> What actually puzzles me is that those scripting languages, which tend 
> to be dynamic / dynamically typed, are almost always implemented in 
> static / statically typed languages (i.e., C). If dynamic typing is a 
> good idea, they should, IMHO, actually be implemented in dynamic 
> languages themselves. However, it seems to me that those who implement 
> scripting languages  subconsciously don't believe in dynamic approaches 
> themselves.

OK, so go work on SBCL.  AFAIK, it's the only Common Lisp system 
actually written in Common Lisp and able to self-compile.

> 
> What's also wrong with implementing domain-specific languages in a very 
> different language is that the barrier between the two languages is too 
> deep. It seems to me much more preferable that:
> 
> - The implemented language is closer in spirit to the host language. 
> (For example, they should both be dynamically typed.)
> - The implemented language is much better integrated with the host 
> language, such that you don't need to cross a certain barrier to provide 
> extensions.
> - As a result, the different domain-specific abstractions can be mixed 
> and matched in the same framework (because you may need different 
> approaches for different parts of your system).
> 
> I think that Lisp is especially good at this kind of integration.
> 
> I am skeptical whether "we" can do both at the same time, make Lisp more 
> accessible to a broader spectrum of programmers, including "newbies" and 
> "average" programmers, and keep the expressive power of Lisp that makes 
> it such a flexible and at the same time efficient language. For example, 
> take a look at all the different "new" Lisp dialects: My impression is 
> that they also explicitly or implicitly try to simplify Lisp, so the 
> fact that they provide less than Common Lisp is not just an accident. I 
> rather think that the loss of expressiveness is intentional.
> 
> It's easy to suggest cosmetic and aesthetic changes to Common Lisp, 
> including those that would "accidentally" break existing code. However, 
> it's hard to suggest substantial improvements, especially improvements 
> that are not already available for Common Lisp in one way or the other, 
> for example as vendor-specific extensions.

Plenty of things are available as vendor-specific extensions, and the 
vendor-specific part is why people don't use them.  People like coding 
programs that will compile in separate compilers: that's why they delay 
upgrading their GCC when there's a bug in the new version.

> 
> There are enough languages out there that aim to simplify things and 
> make it easier for occasional programmers to write programs, including 
> in the Lisp world (cf. ISLISP or Scheme). There isn't a substantial need 
> for another one, but there must be room for expert languages, IMHO.

I quite agree.

> 
> Just a few random thoughts.
> 
> 
> Pascal
> 

Let me make a suggestion: less yackity yack about choosing language 
constructs and libraries to standardize.  Instead, a single, standard 
FFI that will not only allow interfacing with foreign libraries, but 
also allow foreign libraries to understand Lisp data types.  Then if 
someone wants a new language construct or piece of functionality, they 
can write as a library what used to require an implementation extension.

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3fyjx4bxw.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> OK, so go work on SBCL.  AFAIK, it's the only Common Lisp system
> actually written in Common Lisp and able to self-compile.

And how do you know this?  Think Allegro, LispWorks, ...


> Plenty of things are available as vendor-specific extensions, and the
> vendor-specific part is why people don't use them.

Which reduces their behavior here to that of an idiot.


>  People like coding programs that will compile in separate
> compilers: that's why they delay upgrading their GCC when there's a
> bug in the new version.

This "analogy" would imply that they would tend not to _upgrade_ their
implementation, not that they wouldn't pick any at all.


> Let me make a suggestion: Less yackity yack about choosing language
> constructs and libraries to standardize.  A single, standard FFI that

Isn't this old ground?  Doesn't CFFI address this (even if it is not
yet perfect)???


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <jtv4g.12602$ZQ3.2300@twister.nyroc.rr.com>
jayessay wrote:
> Eli Gottlieb <···········@gmail.com> writes:
> 
> 
>>OK, so go work on SBCL.  AFAIK, it's the only Common Lisp system
>>actually written in Common Lisp and able to self-compile.
> 
> 
> And how do you know this?  Think Allegro, LispWorks, ...
> 
> 
> 
>>Plenty of things are available as vendor-specific extensions, and the
>>vendor-specific part is why people don't use them.
> 
> 
> Which reduces their behavior here to that of an idiot.
> 
> 
> 
>> People like coding programs that will compile in separate
>>compilers: that's why they delay upgrading their GCC when there's a
>>bug in the new version.
> 
> 
> This "analogy" would imply that they would tend not to _upgrade_ their
> implementation, not that they wouldn't pick any at all.
> 
> 
> 
>>Let me make a suggestion: Less yackity yack about choosing language
>>constructs and libraries to standardize.  A single, standard FFI that
> 
> 
> Isn't this old ground?  Doesn't CFFI address this (even if it is not
> yet perfect)???
> 
> 
> /Jon
> 
It does not address this problem, because CFFI sits on top of 
vendor-specific FFIs.  Joy.  Furthermore, can I write a macro in C that 
will recognize and manipulate Lisp data from any conforming 
implementation and export it to Lisp with CFFI?

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3bqul49e3.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> It does not address this problem, because CFFI sits on top of
> vendor-specific FFIs.

How would that affect a user of CFFI?  The idea is you use the vendor
neutral facade.  One possibility where this could fall apart is that
some significant vendor does not provide enough of an FFI to fully
support the requirements CFFI aims at.  That might be true; I don't
know, as I am fine with my vendor and the capabilities they provide
(having worked with them for several years now).  But even if true, it
would seem that that vendor would fix that situation so as not to be
excluded from possible use by people wanting CFFI.


> Furthermore, can I write a macro in C that will recognize and
> manipulate Lisp data from any conforming implementation and export
> it to Lisp with CFFI?

I'm not sure that question even makes sense, but if it does, maybe
someone who knows CFFI can answer.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Ken Tilton
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <tvy4g.939$rD.510@fe11.lga>
Eli Gottlieb wrote:
> jayessay wrote:
> 
>> Eli Gottlieb <···········@gmail.com> writes:
>>
>>
>>> OK, so go work on SBCL.  AFAIK, it's the only Common Lisp system
>>> actually written in Common Lisp and able to self-compile.
>>
>>
>>
>> And how do you know this?  Think Allegro, LispWorks, ...
>>
>>
>>
>>> Plenty of things are available as vendor-specific extensions, and the
>>> vendor-specific part is why people don't use them.
>>
>>
>>
>> Which reduces their behavior here to that of an idiot.
>>
>>
>>
>>> People like coding programs that will compile in separate
>>> compilers: that's why they delay upgrading their GCC when there's a
>>> bug in the new version.
>>
>>
>>
>> This "analogy" would imply that they would tend not to _upgrade_ their
>> implementation, not that they wouldn't pick any at all.
>>
>>
>>
>>> Let me make a suggestion: Less yackity yack about choosing language
>>> constructs and libraries to standardize.  A single, standard FFI that
>>
>>
>>
>> Isn't this old ground?  Doesn't CFFI address this (even if it is not
>> yet perfect)???
>>
>>
>> /Jon
>>
> It does not address this problem, because CFFI sits on top of 
> vendor-specific FFIs.  Joy. 

Have you checked the list of supported vendors?

    http://common-lisp.net/project/cffi/

Utter joy. :)

ken

-- 
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
    Attorney for Mary Winkler, confessed killer of her
    minister husband, when asked if the couple had
    marital problems.
From: Ken Tilton
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <oHy4g.940$rD.785@fe11.lga>
Ken Tilton wrote:
> 
> 
> Eli Gottlieb wrote:
> 
>> jayessay wrote:
>>
>>> Eli Gottlieb <···········@gmail.com> writes:
>>>
>>>
>>>> Let me make a suggestion: Less yackity yack about choosing language
>>>> constructs and libraries to standardize.  A single, standard FFI that
>>>
>>>
>>>
>>>
>>> Isn't this old ground?  Doesn't CFFI address this (even if it is not
>>> yet perfect)???
>>>
>>>
>>> /Jon
>>>
>> It does not address this problem, because CFFI sits on top of 
>> vendor-specific FFIs.  Joy. 
> 
> 
> Have you checked the list of supported vendors?
> 
>    http://common-lisp.net/project/cffi/
> 
> Utter joy. :)

I should have added that James's approach in CFFI ducks under 
implementation FFIs wherever possible to avoid the LCD problem of UFFI, 
and that Luis Oliveira did a heroic job of hitting every implementation 
that would cooperate during SOC 2005 to create an effectively complete win.

It reached such a threshold that implementors are indeed working with 
CFFI developers to make things even better. Subscribe to the list to 
follow the action, which frankly is getting over my application 
programmer's head.

You might want to be a little more careful about denigrating projects 
that good.

ken

-- 
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
    Attorney for Mary Winkler, confessed killer of her
    minister husband, when asked if the couple had
    marital problems.
From: Jack Unrue
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <gp2552hptp6ipqcc1af8nfce6ho94f0mek@4ax.com>
On Fri, 28 Apr 2006 21:06:23 GMT, Eli Gottlieb <···········@gmail.com> wrote:
>
> Furthermore, can I write a macro in C that will recognize
> and manipulate Lisp data from any conforming 
> implementation and export it to Lisp with CFFI?

Since macros in C are handled by the *pre-processor*
stage, the only data they can manipulate are
tokens in the parser's input stream, so I think this is
a malformed question.

-- 
Jack Unrue
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <dsw4g.4543$TT.280@twister.nyroc.rr.com>
Jack Unrue wrote:
> On Fri, 28 Apr 2006 21:06:23 GMT, Eli Gottlieb <···········@gmail.com> wrote:
> 
>>Furthermore, can I write a macro in C that will recognize
>>and manipulate Lisp data from any conforming 
>>implementation and export it to Lisp with CFFI?
> 
> 
> Since macros in C are enabled by the *pre-processor*
> stage, and thus the only data that can be manipulated
> are tokens in the parser input stream, I think this is
> a malformed question.
> 
Correction:
Furthermore, can I write a function in C that will recognize
and manipulate Lisp data from any conforming
implementation and, by using CFFI, export that function to Lisp for use 
as a macro function?
-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3wtd92sjj.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> Jack Unrue wrote:
> > On Fri, 28 Apr 2006 21:06:23 GMT, Eli Gottlieb <···········@gmail.com> wrote:
> >
> >>Furthermore, can I write a macro in C that will recognize
> >> and manipulate Lisp data from any conforming implementation and
> >> export it to Lisp with CFFI?
> > Since macros in C are enabled by the *pre-processor*
> > stage, and thus the only data that can be manipulated
> > are tokens in the parser input stream, I think this is
> > a malformed question.
> >
> Correction:
> Furthermore, can I write a function in C that will recognize
> and manipulate Lisp data from any conforming
> implementation

Probably not, as that would imply knowing the details of the
implementation's internal data representations, which seems at best
like a bad idea (low-level implementation coupling and restrictions on
how an implementation may represent low-level internal code) and at
worst just plain goofy.


> and, by using CFFI, export that function to Lisp for use as a macro
> function?

Are you saying that you want to write a macro in C and export it for
use in Lisp?  That's not just colossally dumb, it's outright crazy.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Raffael Cavallaro
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <200604281911488930-raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:

> Probably not as that would imply knowing the details of the
> implementations internal data representations, which seems at best
> like a bad idea (low level implementation coupling and restrictions on
> how an implementation may represent low level internal code) and at
> worst just plain goofy.

I think Eli may be asking for a standardized library for doing calls 
and callbacks from C into lisp (in addition to a standardized ffi for 
calling out to C libraries from lisp, but we're all already on the same 
page wrt that).
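CFFI does in fact already cover the callback direction. A sketch,
assuming the CFFI library is loaded (the :unsigned-int types below
stand in for C's size_t and may need adjusting per platform):

```lisp
;; A Lisp comparison function exposed to C, then handed to the
;; standard C library's qsort, which calls back into Lisp.
(cffi:defcallback compare-ints :int ((a :pointer) (b :pointer))
  (- (cffi:mem-ref a :int) (cffi:mem-ref b :int)))

(cffi:defcfun ("qsort" c-qsort) :void
  (base :pointer) (nmemb :unsigned-int) (size :unsigned-int) (fn :pointer))

(cffi:with-foreign-object (arr :int 4)
  ;; Fill a C array from Lisp, sort it via C code calling back into
  ;; Lisp, then read the result back.
  (loop for v in '(42 7 23 5)
        for i from 0
        do (setf (cffi:mem-aref arr :int i) v))
  (c-qsort arr 4 (cffi:foreign-type-size :int) (cffi:callback compare-ints))
  (loop for i below 4 collect (cffi:mem-aref arr :int i)))
;; => (5 7 23 42)
```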
From: Ken Tilton
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <dWy4g.941$rD.517@fe11.lga>
Raffael Cavallaro wrote:
> On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:
> 
>> Probably not as that would imply knowing the details of the
>> implementations internal data representations, which seems at best
>> like a bad idea (low level implementation coupling and restrictions on
>> how an implementation may represent low level internal code) and at
>> worst just plain goofy.
> 
> 
> I think Eli may be asking for a standardized library for doing calls and 
> callbacks from C into lisp ...

Done. As is passing data to C, which Eli agonized about elsewhere.

Ain't life grand? :)

ken

-- 
Cells: http://common-lisp.net/project/cells/

"Have you ever been in a relationship?"
    Attorney for Mary Winkler, confessed killer of her
    minister husband, when asked if the couple had
    marital problems.
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <pVz4g.4561$TT.1476@twister.nyroc.rr.com>
Raffael Cavallaro wrote:
> On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:
> 
>> Probably not as that would imply knowing the details of the
>> implementations internal data representations, which seems at best
>> like a bad idea (low level implementation coupling and restrictions on
>> how an implementation may represent low level internal code) and at
>> worst just plain goofy.
> 
> 
> I think Eli may be asking for a standardized library for doing calls and 
> callbacks from C into lisp (in addition to a standardized ffi for 
> calling out to C libraries from lisp, but we're all already on the same 
> page wrt that).
> 
That's exactly what I was asking about.  Think: How do special operators 
or primitive functions get added to the language?  They're coded in a 
lower level language.  If we could export C functions into Lisp, we 
could extend implementations by writing a library.

And since it's apparently been done, where is it?

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3slnx2gdg.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> Raffael Cavallaro wrote:
> > On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:
> >
> >> Probably not as that would imply knowing the details of the
> >> implementations internal data representations, which seems at best
> >> like a bad idea (low level implementation coupling and restrictions on
> >> how an implementation may represent low level internal code) and at
> >> worst just plain goofy.
> > I think Eli may be asking for a standardized library for doing calls
> > and callbacks from C into lisp (in addition to a standardized ffi
> > for calling out to C libraries from lisp, but we're all already on
> > the same page wrt that).
> >
> That's exactly what I was asking about.

You should try to be clearer - your version of this was pretty opaque.


>  Think: How do special operators or primitive functions get added to
> the language?  They're coded in a lower level language.

Not in the case of something like Lisp.  Plenty of "primitives" can be
added by means of the base language + macros.  You are thinking too
much in terms of static, non-malleable languages (C, Pascal, etc.).


> If we could export C functions into Lisp, we could extend
> implementations by writing a library.

What makes you think this would be workable (or even a "good thing")
in implementations that don't use C as their main implementation
language, in particular for those which use CL as the main language?
You seem pretty confused about implementation aspects.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <lVC4g.4570$TT.2570@twister.nyroc.rr.com>
jayessay wrote:
> Eli Gottlieb <···········@gmail.com> writes:
> 
> 
>>Raffael Cavallaro wrote:
>>
>>>On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:
>>>
>>>
>>>>Probably not as that would imply knowing the details of the
>>>>implementations internal data representations, which seems at best
>>>>like a bad idea (low level implementation coupling and restrictions on
>>>>how an implementation may represent low level internal code) and at
>>>>worst just plain goofy.
>>>
>>>I think Eli may be asking for a standardized library for doing calls
>>>and callbacks from C into lisp (in addition to a standardized ffi
>>>for calling out to C libraries from lisp, but we're all already on
>>>the same page wrt that).
>>>
>>
>>That's exactly what I was asking about.
> 
> 
> You should try to be clearer - your version of this was pretty opaque.
> 
> 
> 
>> Think: How do special operators or primitive functions get added to
>>the language?  They're coded in a lower level language.
> 
> 
> Not in the case of something like Lisp.  Plenty of "primitives" can be
> added by means of the base language + macros.  You are thinking too
> much in terms of static non malleable languages (C, Pascal, etc.).

No, I mean actual Common Lisp primitives.  You know, like number support 
not built from lists of symbols?  apply and eval? let?  That stuff must 
be coded into the implementation, which is usually (though as you folks 
so eagerly repeat, not always) implemented in a lower-level language for 
speed/not-having-to-bootstrap purposes.

> 
> 
> 
>>If we could export C functions into Lisp, we could extend
>>implementations by writing a library.
> 
> 
> What makes you think this would be workable (or even a "good thing")
> in implementations that don't use C as their main implementation
> language, in particular for those which use CL as the main language.
> You seem pretty confused about implementation aspects.
> 
> 
> /Jon
> 
It's quite simple really: C is the lowest you can go and remain above 
assembler code.  Thus, if you can interface to C you can interface to 
just about anything.

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: Pascal Bourguignon
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <87fyjw7vca.fsf@thalassa.informatimago.com>
Eli Gottlieb <···········@gmail.com> writes:

> No, I mean actual Common Lisp primitives.  You know, like number
> support not built from lists of symbols?  

Numbers are not built from lists of symbols. They are built from functions:

;; (Assumes a fresh package that shadows CL:NOT and CL:ZEROP, since
;; redefining standard functions is not allowed.)
(defparameter zero  (lambda (f) (lambda (x) x)))
(defparameter one   (lambda (f) (lambda (x) (funcall f x))))
(defparameter two   (lambda (f) (lambda (x) (funcall f (funcall f x)))))
(defparameter three (lambda (f) (lambda (x) (funcall f (funcall f (funcall f x))))))
(defun succ (n) (lambda (f) (lambda (x) (funcall f (funcall (funcall n f) x)))))
(defparameter four (succ three))
(defparameter five (succ four))
;; ...
(defparameter true  (lambda (x) (lambda (y) x)))
(defparameter false (lambda (x) (lambda (y) y)))
(defun not (b) (funcall (funcall b false) true))
(defun zerop (n) (funcall (funcall n (lambda (x) false)) true))

(defun add (m)
  (lambda (n) (lambda (f) (lambda (x) (funcall (funcall m f) (funcall (funcall n f) x))))))
(defparameter pred              ; predecessor, needed by SUB below
  (lambda (n)
    (lambda (f)
      (lambda (x)
        (funcall (funcall (funcall n (lambda (g) (lambda (h) (funcall h (funcall g f)))))
                          (lambda (u) x))
                 (lambda (u) u))))))
(defun sub (n) (lambda (m) (funcall (funcall m pred) n)))
(defun mult (n) (lambda (m) (funcall (funcall n (add m)) zero)))
;; etc...

If you mean using the microprocessor's ALU, remember that it's only an
optimization, and you can easily implement it by writing more Lisp code.
Have a look at any Common Lisp compiler (obviously written in Common
Lisp).


> apply and eval? let?  That stuff must be coded into the
> implementation, which is usually (though as you folks so eagerly
> repeat, not always) implemented in a lower-level language for
> speed/not-having-to-bootstrap purposes.

Not at all.  LET is just a macro:

(defmacro let (bindings &body body)  ; assumes a package shadowing CL:LET
  `((lambda ,(mapcar (lambda (item) (if (consp item) (first item) item))
                     bindings)
      ,@body)
    ,@(mapcar (lambda (item) (if (consp item) (second item) 'nil))
              bindings)))

APPLY and EVAL are just lisp functions like any other.

They're not even special operators!
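One can check this at any REPL; APPLY and EVAL have ordinary function
values and can be passed around like any other function:

```lisp
(funcall #'apply #'+ '(1 2 3))       ; => 6
(mapcar #'eval '((+ 1 2) (* 3 4)))   ; => (3 12)
(functionp #'apply)                  ; => T
;; By contrast, (function if) signals an error: IF is a special
;; operator and has no function value to pass around.
```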


>>>If we could export C functions into Lisp, we could extend
>>>implementations by writing a library.
>> What makes you think this would be workable (or even a "good thing")
>> in implementations that don't use C as their main implementation
>> language, in particular for those which use CL as the main language.
>> You seem pretty confused about implementation aspects.
>> /Jon
>> 
> It's quite simple really: C is the lowest you can go and remain above
> assembler code.  Thus, if you can interface to C you can interface to
> just about anything.

Of course, if you can do the harder thing, you can also do the
easier one.  But why would you want to empty the Dead Sea with only a
spoon?  Try a toothpick rather.  If you can interface to transistors,
you can even interface to more things than with C!  You could even go
to the atoms, and interface to the biological world even!!



-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

"What is this talk of "release"?  Klingons do not make software
"releases".  Our software "escapes" leaving a bloody trail of
designers and quality assurance people in its wake."
From: Rainer Joswig
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <C078FC68.3A2C4%joswig@lisp.de>
Am 29.04.2006 7:34 Uhr schrieb "Eli Gottlieb" unter <···········@gmail.com>
in ··················@twister.nyroc.rr.com:

> jayessay wrote:
>> Eli Gottlieb <···········@gmail.com> writes:
>> 
>> 
>>> Raffael Cavallaro wrote:
>>> 
>>>> On 2006-04-28 19:23:44 -0400, jayessay <······@foo.com> said:
>>>> 
>>>> 
>>>>> Probably not as that would imply knowing the details of the
>>>>> implementations internal data representations, which seems at best
>>>>> like a bad idea (low level implementation coupling and restrictions on
>>>>> how an implementation may represent low level internal code) and at
>>>>> worst just plain goofy.
>>>> 
>>>> I think Eli may be asking for a standardized library for doing calls
>>>> and callbacks from C into lisp (in addition to a standardized ffi
>>>> for calling out to C libraries from lisp, but we're all already on
>>>> the same page wrt that).
>>>> 
>>> 
>>> That's exactly what I was asking about.
>> 
>> 
>> You should try to be clearer - your version of this was pretty opaque.
>> 
>> 
>> 
>>> Think: How do special operators or primitive functions get added to
>>> the language?  They're coded in a lower level language.
>> 
>> 
>> Not in the case of something like Lisp.  Plenty of "primitives" can be
>> added by means of the base language + macros.  You are thinking too
>> much in terms of static non malleable languages (C, Pascal, etc.).
> 
> No, I mean actual Common Lisp primitives.  You know, like number support
> not built from lists of symbols?  apply and eval? let?  That stuff must
> be coded into the implementation, which is usually (though as you folks
> so eagerly repeat, not always) implemented in a lower-level language for
> speed/not-having-to-bootstrap purposes.

Why don't you just read a book about Lisp implementation? Plus there are
lots of systems to study.

Most Common Lisp systems are written in Common Lisp with a little C and some
assembler. The compiler in most Common Lisp systems (incl. code generation)
is written in Common Lisp. Try as I might, I can't name one
whose compiler is not written in Common Lisp.

If you want to extend the Common Lisp system with a new feature, then
you can take CLOS as an example. Common Lisp started without CLOS.
Common Lisp was described in the book Common Lisp the Language
(edition one, called CLtL1) and did not include an object system.
Later, CLOS got added, so that ANSI CL included CLOS as
an object-system.

When CLOS was being developed, there was a portable reference implementation
called PCL (Portable Common Loops). It extended the various Common Lisp
implementations with things like:

- new types
- classes
- new primitive data type: objects
- new type of functions: generic functions
- meta classes
- methods
- method combinations
- some integration of types and classes
- various changes to built-in functions

PCL was written in Common Lisp plus some system-dependent Lisp code and was
ported to, say, more than ten Common Lisp implementations.
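The kind of portable extension described above also works on a much smaller
scale. For instance, a new control operator can be added in pure, portable
Common Lisp with a single macro, with no changes to the implementation's
sources (UNTIL here is a hypothetical name, not a standard operator):

```lisp
;;; A new control operator defined entirely in portable Common Lisp --
;;; no C, no assembler, no changes to the implementation needed.
(defmacro until (test &body body)
  "Evaluate BODY repeatedly until TEST evaluates to true."
  `(do () (,test) ,@body))

;; Example:
;; (let ((i 0))
;;   (until (>= i 3) (incf i))
;;   i)  ; => 3
```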
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3k6982omo.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> jayessay wrote:
> > Eli Gottlieb <···········@gmail.com> writes:
>
> > Not in the case of something like Lisp.  Plenty of "primitives" can
> > be
> > added by means of the base language + macros.  You are thinking too
> > much in terms of static non malleable languages (C, Pascal, etc.).
> 
> No, I mean actual Common Lisp primitives.

Yeah, and so did I.  I suppose it depends on what you mean by
"primitives".  See below...


>  You know, like number support not built from lists of symbols?

This makes no sense - numbers are not built in this way.  And neither
would any sensible version of user-built ones.


>  apply and eval? 

Ordinary functions.


> let?

OK, but even new "let-like" operators could be built on top of the
typical core.  Even LET itself could be built on top of some core.
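As a sketch of that claim, a LET-like binding form is expressible purely in
terms of LAMBDA plus a macro (MY-LET is a hypothetical name; it ignores
declarations and the other details of the real LET):

```lisp
;;; A LET-like operator built from LAMBDA alone: bind variables by
;;; applying an anonymous function to the initial values.
(defmacro my-let (bindings &body body)
  `((lambda ,(mapcar #'first bindings) ,@body)
    ,@(mapcar #'second bindings)))

;; (my-let ((x 1) (y 2)) (+ x y))
;; expands to ((lambda (x y) (+ x y)) 1 2)  ; => 3
```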


> That stuff must be coded into the implementation, which is usually

Well, as you can see you are mostly dead wrong on these examples.


> (though as you folks so eagerly repeat, not always) implemented in a
> lower-level language for speed/not-having-to-bootstrap purposes.

And here you are mostly wrong as well.


> >>If we could export C functions into Lisp, we could extend
> >>implementations by writing a library.
> > What makes you think this would be workable (or even a "good thing")
> > in implementations that don't use C as their main implementation
> > language, in particular for those which use CL as the main language.
> > You seem pretty confused about implementation aspects.
> > /Jon
> >
> It's quite simple really: C is the lowest you can go and remain above
> assembler code

As stated, this is simply wrong.


> .  Thus, if you can interface to C you can interface to
> just about anything.

OK, on current "popular" systems this is an understandable claim.  OTOH
it's irrelevant to the discussion.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Pascal Bourguignon
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <87odyl6qto.fsf@thalassa.informatimago.com>
Eli Gottlieb <···········@gmail.com> writes:
> That's exactly what I was asking about.  Think: How do special
> operators or primitive functions get added to the language?  They're
> coded in a lower level language.  If we could export C functions into
> Lisp, we could extend implementations by writing a library.

Actually, they're not necessarily coded in a lower level language.

For example, when you implement a BASIC in Common Lisp, the primitive
functions of the BASIC are implemented in a higher level language:
Common Lisp.

In the case of Common Lisp, more often, the primitives are implemented
in Common Lisp itself.  It's only for performance reasons that some
low-level primitives are implemented in assembler.

For example, we could implement CONS, CAR and CDR as:

(defun cons (car cdr)  (lambda (x) (if x car cdr)))
(defun car (cons)      (funcall cons t))
(defun cdr (cons)      (funcall cons nil))

Only it wouldn't be so efficient, so instead we implement them as
assembly-code primitive functions.


Finally, special operators are implemented by the compiler, and Common
Lisp compilers are written in Common Lisp.  Even clisp compiler is
written in Common Lisp (even though clisp virtual machine and most
clisp primitives are written in C).  So it doesn't help much to be
able to call C functions thru a FFI to add a special operator.  Just
fetch the CL source of your CL compiler, and write some CL code!


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

What is this talk of 'release'? Klingons do not make software 'releases'.
Our software 'escapes' leaving a bloody trail of designers and quality
assurance people in its wake.
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <rWC4g.4571$TT.2301@twister.nyroc.rr.com>
Pascal Bourguignon wrote:
> Eli Gottlieb <···········@gmail.com> writes:
> 
>>That's exactly what I was asking about.  Think: How do special
>>operators or primitive functions get added to the language?  They're
>>coded in a lower level language.  If we could export C functions into
>>Lisp, we could extend implementations by writing a library.
> 
> 
> Actually, they're not necessarily coded in a lower level language.
> 
> For example, when you implement a BASIC in Common Lisp, the primitive
> functions of the BASIC are implemented in a higher level language:
> Common Lisp.
> 
> In the case of Common Lisp, more often, the primitives are implemented
> in Common Lisp itself.  It's only for performance reasons that some
> low-level primitives are implemented in assembler.
> 
> For example, we could implement CONS, CAR and CDR as:
> 
> (defun cons (car cdr)  (lambda (x) (if x car cdr)))
> (defun car (cons)      (funcall cons t))
> (defun cdr (cons)      (funcall cons nil))
> 
> Only it wouldn't be so efficient, so we rather implement them as some
> assembly code primitive functions.
> 
> 
> Finally, special operators are implemented by the compiler, and Common
> Lisp compilers are written in Common Lisp.  Even clisp compiler is
> written in Common Lisp (even though clisp virtual machine and most
> clisp primitives are written in C).  So it doesn't help much to be
> able to call C functions thru a FFI to add a special operator.  Just
> fetch the CL source of your CL compiler, and write some CL code!
> 
> 
I'm starting to get what was meant by calling the Common Lisp community 
"incestuous".

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m3fyjw2ojm.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> Pascal Bourguignon wrote:
> > Eli Gottlieb <···········@gmail.com> writes:
> >
> >>That's exactly what I was asking about.  Think: How do special
> >>operators or primitive functions get added to the language?  They're
> >>coded in a lower level language.  If we could export C functions into
> >>Lisp, we could extend implementations by writing a library.
> > Actually, they're not necessarily coded in a lower level language.
> > For example, when you implement a BASIC in Common Lisp, the primitive
> > functions of the BASIC are implemented in a higher level language:
> > Common Lisp.
> > In the case of Common Lisp, more often, the primitives are
> > implemented
> > in Common Lisp itself.  It's only for performance reasons that some
> > low-level primitives are implemented in assembler.
> > For example, we could implement CONS, CAR and CDR as:
> > (defun cons (car cdr)  (lambda (x) (if x car cdr)))
> > (defun car (cons)      (funcall cons t))
> > (defun cdr (cons)      (funcall cons nil))
> > Only it wouldn't be so efficient, so we rather implement them as some
> > assembly code primitive functions.
> > Finally, special operators are implemented by the compiler, and
> > Common
> > Lisp compilers are written in Common Lisp.  Even clisp compiler is
> > written in Common Lisp (even though clisp virtual machine and most
> > clisp primitives are written in C).  So it doesn't help much to be
> > able to call C functions thru a FFI to add a special operator.  Just
> > fetch the CL source of your CL compiler, and write some CL code!
> >
> I'm starting to get what was meant by calling the Common Lisp
> community "incestuous".

By that assessment, I suppose the "C community" is "incestuous" as
well...


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <SCP4g.4763$Gg.4760@twister.nyroc.rr.com>
jayessay wrote:
> Eli Gottlieb <···········@gmail.com> writes:
> 
> 
>>Pascal Bourguignon wrote:
>>
>>>Eli Gottlieb <···········@gmail.com> writes:
>>>
>>>
>>>>That's exactly what I was asking about.  Think: How do special
>>>>operators or primitive functions get added to the language?  They're
>>>>coded in a lower level language.  If we could export C functions into
>>>>Lisp, we could extend implementations by writing a library.
>>>
>>>Actually, they're not necessarily coded in a lower level language.
>>>For example, when you implement a BASIC in Common Lisp, the primitive
>>>functions of the BASIC are implemented in a higher level language:
>>>Common Lisp.
>>>In the case of Common Lisp, more often, the primitives are
>>>implemented
>>>in Common Lisp itself.  It's only for performance reasons that some
>>>low-level primitives are implemented in assembler.
>>>For example, we could implement CONS, CAR and CDR as:
>>>(defun cons (car cdr)  (lambda (x) (if x car cdr)))
>>>(defun car (cons)      (funcall cons t))
>>>(defun cdr (cons)      (funcall cons nil))
>>>Only it wouldn't be so efficient, so we rather implement them as some
>>>assembly code primitive functions.
>>>Finally, special operators are implemented by the compiler, and
>>>Common
>>>Lisp compilers are written in Common Lisp.  Even clisp compiler is
>>>written in Common Lisp (even though clisp virtual machine and most
>>>clisp primitives are written in C).  So it doesn't help much to be
>>>able to call C functions thru a FFI to add a special operator.  Just
>>>fetch the CL source of your CL compiler, and write some CL code!
>>>
>>
>>I'm starting to get what was meant by calling the Common Lisp
>>community "incestuous".
> 
> 
> > By that assessment, I suppose the "C community" is "incestuous" as
> well...
> 
> 
> /Jon
> 
If I may coin a phrase:

Lambda tar pit (noun, by mutation from "Turing tar pit") - The attempt 
to build an entire language/programming model from nothing but symbols 
and lambdas.  Such a language is theoretically Turing-complete but sheer 
hell to work in.

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m37j582i2a.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> jayessay wrote:
> > Eli Gottlieb <···········@gmail.com> writes:
> >
> >>Pascal Bourguignon wrote:
> >>
> >>>Eli Gottlieb <···········@gmail.com> writes:
> >>>
> >>>
> >>>>That's exactly what I was asking about.  Think: How do special
> >>>>operators or primitive functions get added to the language?  They're
> >>>>coded in a lower level language.  If we could export C functions into
> >>>>Lisp, we could extend implementations by writing a library.
> >>>
> >>>Actually, they're not necessarily coded in a lower level language.
> >>>For example, when you implement a BASIC in Common Lisp, the primitive
> >>>functions of the BASIC are implemented in a higher level language:
> >>>Common Lisp.
> >>>In the case of Common Lisp, more often, the primitives are
> >>>implemented
> >>>in Common Lisp itself.  It's only for performance reasons that some
> >>>low-level primitives are implemented in assembler.
> >>>For example, we could implement CONS, CAR and CDR as:
> >>>(defun cons (car cdr)  (lambda (x) (if x car cdr)))
> >>>(defun car (cons)      (funcall cons t))
> >>>(defun cdr (cons)      (funcall cons nil))
> >>>Only it wouldn't be so efficient, so we rather implement them as some
> >>>assembly code primitive functions.
> >>>Finally, special operators are implemented by the compiler, and
> >>>Common
> >>>Lisp compilers are written in Common Lisp.  Even clisp compiler is
> >>>written in Common Lisp (even though clisp virtual machine and most
> >>>clisp primitives are written in C).  So it doesn't help much to be
> >>>able to call C functions thru a FFI to add a special operator.  Just
> >>>fetch the CL source of your CL compiler, and write some CL code!
> >>>
> >>
> >>I'm starting to get what was meant by calling the Common Lisp
> >>community "incestuous".
> > By that assessment, I suppose the "C community" is "incestuous" as
> > well...
> > /Jon
> >
> If I may coin a phrase:
> 
> Lambda tar pit (noun, by mutation from "Turing tar pit") - The attempt
> to build an entire language/programming model from nothing but symbols
> and lambdas.  Such a language is theoretically Turing-complete but
> sheer hell to work in.

Fair enough.  But again, by this assessment, I suppose the "C
community" is in a similar tar pit...

/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <4ma5g.15399$ZQ3.14044@twister.nyroc.rr.com>
jayessay wrote:
> Eli Gottlieb <···········@gmail.com> writes:
> 
> 
>>jayessay wrote:
>>
>>>Eli Gottlieb <···········@gmail.com> writes:
>>>
>>>
>>>>Pascal Bourguignon wrote:
>>>>
>>>>
>>>>>Eli Gottlieb <···········@gmail.com> writes:
>>>>>
>>>>>
>>>>>
>>>>>>That's exactly what I was asking about.  Think: How do special
>>>>>>operators or primitive functions get added to the language?  They're
>>>>>>coded in a lower level language.  If we could export C functions into
>>>>>>Lisp, we could extend implementations by writing a library.
>>>>>
>>>>>Actually, they're not necessarily coded in a lower level language.
>>>>>For example, when you implement a BASIC in Common Lisp, the primitive
>>>>>functions of the BASIC are implemented in a higher level language:
>>>>>Common Lisp.
>>>>>In the case of Common Lisp, more often, the primitives are
>>>>>implemented
>>>>>in Common Lisp itself.  It's only for performance reasons that some
>>>>>low-level primitives are implemented in assembler.
>>>>>For example, we could implement CONS, CAR and CDR as:
>>>>>(defun cons (car cdr)  (lambda (x) (if x car cdr)))
>>>>>(defun car (cons)      (funcall cons t))
>>>>>(defun cdr (cons)      (funcall cons nil))
>>>>>Only it wouldn't be so efficient, so we rather implement them as some
>>>>>assembly code primitive functions.
>>>>>Finally, special operators are implemented by the compiler, and
>>>>>Common
>>>>>Lisp compilers are written in Common Lisp.  Even clisp compiler is
>>>>>written in Common Lisp (even though clisp virtual machine and most
>>>>>clisp primitives are written in C).  So it doesn't help much to be
>>>>>able to call C functions thru a FFI to add a special operator.  Just
>>>>>fetch the CL source of your CL compiler, and write some CL code!
>>>>>
>>>>
>>>>I'm starting to get what was meant by calling the Common Lisp
>>>>community "incestuous".
>>>
>>>By that assessment, I suppose the "C community" is "incestuous" as
>>>well...
>>>/Jon
>>>
>>
>>If I may coin a phrase:
>>
>>Lambda tar pit (noun, by mutation from "Turing tar pit") - The attempt
>>to build an entire language/programming model from nothing but symbols
>>and lambdas.  Such a language is theoretically Turing-complete but
>>sheer hell to work in.
> 
> 
> Fair enough.  But again, by this assessment, the "C community" I
> suppose is in a similar tar pit...
> 
> /Jon
> 
The C Tar Pit is having a language which starts at the first floor of 
the abstraction pyramid (the bits of the machine) and makes the 
programmer build to the top.  The Lambda Tar Pit is starting at the tip 
and trying to build down to the bottom from there.

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: jayessay
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <m31wve3642.fsf@rigel.goldenthreadtech.com>
Eli Gottlieb <···········@gmail.com> writes:

> The C Tar Pit is having a language which starts at the first floor of
> the abstraction pyramid (the bits of the machine) and makes the
> programmer build to the top.  The Lambda Tar Pit is starting at the
> tip and trying to build down to the bottom from there.

That's (another) common error you are making here, and it is probably
the real "C tar pit": the belief that C is a high-level assembler that
actually has anything in common with current machines and their
architectures.

Also, the lambda technique is actually _much_ lower level and basic -
though not particularly appropriate for current machines either...

/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: ·······@gmail.com
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146345982.924179.53190@u72g2000cwu.googlegroups.com>
Eli Gottlieb wrote:
> That's exactly what I was asking about.  Think: How do special operators
> or primitive functions get added to the language?  They're coded in a
> lower level language.  If we could export C functions into Lisp, we
> could extend implementations by writing a library.
In the implementation of languages you're used to, maybe. Those
languages usually have a very high interpretative overhead, or lack
reflection and introspective abilities. Very much like the original
LISP, which only had a couple of primitives and implemented the rest (and
themselves, in metacircular definitions) on top of those, Common Lisp
has a surprisingly small core. Most of the standard special forms
defined in the CLHS don't have to be primitive. In fact, just about any
of them can be, and often is, defined in terms of the others. See
http://home.pipeline.com/~hbaker1/MetaCircular.html for example. When
metaprogramming is available (be it through macros, a rewriting
preprocessor à la camlp4, etc.), it is rather silly to insist on using
an "escape hatch" to another language. Metaprogramming is the ultimate
escape hatch. The only reason one could want to use an external
language would then be performance. However, remember that the
interpretative overhead of most CL (or Lisp in general) implementations
is rather low, or even nonexistent, so escaping to another language
will very rarely make sense, and, when it does, it will nearly always
make more sense to have it be a function than a special form.
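As one small illustration of that point, a COND-like form can be defined in
terms of IF with a macro (MY-COND is a hypothetical name; unlike the real
COND, this sketch returns NIL for a clause with no forms instead of the
test's value):

```lisp
;;; A COND-like conditional defined in terms of IF -- an example of a
;;; "special form" that need not be primitive.
(defmacro my-cond (&rest clauses)
  (if (null clauses)
      nil
      (destructuring-bind ((test . forms) . rest) clauses
        `(if ,test
             (progn ,@forms)
             (my-cond ,@rest)))))

;; (my-cond ((> 1 2) 'first) (t 'second))  ; => SECOND
```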

You've only managed to make a fool of yourself lately by bringing your
cultural expectations with you. Asserting things because of one's
ignorance of the alternatives is a mistake one should not make
repeatedly. Read a bit before assuming things are the same in every
language and implementation.

Paul Khuong
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <woa5g.15400$ZQ3.6447@twister.nyroc.rr.com>
·······@gmail.com wrote:
> Eli Gottlieb wrote:
> 
>>That's exactly what I was asking about.  Think: How do special operators
>>or primitive functions get added to the language?  They're coded in a
>>lower level language.  If we could export C functions into Lisp, we
>>could extend implementations by writing a library.
> 
> In the implementation of languages you're used to, maybe. Those
> languages usually have a very high interpretative overhead, or lack
> reflection and introspective abilities. Very much like the original
> LISP that only had a couple primitive and implemented the rest (and
> themselves, in metacircular definitions) on top of those, Common Lisp
> has a surprisingly small core. Most of the standard special forms
> defined in the CLHS don't have to be primitive. In fact, just about any
> can be, and often is, defined in terms of the others. See
> http://home.pipeline.com/~hbaker1/MetaCircular.html for example. When
> metaprogramming is available (be it through macros, a rewriting
> preprocessor à la camlp4, etc), it is rather silly to insist on using
> an "escape hatch" to another language. Metaprogramming is the ultimate
> escape hatch. The only reason one could want to use an external
> language would then be performance. However, remember that the
> interpretative overhead of most CL (or Lisp in general) implementation
> is rather low, or even inexistent, so that escaping to another language
> will very rarely make sense, and, when it will, it will nearly always
> make more sense to have it be a function than a special form.
> 
> You've only managed to make a fool of yourself lately by bringing your
> cultural expectations with yourself. Asserting things because of one's
> ignorance of the alternatives is a mistake one should not make
> repeatedly. Read a bit before assuming things are the same in every
> language and implementation.
> 
> Paul Khuong
> 
So I suppose I can add first-class lexical environment support to any 
Common Lisp implementation WITHOUT reimplementing any part of the 
language OR adding to the implementation's source code?

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: Pascal Costanza
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <4bkrfuF11psbkU2@individual.net>
Eli Gottlieb wrote:

> So I suppose I can add first-class lexical environment support to any 
> Common Lisp implementation WITHOUT reimplementing any part of the 
> language OR adding to the implementation's source code?

To a certain (probably sufficient?) extent you can, as I have explained 
previously.


Pascal

-- 
3rd European Lisp Workshop
July 3-4 - Nantes, France - co-located with ECOOP 2006
http://lisp-ecoop06.bknr.net/
From: James Bielman
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <87r73f5thf.fsf@jamesjb.com>
Eli Gottlieb <···········@gmail.com> writes:

> Furthermore, can I write a macro in C that will recognize and
> manipulate Lisp data from any conforming implementation and export
> it to Lisp with CFFI?

Obviously it doesn't make a lot of sense to be talking about C macros
calling Lisp code, but doing it from functions isn't so hard---I
whipped this up in just a few minutes (error checking and releasing
the interned Lisp objects from the hash tables are, of course, left as
an exercise for the reader):

;;; sillydemo.lisp: A silly demo of portably calling Lisp from C.
(asdf:oos 'asdf:load-op 'cffi)

(defpackage #:sillydemo
  (:use #:cl #:cffi))
(in-package #:sillydemo)

;;; Hash table of Lisp objects handed to C code, by ID.
(defvar *lisp-objects* (make-hash-table))

;;; Reverse hash table for looking up object IDs.
(defvar *reverse-lisp-objects* (make-hash-table))

;;; Next ID number to hand out when returning a Lisp object.
(defvar *next-lisp-object-id* 0)

;;; Intern a Lisp object in *LISP-OBJECTS*, allocating a unique ID for
;;; it to pass to the C code.  If an existing object is EQL to OBJECT,
;;; it will return that ID instead of creating another.
(defun intern-lisp-object (object)
  (multiple-value-bind (id winp)
      (gethash object *reverse-lisp-objects*)
    (unless winp
      (setf id (incf *next-lisp-object-id*))
      (setf (gethash id *lisp-objects*) object)
      (setf (gethash object *reverse-lisp-objects*) id))
    id))

;;; Convert an array of object IDs from C into a list of Lisp objects.
(defun convert-object-array (argc argv)
  (loop for i below argc
        for id = (mem-aref argv :uint32 i)
        for object = (gethash id *lisp-objects*)
        collect object))

;;; Convert a Lisp expression, as a string, to a Lisp object.
(defcallback lisp-read :uint32 ((expr :string))
  (let* ((*read-eval* nil)
         (value (ignore-errors (read-from-string expr))))
    (intern-lisp-object value)))

;;; Call the function NAME, passing ARGC arguments, taken from ARGV.
;;; 
;;; It would be even cooler if we could define varargs callbacks for
;;; this, but I don't think any Lisp can do that AFAIK...
(defcallback lisp-apply :uint32 ((name :string) (argc :int) (argv :pointer))
  (let* ((*read-eval* nil)
         (symbol (ignore-errors (read-from-string name))))
    (unless (and symbol (symbolp symbol) (fboundp symbol))
      (format t "~&;; Warning: Undefined function ~A." name)
      ;; Returning a pointer would clash with the :uint32 return type,
      ;; so return from the callback with ID 0 (never allocated).
      (return-from lisp-apply 0))
    (let* ((args (convert-object-array argc argv))
           (result (apply symbol args)))
      (intern-lisp-object result))))

;;; These global variables must be function pointers in the C shared
;;; library.  They will be set to the callback functions.
(defcvar "lisp_read" :pointer)
(defcvar "lisp_apply" :pointer)

;;; Load the shared library and set up the pointers.
(load-foreign-library "./sillydemo.so")
(setf *lisp-read* (callback lisp-read))
(setf *lisp-apply* (callback lisp-apply))

;;; Call the "run" function in the shared library.
(foreign-funcall "run" :void)

----

/* sillydemo.c: compile this into sillydemo.so:
 * in linux: gcc -fPIC -DPIC -shared -o sillydemo.so sillydemo.c */
#include <stdint.h>
#include <stdlib.h>

typedef uint32_t lispobj;

lispobj (*lisp_apply) (char *name, int argc, lispobj *argv);
lispobj (*lisp_read) (char *expr);

int
run ()
{
   lispobj args[3], fortytwo, result;

   fortytwo = lisp_read ("42");
   args[0] = args[1] = fortytwo;
   result = lisp_apply ("expt", 2, args);
   args[0] = lisp_read ("t");
   args[1] = lisp_read ("\"result: ~A~%\"");
   args[2] = result;
   lisp_apply ("format", 3, args);
   
   return 0;
}

----
$ sbcl --noinform --load sillydemo.lisp
[ snip junk ]
result: 150130937545296572356771972164254457814047970568738777235893533016064

$ clisp -q -x '(load "sillydemo.lisp")'
[ snip junk ]
0 errors, 0 warnings
result: 150130937545296572356771972164254457814047970568738777235893533016064

James
From: Benjamin Teuber
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <e2vdnu$5pb$1@kohl.informatik.uni-bremen.de>
Pascal Costanza wrote:

> I think the scripting languages [...]

Isn't the term "scripting language" an oversimplification people like 
the java-propaganda-minister like to use for them?

To me, the fact that there aren't so many compilers around for a 
language isn't really a property of that language itself.

> Maybe all that the scripting languages prove is that domain-specific 
> languages are indeed a good idea. The actual language framework for 
> implementing them is typically C, which results in serious disadvantages 
> wrt efficiency because the only sane way to implement a domain-specific 
> language in C is by writing interpreters. (Notice that one popular way 
> to extend those scripting languages with efficient libraries is by 
> implementing those libraries in C, not in the scripting language itself.)
> 
> What actually puzzles me is that those scripting languages, which tend 
> to be dynamic / dynamically typed, are almost always implemented in 
> static / statically typed languages (i.e., C). If dynamic typing is a 
> good idea, they should, IMHO, actually be implemented in dynamic 
> languages themselves. However, it seems to me that those who implement 
> scripting languages  subconsciously don't believe in dynamic approaches 
> themselves.

Yes, most popular languages lack the meta-level and the possibility to 
go low-level if you like.

> 
> What's also wrong with implementing domain-specific languages in a very 
> different language is that the barrier between the two languages is too 
> deep. It seems to me much more preferable that:
> 
> - The implemented language is closer in spirit to the host language. 
> (For example, they should both be dynamically typed.)
> - The implemented language is much better integrated with the host 
> language, such that you don't need to cross a certain barrier to provide 
> extensions.
> - As a result, the different domain-specific abstractions can be mixed 
> and matched in the same framework (because you may need different 
> approaches for different parts of your system).
> 
> I think that Lisp is especially good at this kind of integration.

This is because no one else realized that syntax sucks...
> 
> I am skeptical whether "we" can do both at the same time, make Lisp more 
> accessible to a broader spectrum of programmers, including "newbies" and 
> "average" programmers, and keep the expressive power of Lisp that makes 
> it such a flexible and at the same time efficient language. For example, 
> take a look at all the different "new" Lisp dialects: My impression is 
> that they also explicitly or implicitly try to simplify Lisp, so the 
> fact that they provide less than Common Lisp is not just an accident. I 
> rather think that the loss of expressiveness is intentional.
> 
> It's easy to suggest cosmetic and aesthetic changes to Common Lisp, 
> including those that would "accidentally" break existing code. However, 
> it's hard to suggest substantial improvements, especially improvements 
> that are not already available for Common Lisp in one way or the other, 
> for example as vendor-specific extensions.
> 
> There are enough languages out there that aim to simplify things and 
> make it easier for occasional programmers to write programs, including 
> in the Lisp world (cf. ISLISP or Scheme). There isn't a substantial need 
> for another one, but there must be room for expert languages, IMHO.


I think the solution would be a powerful meta-language like CL (maybe a 
bit less ugly, but anyway...) on top of which we implement a clean and 
easy-to-learn object-functional language as a library (Ruby seems a good 
direction).
People could just learn and use that one without caring about what's 
hidden, just being happy with how easily one can write web-apps with it, 
until they've grown up and need more power...
They will discover the meta-level to add simple features they need, and 
realize it is good. From then on, they will work more on the meta-level 
to extend and bend the language like Neo does to the Matrix (dunno why 
this came to my mind...) - and they will become happy lispers.

Anyways, see you tomorrow =)

Ben
From: Pascal Costanza
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <4c5vr9F14h2s2U2@individual.net>
Benjamin Teuber wrote:
Pascal Costanza wrote:
> 
>> I think the scripting languages [...]
> 
> Isn't the term "scripting language" an oversimplification people like 
> the java-propaganda-minister like to use for them?
> 
> To me, the fact that there aren't so many compilers around for a 
> language isn't really a property of that language itself.

You need a specification that tells you what parts of a language 
implementation are intentional and in what respects an implementation is 
allowed to deviate from such a specification. That's like a contract 
that tells users what they can expect from different implementations and 
that binds implementors. A language that is defined by its only 
implementation doesn't give you such a specification.


Pascal

-- 
3rd European Lisp Workshop
July 3-4 - Nantes, France - co-located with ECOOP 2006
http://lisp-ecoop06.bknr.net/
From: Benjamin Teuber
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <e2tt50$n2s$1@kohl.informatik.uni-bremen.de>
> Actually I don't think macros are that great. They are mostly
> oversold and other more useful concepts are overlooked.

I don't agree. To me, macros (as a clean shortcut for eval&quote) are 
_that_ great! They allow you to extend the compiler to produce any 
language you like and thus touch the very basis of computer science.

So I think they are the most useful thing after recursive data 
structures and functions (because they use both).

What "more useful concepts" do you think have been overlooked?

Ben
From: Bill Atkins
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <87vest2ybc.fsf@rpi.edu>
Benjamin Teuber <······@web.de> writes:

>> Actually I don't think macros are that great. They are mostly
>> oversold and other more useful concepts are overlooked.
>
> I don't agree. To me, macros (as a clean shortcut for eval&quote) are
> _that_ great! They allow you to extend the compiler to produce any
> language you like and thus touch the very basis of computer science.
>
> So I think they are the most useful thing after recursive data
> structures and functions (because they use both).
>
> What "more useful concepts" do you think have been overlooked?
>
> Ben

CLOS, code-as-data (without which your macros could not exist),
interactive development, multiple return values, native compilation
almost as a rule, the MOP, the sequence functions, a symbol data type,
exact mathematics by default, built-in and distinct array and list
types (languages that provide these often conflate the two - ruby,
perl, python), pathnames, the ability to READ in and PRINT out data,
and so on.
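One concrete instance of the last point, READ/PRINT round-tripping, in
a couple of lines of plain standard CL:

```lisp
;; Lisp data prints in a form that READ can re-ingest unchanged.
(let* ((data '(1 "two" (three 4)))
       (text (with-output-to-string (s) (print data s)))
       (back (with-input-from-string (s text) (read s))))
  (equal data back))   ; => T
```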

Macros are nice, but they're just part of the big picture.  And macros
are not always appropriate; there are sometimes more flexible ways of
doing things.

-- 
This is a song that took me ten years to live and two years to write.

- Bob Dylan
From: ······@corporate-world.lisp.de
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146260534.621180.271450@j33g2000cwa.googlegroups.com>
Bill Atkins schrieb:

> Benjamin Teuber <······@web.de> writes:
>
> >> Actually I don't think macros are that great. They are mostly
> >> oversold and other more useful concepts are overlooked.
> >
> > I don't agree. To me, macros (as a clean shortcut for eval&quote) are
> > _that_ great! They allow you to extend the compiler to produce any
> > language you like and thus touch the very basis of computer science.
> >
> > So I think they are the most useful thing after recursive data
> > structures and functions (because they use both).
> >
> > What "more useful concepts" do you think have been overlooked?
> >
> > Ben
>
> CLOS, code-as-data (without which your macros could not exist),
> interactive development, multiple return values, native compilation
> almost as a rule, the MOP, the sequence functions, a symbol data type,
> exact mathematics by default, built-in and distinct array and list
> types (languages that provide these often conflate the two - ruby,
> perl, python), pathnames, the ability to READ in and PRINT out data,
> and so on.

'Late binding' is one of the more important concepts.
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <v2w4g.4705$Gg.4666@twister.nyroc.rr.com>
······@corporate-world.lisp.de wrote:
> Bill Atkins schrieb:
> 
> 
>>Benjamin Teuber <······@web.de> writes:
>>
>>
>>>>Actually I don't think macros are that great. They are mostly
>>>>oversold and other more useful concepts are overlooked.
>>>
>>>I don't agree. To me, macros (as a clean shortcut for eval&quote) are
>>>_that_ great! They allow you to extend the compiler to produce any
>>>language you like and thus touch the very basis of computer science.
>>>
>>>So I think they are the most useful thing after recursive data
>>>structures and functions (because they use both).
>>>
>>>What "more useful concepts" do you think have been overlooked?
>>>
>>>Ben
>>
>>CLOS, code-as-data (without which your macros could not exist),
>>interactive development, multiple return values, native compilation
>>almost as a rule, the MOP, the sequence functions, a symbol data type,
>>exact mathematics by default, built-in and distinct array and list
>>types (languages that provide these often conflate the two - ruby,
>>perl, python), pathnames, the ability to READ in and PRINT out data,
>>and so on.
> 
> 
> 'Late binding' is one of the more important concepts.
> 
I dunno.  I'd like to be able to specify types for my function arguments 
and have them checked.  Then I could get an error like "Passed a 
non-list as PARAM argument of FUNC.", rather than "Tried to take the CDR 
of a non-list." somewhere in the xth macro expansion of FUNC's already 
sizable source code.
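For what it's worth, standard CL can already get close to this with
CHECK-TYPE; FUNC and PARAM below are just the placeholders from the
post, and the whole thing is only a sketch:

```lisp
(defun func (param)
  ;; CHECK-TYPE signals a correctable TYPE-ERROR that names the PARAM
  ;; variable, instead of a bare "not a list" failure deep in the body.
  (check-type param list)
  (cdr param))
```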

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: Eli Gottlieb
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <v3w4g.4707$Gg.4077@twister.nyroc.rr.com>
······@corporate-world.lisp.de wrote:
> Bill Atkins schrieb:
> 
> 
>>Benjamin Teuber <······@web.de> writes:
>>
>>
>>>>Actually I don't think macros are that great. They are mostly
>>>>oversold and other more useful concepts are overlooked.
>>>
>>>I don't agree. To me, macros (as a clean shortcut for eval&quote) are
>>>_that_ great! They allow you to extend the compiler to produce any
>>>language you like and thus touch the very basis of computer science.
>>>
>>>So I think they are the most useful thing after recursive data
>>>structures and functions (because they use both).
>>>
>>>What "more useful concepts" do you think have been overlooked?
>>>
>>>Ben
>>
>>CLOS, code-as-data (without which your macros could not exist),
>>interactive development, multiple return values, native compilation
>>almost as a rule, the MOP, the sequence functions, a symbol data type,
>>exact mathematics by default, built-in and distinct array and list
>>types (languages that provide these often conflate the two - ruby,
>>perl, python), pathnames, the ability to READ in and PRINT out data,
>>and so on.
> 
> 
> 'Late binding' is one of the more important concepts.
> 
I dunno.  I'd like to be able to specify types for my function arguments 
and have them checked.  Then I could get an error like "Passed a 
non-list as PARAM argument of FUNC.", rather than "Tried to take the CDR 
of a non-list." somewhere in the xth macro expansion of FUNC's already 
sizable source code.

Binding and type-checking should be performed at the programmer's 
convenience, not where the language designers thought they should be.

-- 
The science of economics is the cleverest proof of free will yet 
constructed.
From: Tayssir John Gabbour
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146303222.412408.7870@j73g2000cwa.googlegroups.com>
Bill Atkins wrote:
> Benjamin Teuber <······@web.de> writes:
> >
> > What "more useful concepts" do you think have been overlooked?
>
> CLOS, code-as-data (without which your macros could not exist),
> interactive development, multiple return values, native compilation
> almost as a rule, the MOP, the sequence functions, a symbol data type,
> exact mathematics by default, built-in and distinct array and list
> types (languages that provide these often conflate the two - ruby,
> perl, python), pathnames, the ability to READ in and PRINT out data,
> and so on.

Special variables, the condition system... didn't someone make a list of
this stuff?

(And the more I procrastinate making e-z lists, the later I will be for
the 4-hour train ride to ECLM...)

Tayssir
From: Pascal Bourguignon
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <87ejzi89ra.fsf@thalassa.informatimago.com>
"Ben Prew" <········@gmail.com> writes:
> [...]
> Here's some sample code:
>
> (defmacro var (&rest all)
> `(let ,@all))

You would probably rather write:

(var a = 2
     b = (+ a 2)
     c = (* a b)
  (print (list a b c)))

(defmacro var (&body body)
  ;; Peel off NAME = VALUE triples into nested LETs, recursively;
  ;; whatever is left over becomes the body forms.
  (if (string= '= (second body))
      `(let ((,(first body) ,(third body)))
         (var ,@(cdddr body)))
      `(progn ,@body)))


> It's all about minimizing standard deviation.

I don't think so.   
Otherwise, why don't you just learn C over and over?
The standard deviation between C and C is 0.


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

PUBLIC NOTICE AS REQUIRED BY LAW: Any use of this product, in any
manner whatsoever, will increase the amount of disorder in the
universe. Although no liability is implied herein, the consumer is
warned that this process will ultimately lead to the heat death of
the universe.
From: marc spitzer
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <slrne54dmr.srh.ms4720@sdf.lonestar.org>
On 2006-04-28, Pascal Bourguignon <···@informatimago.com> wrote:
>
> I don't think so.   
> Otherwise, why don't you just learn C over and over?
> The standard deviation between C and C is 0.

That's just not so. K&R vs the first ANSI standard vs current ANSI vs MSVC vs GCC ...

and this is for the defined part of the language. 

marc


-- 
······@sdf.lonestar.org
SDF Public Access UNIX System - http://sdf.lonestar.org
From: Kaz Kylheku
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146257744.360382.186270@e56g2000cwe.googlegroups.com>
Ben Prew wrote:
> One of them, in an example used by Rob Garret, discussed the merits of
> deprecating nth in favor of elt, since elt is a superset of nth.

ELT and NTH do the same thing over a common subset of input parameters
(with the arguments suitably reversed, of course). Their input spaces
have an intersection over which neither function signals an error and
both compute the same thing.

NTH blows up if it is given a vector instead of a list. But, on the
other hand, ELT blows up if it is given an index out of range. For
instance (NTH 1 NIL) produces NIL, whereas (ELT NIL 1) is an error.
You have to pause to consider that before you replace NTH with ELT.
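A quick illustration of that asymmetry, using nothing beyond standard CL:

```lisp
;; NTH runs quietly off the end of a list; ELT handles vectors too,
;; but signals when the index is out of range.
(nth 1 nil)        ; => NIL
(elt #(a b c) 1)   ; => B
(handler-case (elt nil 1)
  (error () :out-of-range))  ; => :OUT-OF-RANGE
```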

NTH and ELT are different functions, which have a different purpose.

CL has a sequences library, consisting of functions whose interface
accepts vectors and lists. ELT is a member of this library, and can be
used as a building block to locally extend it.   If I see ELT being
used in Lisp code, it tells me more than just that the element of a
sequence is being accessed. It means that the code is written under the
sequences library paradigm. It tells me that lists are being used as
sequences, rather than as structures.

But ELT is usually not the best building block for writing sequence
functions. You want to test the type of the sequence(s) early in the
processing and dispatch to different code. For instance, if the
sequence is being processed sequentially and it is a list, a cons
iterator that marches by CDR operations is better than ELT references,
which scan the list from the beginning each time (in the absence of
sophisticated loop optimizations from the compiler). In the vector
branch of the code, you would use AREF rather than ELT. Why use a
function that also handles lists in code which is specifically designed
for vectors, and already typechecked?
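As a sketch of that early type dispatch (MY-COUNT-IF is a made-up name
for illustration, not part of CL):

```lisp
(defun my-count-if (pred sequence)
  "Count elements satisfying PRED, dispatching once on the sequence type."
  (etypecase sequence
    (list                    ; march by CDR; no repeated scans from the head
     (loop for x in sequence count (funcall pred x)))
    (vector                  ; once we know it's a vector, AREF is the accessor
     (loop for i below (length sequence)
           count (funcall pred (aref sequence i))))))
```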

So ELT exists primarily for completeness (what would a sequences
library be, if it had no function for accessing the N-th element?) and
as a building block for writing functions that accept both lists and
vectors, functions that will probably be easy to write, but sub-optimal
in run-time efficiency.

Another observation is that NTH is typically used with small integer
constants. You do not write code which computes the index passed to
NTH, particularly if that index can be large. You use NTH when you run
out of the available functions named after the English ordinals. You
have FIRST, SECOND, THIRD, ... TENTH. There is a function to get the
eleventh element, and it is spelled (NTH 10 ...).
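So, for example:

```lisp
(let ((xs '(a b c d e f g h i j k)))
  (list (first xs)     ; => A
        (tenth xs)     ; => J
        (nth 10 xs)))  ; => K, the "eleventh" element
```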
From: Norman Werner
Subject: Re: Decreasing the "standard deviation" of lisp
Date: 
Message-ID: <1146426455.877192.278450@g10g2000cwb.googlegroups.com>
Standard deviation?! I think you are talking about some distance
measure instead. There is no statistics here unless you imply that new
languages, with old or new features, are designed on a (quasi-)random
basis.

Norman