Xah <···@xahlee.org> writes:
> why Haskell or Dylan hasn't replaced CL yet?
On the remote chance that you really are looking for information
instead of (or in addition to) just trying to stir up a fight with
Erik Naggum...
CL and Dylan are in slightly different parts of the design space. CL
offers great runtime flexibility, but at the cost of carrying around
substantial parts of its program development environment at runtime --
rather like a tortoise carrying its house around. "Delivery modes"
that produce compact CL applications arrived late in the game and are
only a partial solution.
Dylan started out close to CL in its goals, but with a greater
emphasis on runtime efficiency and separation of the programming
environment from the runtime support. For this, the designers gave up
some runtime flexibility and some facilities for introspection. They
also took the opportunity to clean up the design and jettison about
20 years of accumulated barnacles.
(The Dylan syntax was also modified in an attempt to appeal more to
"mainstream" programmers raised on infix languages such as C. Given
Dylan's goals, that was probably a good idea, but in my personal view
the attempt was botched, making it very hard to produce a macro system
with the power and elegance that we see in various Lisp dialects.)
Unfortunately, Dylan was strangled in the crib by its parents. During
Apple's near-death experience, they first strung out Dylan development
and then killed the whole lab that was supporting it. Harlequin's
effort got farther, but suffered a similar fate. My group at CMU
reluctantly moved to Java, and were later blown away by DARPA
insanity.
A few refugees still carry the Dylan torch, and I wish them well. But
a great deal of ground has now been lost to Java, Perl, and C++, and
some projects have reverted to CL. So we may never know how well
Dylan might have succeeded on its own merits.
I have less first-hand experience with Haskell, but its strong and
deep commitment to a mostly-pure functional style puts it in a very
different part of the design space from these other languages.
Haskell may have its own niche, but it is not a direct competitor to
CL or Dylan. Neither, by the way, are the various flavors of ML,
which place type-safety above all other virtues.
Probably the real answer to your question is that Haskell and Dylan
have not displaced CL because Java beat them to it. I believe that CL
will survive (until some worthy successor comes along) because it is
still by far the best language for building really complex systems in
an incremental, evolutionary style. Some large "AI style" projects
can't live without it. But in many gray areas where Java is
sufficient (not better -- just sufficient), Java has won.
-- Scott
===========================================================================
Scott E. Fahlman Internet: ···@cs.cmu.edu
Principal Research Scientist Phone: 412 268-2575
Department of Computer Science Fax: 412 268-5576
Carnegie Mellon University Latitude: 40:26:46 N
5000 Forbes Avenue Longitude: 79:56:55 W
Pittsburgh, PA 15213 Mood: :-)
===========================================================================
"Scott E. Fahlman" wrote:
> I believe that CL will survive (until some worthy successor comes along)
> because it is still by far the best language for building really complex
> systems in an incremental, evolutionary style.
Yes, that's one of CL's niches.
But it's not the only one.
I earn my living writing rather simple programs. Almost all my programs are
less than 15,000 lines (and half of that is typically recycled library code).
Most of my projects take somewhere between 2 weeks and 2 months of my time.
So: nothing 'really complex' or 'incremental, evolutionary' here. But for
me, switching from C/C++ to Common Lisp is the best thing I've ever done in
my career. Now that I use CL, I can suggest simple 1-month projects to my
clients that would have taken half a year (and would have been way too
expensive) before. Programming is fun again, like it used to be 15 years
ago.
The flexibility and clarity that CL offers is not only important for
'large "AI style" projects'. It may be even more useful for small projects
where deadlines and time-to-market constraints force you to either write
simple, flexible and understandable programs or spend most of your time
chasing bugs.
Arthur Lemmens
P.S. Before I switched to Common Lisp (about two years ago), I took a serious
look at Haskell, Dylan and Java as well.
I didn't choose Haskell, because:
- there was no industrial-strength compiler with a good Windows GUI
library
- I couldn't convince myself that lazy functional programming didn't
result in memory leaks at unpredictable times
I didn't choose Dylan, because:
- it was too young and uncertain
- the promise of more runtime efficiency wasn't as important for me
as the lack of a powerful and elegant macro system
- I've never been able to swallow Dylan's switch from prefix to infix
I didn't choose Java, because:
- well, why should I? The only reason for preferring Java to Common
Lisp is that everybody else is using it.
If I may ask you: Which CL implementation are you using? Are you using only
one CL implementation?
Janos Blazi
-----= Posted via Newsfeeds.Com, Uncensored Usenet News =-----
http://www.newsfeeds.com - The #1 Newsgroup Service in the World!
-----== Over 80,000 Newsgroups - 16 Different Servers! =-----
Janos Blazi wrote:
> If I may ask you: Which CL implementation are you using?
Lispworks for Windows 4.1.
> Are you using only one CL implementation?
For real work: yes.
Arthur
An additional question:
How would you judge the chances or merits of Python? Will it survive?
janos Blazi
In article <··········@goliath.newsfeeds.com>, Janos Blazi wrote:
>An additional question:
>How would you judge the chances or merits of Python? Will it survive?
>
>janos Blazi
See:
<http://www.norvig.com/python-lisp.html>
Python for Lisp Programmers
Summary: "Python seems to be well-suited for many of the tasks that Lisp
is well-suited for, except those that require high performance."
-- Kimmo
((lambda (integer)
(coerce (loop for i upfrom 0 by 8 below (integer-length integer)
collect (code-char (ldb (byte 8 i) integer))) 'string))
100291759904362517251920937783274743691485481194069255743433035)
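Kimmo's signature above hides its payload in a bignum: the Lisp form peels off successive 8-bit bytes with (ldb (byte 8 i) integer) and turns each into a character. A rough Python sketch of the same trick (the function name is mine):

```python
def int_to_string(n: int) -> str:
    """Decode an integer into text, least-significant byte first,
    mirroring the (ldb (byte 8 i) integer) / code-char loop above."""
    chars = []
    while n:
        chars.append(chr(n & 0xFF))  # low 8 bits -> one character
        n >>= 8                      # move on to the next byte
    return "".join(chars)

# For example, 0x6948 packs "Hi": low byte 0x48 is 'H', next byte 0x69 is 'i'.
print(int_to_string(0x6948))  # Hi
```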
Kimmo T Takkunen <········@cc.helsinki.fi> wrote in message
····························@sirppi.helsinki.fi...
>
> See:
> <http://www.norvig.com/python-lisp.html>
> Python for Lisp Programmers
>
> Summary:"Python seems to be well-suited for many of the tasks that Lisp
> is well-suited for, except those that require high performance."
>
Personally I'd be much more nervous about Python's lack of true GC. The last
time I snooped the python newsgroup there was some concern over this by
people using Python for larger projects and for programs that have to be
left running for long periods. Python cooperates with C very nicely for
optimisation purposes. The *practical* speed difference isn't
nearly the 100 to 1 that Norvig implies, in my experience. I still strongly
prefer Lisp, however. To my surprise, one of the main reasons is
readability - with a little practice, Lisp is much more readable than Python,
something I'd never have expected. And it's definitely more "designable" -
i.e. capable of being used to fit whatever reasonable design I come up with.
Jonathan Coupe
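The reference-counting worry above is concrete: in a reference cycle the counts never reach zero, so plain reference counting can never reclaim the objects; Python eventually grew a separate cycle detector (the gc module) for exactly this case. A minimal sketch (class and attribute names are mine):

```python
import gc

class Node:
    pass

def make_cycle():
    a, b = Node(), Node()
    a.partner, b.partner = b, a  # mutual references keep both refcounts >= 1
    # when this function returns, a and b are unreachable, but reference
    # counting alone can never reclaim them -- only the cycle detector can

gc.disable()            # leave only plain reference counting running
make_cycle()
found = gc.collect()    # invoke the cycle detector explicitly
gc.enable()
print(found > 0)        # True: the detector found the orphaned cycle
```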
"Janos Blazi" <······@netsurf.de> writes:
> An additional question:
> How would you judge the chances or merits of Python? Will it survive?
Well, I haven't used Python myself. By reputation, it's kind of an
ugly and awkward design, but has a lot of built-in functionality that
makes it easy to do certain kinds of tasks with little effort. Those
languages always seem to survive in their specialized niches until
something better comes along to evict them. Usually the "something
better" is not more general, but is yet another niche-dweller.
My own guess is that Python will go the way of Snobol, Forth, Prolog,
and so on -- a slow fade over time, with some fanatical adherents
still hanging on. But that's just a guess from someone far outside
the Python culture.
-- Scott
···@cs.cmu.edu (Scott E. Fahlman) wrote:
> Well, I haven't used Python myself. By reputation, it's kind of an
> ugly and awkward design [...] My own guess is that Python will go the
> way of Snobol, Forth, Prolog, and so on -- a slow fade over time.
HhhuuH ???
Now, after giving some insight and inside info on Dylan/CL, Dr. Fahlman
proceeds to stamp out other languages by wanton speculation on the demise of
Python?
Thank you Scott for the info on Dylan and CL. It's very much appreciated by
me, and no doubt by many others here.
As to the Python speech, thanks to Xah for pointing out a crime and setting
newbie bystanders straight. We do suffer from conceit at times.
Python (to which I'm indifferent) probably has more users than CL, is used
in more applications than CL, and is picking up users faster than probably
all Lisps, with two or more O'Reilly publications and much mention in the
Linux crowd; it is decidedly here to live longer than CL.
For those linguists out there, there's a new language in the Perl/Python
class called Ruby from Japan. Here's its website:
http://www.ruby-lang.org/en/
It's similar in looks to Perl but is supposedly purer OOP and better
designed than even Python.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
"Scott E. Fahlman" <···@cs.cmu.edu> wrote
>
> Well, I haven't used Python myself. By reputation, it's kind of an
> ugly and awkward design, but has a lot of built-in functionality that
> makes it easy to do certain kinds of tasks with little effort.
Perhaps you're confusing Python and Perl?
···@cs.cmu.edu (Scott E. Fahlman) writes:
> "Janos Blazi" <······@netsurf.de> writes:
>
> > An additional question:
> > How would you judge the chances or merits of Python? Will it survive?
>
> Well, I haven't used Python myself.
Oh, come on!
> By reputation, it's kind of an
> ugly and awkward design, but has a lot of built-in functionality that
> makes it easy to do certain kinds of tasks with little effort.
Where did you read that? It's almost completely ass-backwards!
> Those language always seem to survive in their specialized niches
> until something better comes along to evict them. usually the
> "something better" is not more general, but is yet another
> niche-dweller.
Python is not (particularly) specialised. It's not as general as CL,
mind.
> My own guess is that Python will go the way of Snobol, Forth, Prolog,
> and so on -- a slow fade over time, with some fanatical adherents
> still hanging on.
You may be right. It's not fading yet, though. Maybe you were
thinking of some other language.
> But that's just a guess from someone far outside the Python culture.
What was the point of posting that piece of misinformed rubbish? I
don't intend to start a language war (I like Python, I hope it does
well so I can one day get a job programming it rather than
C++/Perl/Java if lisp jobs are lacking), but I know c.l.l gets
aggravated when people post misinformation about lisp, and we should
know better.
Regards,
Michael
--
very few people approach me in real life and insist on proving they are
drooling idiots. -- Erik Naggum, comp.lang.lisp
Michael Hudson <·····@cam.ac.uk> writes:
> What was the point of posting that piece of malinformed rubbish? I
> don't intend to start a language war (I like Python, I hope it does
> well so I can one day get a job programming it rather than
> C++/Perl/Java if lisp jobs are lacking), but I know c.l.l gets
> aggravated when people post misinformation about lisp, and we should
> know better.
Yeah, you're right. I was just trying to respond to a question, but I
shouldn't have ventured an opinion about Python without doing a lot
more research.
The uses of Python that I had looked at were in specialized niches
such as XML programming. They made use of some nice modules and
libraries that the user community had written. So Python was
classified in my head as basically a scripting language with a few
powerful facilities that give it an advantage in certain niches. Kind
of like Perl, but a bit more elegant and with better support for an OO
style.
But upon further investigation, I see that Python really is more
general than I gave it credit for. It began as a scripting language,
but seems to have developed into a nice little "Swiss Army Knife".
When viewed as a scripting language, Python is a lot cleaner and a lot
more general than most. In particular, it seems very useful for
adding an interpreter-like wrapper to applications written in other
languages. It's probably unfair to refer to that as a "niche".
If viewed as a programming language for serious, large-scale,
long-lived software projects that have to produce reasonably efficient
code, it looks to me like Python has some serious drawbacks and
omissions. Some serious Python users complain about this; others
claim the language is great for large-scale projects. I have nothing
useful to add to that discussion, and this newsgroup is probably not
the right place for it. However, if someone has extensive experience
with both CL and Python for similar large-scale projects, I think a
comparison -- or just a collection of war stories -- would be welcome
here.
Personally, I like to use CL for most tasks where others would reach
for Python or Perl or some scripting language. Scripts have a nasty
way of evolving into real programs that have to be maintained,
debugged, and sometimes tuned and compiled. CL may be overkill for
smallish tasks, but it's nice not to have a discontinuity between
small and large tasks. Your mileage may vary.
-- Scott
This preamble is for Erik Naggum! DO NOT, I REPEAT, DO NOT READ THE
FOLLOWING LINK!
Everybody else, read at your own risk. :)
http://www.strout.net/python/pythonvslisp.html
Cheers
--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - 06 68 10 03 17, fax. +39 - 06 68 80 79 26
http://www.parades.rm.cnr.it/~marcoxa
Marco Antoniotti <·······@parades.rm.cnr.it> writes:
> Everybody else, read at your own risk. :)
oh no, please, we've been through that one on the mcl mailing list
lately. That comparison is so bad and so boring, couldn't you
please have left it alone?
--
(espen)
Espen Vestre <·····@*do-not-spam-me*.vestre.net> writes:
> Marco Antoniotti <·······@parades.rm.cnr.it> writes:
>
> > Everybody else, read at your own risk. :)
>
> oh no, please, we've been through that one on the mcl mailing list
> lately. That comparision is so bad and so boring, couldn't you
> please have left it alone?
I apologize. :{ I just couldn't resist.
Cheers
--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - 06 68 10 03 17, fax. +39 - 06 68 80 79 26
http://www.parades.rm.cnr.it/~marcoxa
I have read that article. I do not quite understand the criteria chosen. It
seems he does not consider speed, and he does not consider stability problems
that may arise from the reference counting either.
Clearly Python is a much less sophisticated language than Lisp, and this
has many advantages. So Python must win if the strong sides of Lisp are not
appreciated.
Janos Blazi
"Janos Blazi" <······@netsurf.de> writes:
> How would you judge the chances or merits of Python? Will it
> survive?
There was a post in comp.lang.dylan a couple of months back where the
poster mentioned that in the December 1999 issue of Linux Journal,
Guido van Rossum (the creator of Python) in an interview mentioned
Dylan as "...everything Python is plus so much more...".
I hesitate to mention it as I haven't read the article in question,
don't know the context of the statement and don't want to fuel any
flames but I'm curious if anyone read and can comment on the article.
Chris.
From: David J. Cooper
Subject: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <38B18131.73EE23E1@genworks.com>
"Scott E. Fahlman" wrote:
>
> (The Dylan syntax was also modified in an attempt to appeal more to
> "mainstream" programmers raised on infix languages such as C.
>
Lately I have been wondering about this. Why is C considered an
"infix" language? Only a small subset of its operators are used
in an infix syntax, mostly simple arithmetic operators like "+"
and "-". Normal function calls are done using prefix notation
just like CL, are they not? The parentheses are just in a
slightly different position.
So languages like C are not really "infix" at all -- they are
a confused mixture of infix, prefix, and other really strange
stuff like var++ (what's that??).
The bottom line is that CL has a consistent syntax while
languages like C or Java do not. End of story.
-dave
--
David J. Cooper Jr, Chief Engineer Genworks International
·······@genworks.com 5777 West Maple, Suite 130
(248) 932-2512 (Genworks HQ/voicemail) West Bloomfield, MI 48322-2268
(248) 407-0633 (pager) http://www.genworks.com
From: Robert Monfera
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <38B180C1.3243B786@fisec.com>
"David J. Cooper" wrote:
> [...] really strange
> stuff like var++ (what's that??)
Postfix.
Robert Monfera <·······@fisec.com> writes:
> "David J. Cooper" wrote:
>
> > [...] really strange
> > stuff like var++ (what's that??)
>
> Postfix.
And the hodge-podge of prefix, infix and postfix that is your average
Algol-like language syntax (whether C, Pascal, Ada or what have you),
is often called misfix (from mixfix) by those who loathe it and the
complexities (for both compilers and users) that this causes... ;)
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
From: Larry Elmore
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <88sti8$ale$1@news.campuscwix.net>
"Pierre R. Mai" <····@acm.org> wrote in message
···················@orion.dent.isdn.cs.tu-berlin.de...
>
> And the hodge-podge of prefix, infix and postfix that is your average
> Algol-like language syntax (whether C, Pascal, Ada or what have you),
> is often called misfix (from mixfix) by those who loathe it and the
> complexities (for both compilers and users) that this causes... ;)
Just out of curiosity, what would you call J or APL's notation?
Larry
From: Marco Antoniotti
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <lw66vh6242.fsf@parades.rm.cnr.it>
"Larry Elmore" <········@montana.campuscw.net> writes:
>
> Just out of curiosity, what would you call J or APL's notation?
Ideograms. The Asian (and ancient Egyptian) crowd may appreciate it
better than people from the Eternal City. :)
Cheers
--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - 06 68 10 03 17, fax. +39 - 06 68 80 79 26
http://www.parades.rm.cnr.it/~marcoxa
> "David J. Cooper" wrote:
>> [...] really strange
>> stuff like var++ (what's that??)
Robert Monfera <·······@fisec.com> wrote
> Postfix.
shut your pie-hole if you don't know what you are talking about.
Kent Pitman wrote about this about 2 years ago, and from the message he
either implied or stated that he's been tired of clarifying this issue.
Search dejanews.com.
(Kent Pitman is one of the founders and leaders in the Lisp world.
http://world.std.com/~pitman/
)
I'll clarify your brain with my own version.
The common name for the lisp way is Fully Parenthesized Prefix Notation.
This syntax is the most straightforward to represent a tree, but it's not
the only choice. For example, one could have Fully Parenthesized Postfix
Notation by simply moving the semantics of the first element to the last.
You write
(arg1 arg2 ... f) instead of the usual (f arg1 arg2 ...)
Likewise, you can essentially move f anywhere and still make sense. In
Mathematica, they put the f in front of the paren, and use square brackets
instead. e.g. f[a, b, c], Sin[3], Map[f, list] ...etc.
The f in front of the paren makes better conventional sense until f is
itself a compound expression, in which case we'll see things like
f[a,b][c, g[3,h]] etc. It's worse when there are arbitrary nestings of heads.
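The point that prefix and postfix are the same tree with the operator in a different slot can be made executable. A toy sketch in Python (nested lists stand in for the tree; the two-operator table is mine):

```python
import math

OPS = {"+": lambda *args: sum(args),
       "*": lambda *args: math.prod(args)}

def eval_prefix(form):
    """Fully parenthesized prefix: (f arg1 arg2 ...), operator first."""
    if not isinstance(form, list):
        return form
    f, *args = form
    return OPS[f](*(eval_prefix(a) for a in args))

def eval_postfix(form):
    """Same tree with the operator moved to the end: (arg1 arg2 ... f)."""
    if not isinstance(form, list):
        return form
    *args, f = form
    return OPS[f](*(eval_postfix(a) for a in args))

# (+ 1 (* 2 3)) and its postfix mirror (1 (2 3 *) +) denote the same tree:
print(eval_prefix(["+", 1, ["*", 2, 3]]))   # 7
print(eval_postfix([1, [2, 3, "*"], "+"]))  # 7
```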
A _prefix notation_ in Mathematica is represented as f@arg. Essentially, a
prefix notation limits you to one argument. More examples:
f@a@b@c
is equivalent to
f[a[b[c]]] or, in lispy terms, (f (a (b c))).
A postfix notation is similar. In Mathematica it's, e.g. c//b//a//f. One can
say, for example
List[1,2,3]//Sin
which is syntactically equivalent to
Sin[List[1,2,3]] or Sin@List[1,2,3]
which is semantically equivalent to
Map[Sin, List[1,2,3]]
in Mathematica. For infix notation, one puts symbols between arguments. In
Mathematica, the canonical form for infix notation is by sandwiching tilde
around the function name. e.g.
Join[List[1,2],List[3,4]]
can be written as
List[1,2] ~Join~ List[3,4].
In general, when we say C is an infix-notation language, we don't mean it's
strictly infix; the label is one-size-fits-all for convenience.
Things like i++ or ++i are more or less arbitrary syntax sugar (that is,
ad hoc syntax variations without any comprehensive design or theory base).
In Mathematica for example, there is quite a lot of syntax sugar besides the
above-mentioned regular ones. For instance, Plus[a,b,c] can be written in
the following ways:
(a+b)+c
a+b+c
(a+b)~Plus~c
Plus@(3+4)
Plus@Plus[3,4]
Plus[3,4]//Plus
The gist is that certain functions such as Plus are assigned a special
symbol '+' with a particular syntax form to emulate convention. One can also
do i++, ++i, or i+=1, for instance. Another example: Times[a,b] can be
written as a*b or just a b.
In Haskell, which I'm starting to learn, there are similar regular
constructs relating functions and infix operators. e.g.
3 + 4
is equivalent to
(+) 3 4
(parenthesizing an infix operator yields its prefix-function form, and
conversely, backticks make an ordinary function infix, e.g. div 6 2 can be
written 6 `div` 2). Anyone who knows Haskell better, please extend.
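Python, also discussed in this thread, makes the analogous operator-to-function move through its standard operator module: every infix operator has a named prefix-function form. A sketch:

```python
import operator
from functools import reduce

# infix + and its prefix-function form agree:
assert 3 + 4 == operator.add(3, 4)

# the function form is what you pass around, e.g. to fold a list:
print(reduce(operator.add, [1, 2, 3, 4]))  # 10
```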
As a side note, the Perl mongers are proud of their slogan There's More Than
One Way To Do It and their gazillion ad hoc syntax sugars, but are unaware
that in functional languages (such as Mathematica) there are consistent and
generalized constructs that can generate far more syntax variations than
ad hoc Perl, both in theory AND in practice (and in Lisps, macros do the
same). And, more importantly, they clamor about Perl's "expressiveness" more
or less at the useless syntax level, but don't realize that semantic
expressiveness is what's really important. I don't know about Lisp really,
but I know in Mathematica, and I've read in Haskell, that there are
constructs whose power and concepts are beyond what imperative programmers
can apprehend.
Now back to syntax... anyone care to add?
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
Some errors in my previous post.
I wrote
> (a+b)+c
> a+b+c
> (a+b)~Plus~c
> Plus@(3+4)
> Plus@Plus[3,4]
> Plus[3,4]//Plus
should've been
(a+b)+c
a+b+c
(a+b)~Plus~c
Plus@(a+b+c)
Plus@Plus[a,b,c]
Plus[a,b,c]//Plus
I wrote:
> I don't know about Lisp
> really, but I know in Mathematica and I've read in Haskell that there are
> constructs whose power and concepts are beyond what imperative programmers
> can apprehend.
but I do know that the same thing can be said for Lisp. I meant to say that
I wasn't qualified to expound on Lisp.
i wrote:
> The gist being that certain functions such as Plus are assigned a special
> symbol '+' with particular syntax form to emulate convention. One can also
> do i++, ++i, i+=1 for instances. Another example: Times[a,b] can be written
> as a*b or just a b.
Btw, the canonical forms (or FullForm, as it is called in Mathematica) for
those i++ constructs are:
Increment[i] for i++
AddTo[i,1] for i+=1
PreIncrement[i] for ++i
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: Marco Antoniotti
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <lwwvnx4mza.fsf@parades.rm.cnr.it>
Xah <···@xahlee.org> writes:
Mathematica is (at least it was) a glorified Lisp interpreter, which,
BTW, got a lot of things wrong when it came to defining its "language"
(e.g. variable scoping).
Cheers
--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - 06 68 10 03 17, fax. +39 - 06 68 80 79 26
http://www.parades.rm.cnr.it/~marcoxa
Marco Antoniotti <·······@parades.rm.cnr.it> wrote
> Mathematica is (at least it was) a glorified Lisp interpreter, which,
> BTW, got a lot of things wrong when it came to defining its "language"
> (e.g. variable scoping).
your crime here is bloat of conceit. A major symptom among learned men.
Perhaps you need to make an appointment with doctor Naggum too. If you call
Mathematica in whatever stage a glorified Lisp interpreter, then you might
as well call any functional language that.
I don't like to advocate commercial software, but Mathematica today is
many more things than a language or computer algebra system.
> BTW, got a lot of things wrong when it came to defining its "language"
> (e.g. variable scoping).
yeah? I'd be interested to hear your opinion in detail. If i'm not mistaken,
i think comp.lang.lisp people would also be interested in hearing occasional
informative discussions of lisp descendants.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: Marco Antoniotti
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <lwr9e54hon.fsf@parades.rm.cnr.it>
Xah <···@xahlee.org> writes:
> Marco Antoniotti <·······@parades.rm.cnr.it> wrote
> > Mathematica is (at least it was) a glorified Lisp interpreter, which,
> > BTW, got a lot of things wrong when it came to defining its "language"
> > (e.g. variable scoping).
>
> your crime here is bloat of conceit. A major symptom among learned men.
> Perhaps you need to make an appointment with doctor Naggum too. If you call
> Mathematica in whatever stage a glorified Lisp interpreter, then you might
> as well call any functional language that.
Why not? Isn't lambda calculus at the core of functional programming anyway?
Don't you remember that one of the very first implementations of
Haskell was actually a CMUCL image dump? :)
> I don't like to advocate commercial software, but Mathematica today is
> many more things than a language or computer algebra system.
>
> > BTW, got a lot of things wrong when it came to defining its "language"
> > (e.g. variable scoping).
>
> yeah? I'd be interested to hear your opinion in detail. If i'm not mistaken,
> i think comp.lang.lisp people would also be interested in hearing occasional
> informative discussions of lisp descendants.
You should have checked my use of tenses. Anyway, while playing
(indeed many years ago) with the remote evaluation facility of
Mathematica you had to eventually hack up a cons-like data structure
to get it evaluated. Hence the re-affirmation of the principle that
every sufficiently complicated "system" contains a Lisp interpreter :)
As per the comment about language design choices, I'll have to dig out
old notes about "local" variable handling and so forth.
I know very well that Mathematica is a very useful and very powerful
system (so are Maple, and Macsyma). Yet, what I said is pretty
much the way I felt when working with it. If things have changed
recently, then it's for the best.
Cheers
--
Marco Antoniotti ===========================================
PARADES, Via San Pantaleo 66, I-00186 Rome, ITALY
tel. +39 - 06 68 10 03 17, fax. +39 - 06 68 80 79 26
http://www.parades.rm.cnr.it/~marcoxa
Marco Antoniotti <·······@parades.rm.cnr.it> wrote
> As per the comment about language design choices, I'll have to dig out
> old notes about "local" variable handling and so forth.
I surmise you are talking about the fact that Mathematica does not have a
lexically scoped local variable, i.e. its Block[] construct is dynamically
scoped. Mathematica has had the Module[] and With[] constructs since at least
v.2, back before 1993. Module is like Block except it's lexically scoped.
With[] is for local constants. As i recall, together they are like Scheme's
several versions of 'let' (except Scheme does not have dynamically scoped
local variables by design, as everyone here knows.) For those curious, here's
an example of syntax:
Module[
List[ var1, var2, ..., Set[var9, val1], ...],
CompoundExpression[ f[...], ...]
]
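For comparison, purely lexical scoping -- the discipline Module[] gives you -- is the only kind Haskell offers; there are no dynamically scoped variables at all. A minimal sketch (the names scopes, f, and inner are invented for this illustration):

```haskell
-- x is lexically scoped: each binding is visible only in its own
-- textual body, and an inner binding shadows the outer one.
scopes :: (Integer, Integer)
scopes =
  let x = 1
      f y = x + y                  -- f closes over the outer x = 1
      inner = let x = 10 in x + 1  -- shadows x locally; yields 11
  in (f 5, inner)

main :: IO ()
main = print scopes  -- prints (6,11)
```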
You might be talking about v.1, which was released in the late 80s. As far as i know,
Richard J. Fateman (http://http.cs.berkeley.edu/~fateman/) was very critical
of Mathematica and especially of this. He happened to have told me that
Wolfram people added Module because of his criticisms.
Btw, this site might be of interest to people here
http://members.aol.com/jeff570/mathsym.html
(earliest use of math symbols)
The design of mathematical notation and its meanings is a fascinating and an
unsolved subject in logician and philosopher's homes. (hand waving
explanation goes here)
I like Haskell foremost because it has two properties: no side effects
(purely functional), and lazy evaluation (non-strict).
Purely functional languages essentially intend to become a piece of live
mathematics. Their syntax/semantics are often described by denotational
semantics. This is so because in such functional purity the code really looks
like or actually represents mathematical notation and equations, but with the
property of being alive.
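Both properties show up in even a tiny Haskell sketch (the names naturals and squares are invented for this illustration):

```haskell
-- Lazy evaluation: an infinite list is a legal value, because
-- only the demanded prefix is ever computed.
naturals :: [Integer]
naturals = [0 ..]

-- Purity: this definition reads like the equation it denotes;
-- there is no hidden state and no evaluation order to worry about.
squares :: [Integer]
squares = map (^ 2) naturals

main :: IO ()
main = print (take 5 squares)  -- prints [0,1,4,9,16]
```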
Pure perfection makes me giddy with elation. I envision a utopia of Borg
technology, where all software has no bugs, maintains itself, improves
itself, and is in every other way just perfect. This pipe dream will begin
to happen only when languages _like_ Haskell take the reins of the world.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: Tim Bradshaw
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <ey3vh3hbizw.fsf@cley.com>
* Xah wrote:
>> "David J. Cooper" wrote:
>>> [...] really strange
>>> stuff like var++ (what's that??)
> Robert Monfera <·······@fisec.com> wrote
>> Postfix.
> shut your pie-hole if you don't know what you are talking about.
Do you claim that I++ is not postfix? A remarkable claim, I think.
--tim
Tim Bradshaw <···@cley.com> wrote:
> Do you claim that I++ is not postfix? A remarkable claim, I think.
my claims are often remarkable, especially in the right context.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: Harley Davis
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <38b3399f$0$236@newsreader.alink.net>
Xah <···@xahlee.org> wrote in message ······················@xahlee.org...
> Tim Bradshaw <···@cley.com> wrote:
> > Do you claim that I++ is not postfix? A remarkable claim, I think.
>
> my claims are often remarkable, especially in the right context.
Remarkably ignorant, in this case.
-- Harley
From: Tim Bradshaw
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <ey3r9e5ba0h.fsf@cley.com>
* Xah wrote:
> Tim Bradshaw <···@cley.com> wrote:
>> Do you claim that I++ is not postfix? A remarkable claim, I think.
> my claims are often remarkable, especially in the right context.
Not to mention wrong, of course.
From: Samuel A. Falvo II
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <slrn8brqjh.jec.kc5tja@garnet.armored.net>
In article <·················@xahlee.org>, Xah wrote:
>my claims are often remarkable, especially in the right context.
You don't happen to be Mike Restivo by any chance, are you?
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
From: William Deakin
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <38B292C0.192B292F@pindar.com>
Having tried to follow this thread for a while now, I have to ask: What are you
on about? Is English your first language? Given an average level of typos it is
usually possible to work out what is being said. But your level of grammatical
and structural errors fundamentally obscures the content of your postings.
Cheers
:) will
From: Dorai Sitaram
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <88u9rk$rbj$1@news.gte.com>
In article <·················@pindar.com>,
William Deakin <········@pindar.com> wrote:
>Having tried to follow this thread for a while now, I have to ask: What are you
>on about? Is English your first language? Given an average level of typos it is
>usually possible to work out what is being said. But your level of grammatical
>and structural errors fundamentally obscures the content of your postings.
Xah seems to break rules (consciously or not)
the way an artist breaks rules. The result is very
cheerful and liberating, and often quite beautiful.
--d
From: William Deakin
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <38B2B7B7.80BC9E36@pindar.com>
Dorai wrote:
> Xah seems to break rules (consciously or not) the way an artist breaks rules. The
> result is very cheerful and liberating, and often quite beautiful.
Ah, so Xah is posting *art* as technical discussion! It all becomes clear why it
makes no sense.
Thanks for clearing that up,
:) will
From: Espen Vestre
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <w6zoss8a6o.fsf@wallace.nextel.no>
William Deakin <·····@pindar.com> writes:
> It all becomes clear why it makes no sense.
well, I'm still convinced that we witness an interesting experiment
in Computational Linguistics ;-)
--
(espen)
From: William Deakin
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <38B3C03C.7D1FCD17@pindar.com>
Espen wrote:
> ...I'm still convinced that we witness an interesting experiment in
> Computational Linguistics ;-)
If so, IMHO whoever is behind this has passed the Turing test. I admit
that I am unable to distinguish between the computer and human generated
*art* [1]
Best Regards,
:) will
[1] or is that gibberish? ;)
From: Xah
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <B4D92839.61CD%xah@xahlee.org>
William Deakin <·····@pindar.com> wrote
>Is english your first language?
My first language is body language. Have you not learned it? No wonder you
don't communicate well.
Cheerfully,
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: William Deakin
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <38B3F09C.1A702E35@pindar.com>
Xah wrote:
> My first language is body language. Have you not learned it? No wonder you
> don't communicate well.
Excellent, I think that makes half-a-dozen sentences I have
understood! Anyway, since you understand me, but *I* was the one experiencing
problems understanding *you*, surely it is *you* that doesn't communicate
well. That is, unless we are operating the standard usenet `you don't
understand what I say, so you are at fault' protocol. Just let me know.
`Soonest done, soonest mended.'
Anyway, keep up the good work,
:) will
ps: It may help clear up our little difficulties if you could send
instructions for encoding body language for newsgroup posting.
From: Xah
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <B4D937A5.61DE%xah@xahlee.org>
Dear will & readers,
William Deakin <·····@pindar.com> wrote:
> ps: It may help clear up our little difficulties if you could send
> instructions for encoding body language for newsgroup posting.
It's all in standard collegiate dictionaries & writing guides.
For those precocious, more detail in intro to linguistics. For the advanced:
Introduction to Mathematical Logic. For the serious geniuses such as members
of the Naggum club: read Xah's opuses.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
From: William Deakin
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <38B3FD65.7F467F00@pindar.com>
Xah wrote:
> Will wrote:
> > ps: It may help clear up our little difficulties if you could send
> > instructions for encoding body language for newsgroup posting.
>
> It's all in standard collegiate dictionaries & writing guides.
Obviously my mistake. However, not having the benefit of your clearly superior
education, I am at a loss. This also must be a geographical issue in-as-much-as
I am unaware of any college in the UK (or Europe for that matter) that supplies
these guides. Can you name a title or author so that I could buy such a guide
from Amazon?
Anyway, my hat goes off to the programmer who wrote the AI-that-is-Xah. Any
program that can interpret more than just text and analyse speech and body
language too is clearly uber rather than unter.
This is getting more William Gibson every moment...
Happy parsing,
:) will
From: Samuel A. Falvo II
Subject: Re: the so-called "English" languages (was: Re: why Haskell hasn'treplaced CL yet?)
Date:
Message-ID: <slrn8brqru.jec.kc5tja@garnet.armored.net>
In article <·················@xahlee.org>, Xah wrote:
>For those precocious, more detail in intro to linguistics. For the advanced:
Precocious what? Whoa...there are two completely independent sentence
fragments there, joined only by a solitary comma. Poor comma -- it must be
so lonely...
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
Voice in the desert: Quiet, isn't it, Xah?
> anyone know Haskell better please extend.
I recommend comp.lang.functional, where they have extensive knowledge
of Haskell, Haskell compilers, and Haskell applications. They're also
very realistic about the current state of the language. Not only will
all your questions be answered there (if they can't answer, who can?),
your reception _may_ be a little friendlier.
After all, it's not polite to compare a language with 40 years of
maturity with a language that has only 10 years. Imagine if you
compared women like that? Hmm, human languages? "English is spoken by
more people than French, so why hasn't English replaced French?" Ask a
few French nationalists. I doubt their replies will be at all polite!
I gave up on the prefix/infix/postfix debate a few decades ago.
Followup adjusted.
--
Email address intentionally munged | You can never browse enough
"Ahh, well. Back to reality." -- Mark Radcliffe
Martin Rodgers wrote:
> Voice in the desert: Quiet, isn't it, Xah?
Don't speak for yourself, Rodgers.
>> anyone know Haskell better please extend.
>
> I recommend comp.lang.functional,
and i recommend the Haskell _mailing list_ found in the Haskell language
site.
if we have experts on Haskell syntax here, i'm interested in a live account.
I'm quite capable of reading the Haskell language spec otherwise.
> where they have extensive knowledge
> of Haskell, Haskell compilers, and Haskell applications. They're also
> very realistic about the current state of the language. Not only will
> all your questions be answered there (if they can't answer, who can?),
> your reception _may_ be a little friendlier.
>
> After all, it's not polite to compare a language with 40 years of
> maturity with a language that has only 10 years. Imagine if you
> compared women like that? Hmm, human languages? "English is spoken by
> more people than French, so why hasn't English replaced French?" Ask a
> few French nationalists. I doubt their replies will be at all polite!
I was misunderstood by you. I did not come here to do such comparison and i
did not seek petting replies, but informative or qualitative ones in one way
or another. Every geek in fact loves a good flame war that's devoid of
vacuous babble. The worst and most common type is the "you're now in my
killfile" drivel. The common people should start to think about how to rate
the value of a message. Is it how much politeness it imparts? Is it how many
words it contains? Is it purely its relevancy? (and how do you judge
relevancy?) or is it the rarity of technical info or quality of an opinion
expressed? In face to face situations, politeness prevails for many reasons.
In writings, quality of content prevails. I do shamelessly pretend to be
giving a lesson here to the general public, in the absence of doctor Naggum.
> I gave up on the prefix/infix/postfix debate a few decades ago.
It is good to know, truly. As you intend it, many of us are not qualified to
say we were born a few decades ago. Age & experience is not the criterion but
is not to be ignored.
> Followup adjusted.
Bad behavior. I sent an inappropriate reply to that group by mistake as a
result of this. You are not the director of this group, eh? I can post in
comp.lang.functional if i wanted to.
with all due respects, (^_~)
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
Voice in the desert: Quiet, isn't it, Xah?
> > I recommend comp.lang.functional,
>
> and i recommend the Haskell _mailing list_ found in the Haskell language
> site.
Indeed, you'll find many people who are busy working on Haskell
standards and compilers.
> if we have experts on Haskell syntax here, i'm interested in a live account.
> I'm quite capable of reading the Haskell language spec otherwise.
It's rather pointless discussing the syntax of non-Lisp languages in a
Lisp ng. We're here because we prefer the _Lisp_ syntax. deja.com will
provide many excellent posts on the subject which will explain why.
> I was misunderstood by you. I did not come here to do such comparison and i
> did not seek petting replies, but informative or qualitative ones in one way
> or another.
"Petting"? What does that mean?
The earlier thread subject was "why Haskell hasn't replaced CL yet?",
which is a very silly question, but it still deserves an answer. My
answer (I don't claim it to be unique or original) is that you can
find the answers in comp.lang.functional, where the use of compilers
for the language has been discussed already. ISTR that the consensus
is that after 10 years, Haskell is still an area of research and will
continue to be for some time yet. However, you should check this for
yourself, using the deja.com search engine.
I like Haskell. I like Lisp. I like a lot of things, but I don't see
any of them as mutually exclusive.
> Every geek in fact loves a good flame war that's devoid of
> vacuous babble.
One geek's cogent argument is another geek's vacuous babble. I'm
offering little more than the views of the people closest to Haskell
itself. See my comments about deja.com above.
> > I gave up on the prefix/infix/postfix debate a few decades ago.
>
> It is good to know, truly. As you intend it, many of us are not qualified to
> say we were born a few decades ago. Age & experience is not the criterion but
> is not to be ignored.
It has nothing to do with age; I was a teenager at the time. While
others were arguing over the semicolon issue, I was discovering prefix
and postfix notations, and writing code that writes code. But then, I
have an unusually high aptitude for abstractions.
> > Followup adjusted.
An old UseNet convention.
> Bad behavior. I sent an inappropriate reply to that group by mistake as a
> result of this. You are not the director of this group, eh? I can post in
> comp.lang.functional if i wanted to.
I suggest that you do so, for the reasons given above. It'll save you
a lot of wasted effort. Investigation will reveal a false assumption
that the users of one language give a damn about the opinions of the
users of another language. E.g. Kent Pitman's views on Scheme and his
wise refusal to debate them. You may also observe a tendency for
language wars to end with a reference to Cobol. At the very least
you'll see how effectively Lisp users respond to attacks from users of
other languages. Experience helps considerably. So does being right.
So, I predict that you'll accomplish nothing positive with this
thread. You've already created a negative impression. I remember you
from a few years ago, and the impression then was much more positive.
BTW, I think the word you're looking for is 'moderator', and it
doesn't apply here. I can set the followups in my posts as I see fit.
If you fail to honour them, that's entirely your choice.
--
Email address intentionally munged | You can never browse enough
"Ahh, well. Back to reality." -- Mark Radcliffe
Xah wrote:
>> I was misunderstood by you. I did not come here to do such comparison and i
>> did not seek petting replies, but informative or qualitative ones in one way
>> or another.
Martin Rodgers wrote:
> "Petting"? What does that mean?
Allow me to break it to you gently. Bring out your Oxford English Dict. Turn
to the right page and peruse the fine print. Now read my message again. If
you detect a collusion of logicality, then throw logic into the trashcan and
grok. Shakespeare grokked human Sense and Sensibility, that's why he is
appreciated by those sensible. If you can grok me, then you'll be equally
appreciated by me too.
When i open a book by Emily Dickinson, it is all greeks to me. Only when i
toil with persistence that i possibly get a glimpse of a rainbow. Pretend
i'm Emily.
(In general, it is hard to be appreciated until one is dead.)
Martin:
> It has nothing to do with age; I was a teenager at the time. While
> others were arguing over the semicolon issue, I was discovering prefix
> and postfix notations, and writing code that writes code. But then, I
> have an unusually high aptitude for abstractions.
Translation: "Actually i am quite a handsome _young_ fellow; When i was a
teen, everyone else is really stupid. But then, i'm a genius after all."
Martin:
> So, I predict that you'll accomplish nothing positive with this
> thread.
On the contrary, i've accomplished quite a lot _already_. I have gotten
exactly the opinions i sought from a few CL experts, albeit i have yet to
hear from someone who really has _mastery_ of both CL and one of
Dylan/Haskell (Scott E. Fahlman being a possible exception. Such people are
very rare, after all. Lucky me if they read newsgroups at all.).
As a side effect, i've also pushed my respectfulness (literally or
satirically) into two diametric extremes. If you are a priest, you cannot be
a gigolo; if you are a gigolo, you cannot be a priest. Albeit there are a
lot of people who do church by day and brothel by night. (please don't get all
offended. It's just a meaningless figure of speech.)
Martin:
>>> Followup adjusted.
Xah
>> ... You are not the director of this group, eh? I can post in
>> comp.lang.functional if i wanted to.
Martin:
> ... BTW, I think the word you're looking for is 'moderator', and it
> doesn't apply here. I can set the followups in my posts as I see fit.
> If you fail to honour them, that's entirely your choice.
'moderator' vs. 'director'? What is their difference? Can you please explain
to me?
Frankly Rodgers, your crime is the heedlessness of others' intelligence. You
need to pay attention to detail and get a drift. But if you are playing with
me, then you need to wipe your deadpan face and give us a good laugh. Like i
said to Rainer, don't ruin a fine troll with banal noise. I know my
information resources and FAQs well. Don't chant in the ways of "the right
tool for the right job" or "Haskell is a 10-year-old research language". If
you don't have information other than for clueless newbies, you can shut up
your beer-hole too. Don't pretend to be tall in a foreign field.
Martin:
> I remember you
> from a few years ago, and the impression then was much more positive.
For old time's sake, are we still friends?
PS
while i was doing research in answering another poster's reply, i found this
trickle of relevance:
http://www.deja.com/article/499362270
think of it as a gift from me.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
Voice in the desert: Quiet, isn't it, Xah?
> Translation: "Actually i am quite a handsome _young_ fellow; When i was a
> teen, everyone else is really stupid. But then, i'm a genius after all."
Completely wrong. You're using an outdated model (i.e. linear) of
intelligence. This is why experts on measuring I.Q. refuse to give
their own I.Q. - they don't consider it meaningful.
I said that I have an unusually high aptitude for abstractions. I
expect that this is true of Lisp programmers in general. I'd say the
same for users of FP and LP languages, but for slightly different
reasons.
My point was that age has little to do with it.
> On the contrary, i've accomplished quite a lot _already_.
That's debatable.
> (please don't get all offended. It's just a meaningless figure of speech.)
I hope you'll accept my scepticism in the same spirit.
--
Email address intentionally munged | You can never browse enough
will write code that writes code that writes code for food
Martin Rodgers <···@thiswildcardaddressintentiallyleftmunged.demon.co.uk>
wrote
> Voice in the desert: Quiet, isn't it, Xah?
Except the echoes, Martin.
Xah:
>> (please don't get all offended. It's just a meaningless figure of speech.)
Martin:
> I hope you'll accept my scepticism in the same spirit.
There's nothing quite like finding a companion of mind.
According to that alt.troll article, he calls it thoughtstream. It's been
ecstatic reeling in it.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
Voice in the desert: Quiet, isn't it, Xah?
> It's been ecstatic reeling in it.
Remember my advice about comp.lang.functional.
--
Email address intentionally munged | You can never browse enough
"Ahh, well. Back to reality." -- Mark Radcliffe
From: Samuel A. Falvo II
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <slrn8brqgf.jec.kc5tja@garnet.armored.net>
In article <·················@xahlee.org>, Xah wrote:
>Allow me to break it to you gently. Bring out your Oxford English Dict. Turn
What if we don't have an Oxford English Dictionary?
>grok. Shakespeare grokked human Sense and Sensibility, that's why he is
>appreciated by those sensible. If you can grok me, then you'll be equally
I'm perfectly sensible. I hate Shakespeare. In fact, I detest his works. I
fail to see how this relates to the conversation at hand.
>When i open a book by Emily Dickinson, it is all greeks to me. Only when i
>toil with persistence that i possibly get a glimpse of a rainbow. Pretend
>i'm Emily.
I'm afraid, my lady, that you have a run in your pantyhose -- you might want
to buy a new pair.
>(In general, it is hard to be appreciated until one is dead.)
And you're trying to coerce people to appreciate you in this thread. I
recommend you watch what you ask for...
>Frankly Rodgers, your crime is the heedlessness of others' intelligence. You
And your crime is the flagrant boasting of your own self-inflated ego.
>need to pay attention to detail and get a drift. But if you are playing with
I fear that you, too, need to follow this advice. Numerous people on this
list have taken great offense to your attitude here, and have vocalized
their feelings towards you. Yet you continue to inflict your torture on
these people.
I could put it bluntly, and say, "Buzz Off." But I fear that you may not be
familiar with such a high-level form of communications. So I'll translate
it to a form more amenable to your oratory and literary skills, "Please be
so kind as to conduct yourself in a manner congruent with the established
culture on this mailing list; should this not be possible, please be so kind
as to depart this place with dignity and honor, and never conduct your
business here until you have learned to do so."
Note that I'm not telling you to go away permanently -- I'm telling you that
you stick out like a sore thumb, and that you need to tone your written
voice down significantly. Treat people with respect here, and we'll treat
you with respect.
>me, then you need to wipe your deadpan face and give us a good laugh. Like i
Personal attack... Bad form...
>you don't have information other than for clueless newbies, you can shut up
>your beer-hole too. Don't pretend to be tall in a foreign field.
OOOooo!! Even MORE personal attacks. If I can get you really pissed off,
maybe I can control you like the puppet you are.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
From: Samuel A. Falvo II
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <slrn8brp1j.jec.kc5tja@garnet.armored.net>
In article <·················@xahlee.org>, Xah wrote:
>> Postfix.
>
>shut your pie-hole if you don't know what you are talking about.
The example given was X++, which is postfix. Deal with it.
>the only choice. For example, one could have Fully Parenthesized Postfix
>Notation by simply moving the semantics of the first element to the last.
>You write
>(arg1 arg2 ... f) instead of the usual (f arg1 arg2)
And if you remove the parentheses, you end up with Forth -- a fully postfix
programming language which, while not really Lisp, shares a great deal of
commonality with Lisp. And unlike programmers of other programming
languages, the programmers of Forth and Lisp are not hostile to each other
-- in fact, both language communities are usually quite open to the other.
>A _prefix notation_ in Mathematica is represented as f@arg. Essentially, a
>prefix notation limits you to one argument. More example:
> f@a@b@c
>is equivalent to
> f[a[b[c]]] or in lispy (f (a (b c))).
This is nested function application -- composition, essentially -- rather
than currying. Currying is the related idea that any function f(a,b,c) -> z
can be rewritten as a chain F(a) -> G(b) -> H(c) -> z, one argument at a time.
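Currying of the F(a) -> G(b) -> H(c) -> z kind is easiest to watch in Haskell, where every function is curried by default. A brief sketch (the names addAll and plus3 are invented for this illustration):

```haskell
-- addAll's type is really Integer -> (Integer -> (Integer -> Integer)):
-- it consumes its three arguments one at a time.
addAll :: Integer -> Integer -> Integer -> Integer
addAll a b c = a + b + c

-- Partial application returns a new function at each step -- the
-- F(a) -> G(b) -> H(c) -> z chain, made concrete.
plus3 :: Integer -> Integer -> Integer
plus3 = addAll 3

main :: IO ()
main = print (plus3 4 5)  -- prints 12
```

Note the contrast with the nested form f[a[b[c]]], where a and b are themselves functions being applied, not arguments handed one at a time to f.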
>Things like i++ or ++i is more or less an arbitrary sugar syntax. (that is,
>ad hoc syntax variation without any comprehensive design or theory base.)
i++ and ++i are two VERY different operations.
If i is an l-value, ++i will increment i and return the new value, i+1. i++
will also increment i, but returns the OLD value of i (NOT i+1). So it's not
"just" syntactic sugar.
If i is an r-value, neither operation will succeed, and a compile-time error
is generated.
>really, but I know in Mathematica and i've read in Haskell that there are
>constructs whose power and concept is beyond imperative programers can
>apprehend.
^^^^^^^^^
comprehend.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
Dear Samuel A. Falvo II,
Your attacks on me appear to be "me too" at best. For a moment i thought i
had another companion of mind.
I do not deign to converse with those who are not Shakespeare and do not
have an Oxford English Dictionary.
The rabble judge others by the noises people make; the refined are
impervious to noises, but react on actions and meanings. If you look through
the thick mist of arrogant writings of certain individuals, you will see the
true value of a pungent garlic.
Your assignment today is to write a 500 word essay on arrogance. If you
don't get an A, i'll cut one of your fingers off. After that, you should have
a deep understanding of the word "Arrogance" and can react better to people
that seem arrogant.
PS Please do not "sorry, i can't resist". You'll be able to resist quite a
lot of things if i shove a bomb in your ass. Members of the Naggum club
understand me. I thank Erik Naggum for that.
> Samuel A. Falvo II
> Oceanside, CA
Oceanside? I'd dearly love to be invited to your beach house.
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
> From: ······@garnet.armored.net (Samuel A. Falvo II)
> Newsgroups: comp.lang.lisp
> Date: 2 Mar 2000 03:43:31 GMT
> Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced
> CL yet?)
>
> In article <·················@xahlee.org>, Xah wrote:
>>> Postfix.
>>
>> shut your pie-hole if you don't know what you are talking about.
>
> The example given was X++, which is postfix. Deal with it.
>
>> the only choice. For example, one could have Fully Parenthesized Postfix
>> Notation by simply moving the semantics of the first element to the last.
>> You write
>> (arg1 arg2 ... f) instead of the usual (f arg1 arg2)
>
> And if you remove the parentheses, you end up with Forth -- a fully postfix
> programming language which, while not really Lisp, shares a great deal of
> commonality with Lisp. And unlike programmers of other programming
> languages, the programmers of Forth and Lisp are not hostile to each other
> -- in fact, both language communities are usually quite open to the other.
>
>> A _prefix notation_ in Mathematica is represented as ·@arg. Essentially, a
>> prefix notation limits you to one argument. More example:
>> ·@·@·@c
>> is equivalent to
>> f[a[b[c]]] or in lispy (f (a (b c))).
>
> This is known as currying, and is semantically equivalent to f(a,b,c),
> because any function f(a,b,c) -> z can be rewritten as F(a) -> G(b) -> H(c)
> -> z.
>
>> Things like i++ or ++i is more or less an arbitrary sugar syntax. (that is,
>> ad hoc syntax variation without any comprehensive design or theory base.)
>
> i++ and ++i are two VERY different operations.
>
> If i is an l-value, ++i will return the value of i+1. i++ will return
> the old value of i (NOT i+1), yet will still increment i. So it's not "just"
> syntactic sugar.
>
> If i is an r-value, neither operation will succeed, and a compile-time error
> is generated.
>
>> really, but I know in Mathematica and i've read in Haskell that there are
>> constructs whose power and concept is beyond imperative programers can
>> apprehend.
> ^^^^^^^^^
> comprehend.
>
> --
> KC5TJA/6, DM13, QRP-L #1447
> Samuel A. Falvo II
> Oceanside, CA
From: Samuel A. Falvo II
Subject: Re: So-called "infix" languages (was: Re: why Haskell hasn't replaced CL yet?)
Date:
Message-ID: <slrn8btkak.vcb.kc5tja@garnet.armored.net>
In article <·················@xahlee.org>, Xah wrote:
>Dear Samuel A. Falvo II,
>
>Your attacks on me appear to be "me too" at best. For a moment i thought i
>had another companion of mind.
Taken to private e-mail, where it belongs.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
···@cs.cmu.edu (Scott E. Fahlman) writes:
> CL and Dylan are in slightly different parts of the design space. CL
> offers great runtime flexibility, but at the cost of carrying around
> substantial parts of its program development environment at runtime --
> rather like a tortoise carrying its house around. "Delivery modes"
> that produce compact CL applications arrived late in the game and are
> only a partial solution.
That's a standard point, but I think it's somewhat misleading.
Nothing (except lack of time and other resources) stops CL
implementors from doing things in a way more like Dylan's. It would
be implementation-specific (declarations or whatever), but so are the
"delivery modes".
Now, what is the "development env" that has to be carried around?
TRACE and STEP, probably the compiler. Some include the mere ability
to redefine things as "development env". There are some efficiency
costs, chiefly an indirection on function calls to allow the functions
to be redefined; and there's the space taken up by the compiler.
TRACE and STEP are tiny. But in any case, TRACE, STEP, and the
compiler can be "autoloaded" so that they're not there until something
tries to use them. (Autoloading has been around for decades.
Nowadays shared libraries or some other technique might be used).
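The "indirection on function calls" cost is easy to picture: calls go through a name-to-function table, so a redefinition takes effect for all callers at once. A minimal Python sketch of the idea (the `defun`/`call` names are illustrative, not any real Lisp implementation's API):

```python
# Function calls go through one extra indirection (a name -> function table),
# which is what makes runtime redefinition possible.
FUNCTIONS = {}

def defun(name, fn):
    # (Re)bind a name in the global function table.
    FUNCTIONS[name] = fn

def call(name, *args):
    # The extra indirection: look the function up at call time, then invoke it.
    return FUNCTIONS[name](*args)

defun("hello", lambda: "hello, world")
print(call("hello"))          # hello, world
defun("hello", lambda: "hi")  # redefinition is visible to every caller
print(call("hello"))          # hi
```

A compiler that knows a function will never be redefined can skip the lookup and call the code directly; that is the efficiency trade being discussed.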
Of course, some implementations might have substantially more
"development env" built-in with no good way to exclude it. But that's
a problem with those implementations, not with Common Lisp.
-- jd
On 25 Feb 2000, Jeff Dalton wrote:
> ···@cs.cmu.edu (Scott E. Fahlman) writes:
>
> > CL and Dylan are in slightly different parts of the design space. CL
> > offers great runtime flexibility, but at the cost of carrying around
> > substantial parts of its program development environment at runtime --
> > rather like a tortoise carrying its house around. "Delivery modes"
> > that produce compact CL applications arrived late in the game and are
> > only a partial solution.
Regarding this point, I was asked the following question: "how small of
an executable can you get from a CL program that simply prints HELLO WORLD
at the term".
I don't know the answer, in fact I don't even know how to produce an
executable from a Lisp program, I never had any need for it.
Thanks,
Tunc
* Tunc Simsek <······@paleale.EECS.Berkeley.EDU>
| Regarding this point, I was asked the following question: "how small of
| an executable can you get from a CL program that simply prints HELLO WORLD
| at the term".
note that "how small an executable" actually means "how much do the
operating system and the executable agree on". e.g., in the case of C
programs under Unix, so much that the executable can effectively be a few
hundred bytes, as the shared libraries that are involved and the
initialization code for the program are, after all, what the entire
operating system is optimized for. this does not mean the memory
footprint of the executable when loaded into memory will be small, or
that it won't do a lot of work behind the scenes.
| I don't know the answer, in fact I don't even know how to produce an
| executable from a Lisp program, I never had any need for it.
exactly. therefore, the smallest "executable Common Lisp program" that
does the same meaningless task as the typical "hello, world" demo that
shows off how functions and interactive invocation work under Unix, is
either simply the string "hello, world" typed to a listener (and what's
the sport in that?) or (defun hello () "hello, world") which is almost as
unchallenging.
what we have to realize and counter with is that building lots of tiny
little programs in C is a very inefficient way to build an interactive
environment. think of all the programs and scripts and whatnot as small
functions that can pass values around only as textual strings in pipes at
best. each program is ephemeral and must use the disk for its variables
or state if it has any, or it must receive environment variables and
options each time and the state is maintained in the caller. each
program is run by loading and dynamically linking a *huge* amount of
stuff every time. in contrast, a Common Lisp system got all of this
interactive development environment stuff _right_, with very much simpler
and faster function invocation once you start it up, but you also have to
start up a shell to start your hello program. so why focus on the size of
the "executable"? refocus on the amount of work involved and how having
to use an executable on disk for such a trivial behavior is really not a
good thing to begin with.
#:Erik
On 26 Feb 2000 03:12:23 +0000, Erik Naggum <····@naggum.no> wrote:
> note that "how small an executable" actually means "how much do the
> operating system and the executable agree on". e.g., in the case of C
Compatibility is as important as functionality.
> exactly. therefore, the smallest "executable Common Lisp program" that
> does the same meaningless task as the typical "hello, world" demo that
It's not meaningless. The purpose is to show that you can deliver
an application. By making the application as trivial as possible, you
separate the issue of whether you can deliver anything, from the
issue of how complicated a program it might be.
When you deliver an application, you have to take into account that
the users might not have access to your Lisp environment. You also
have to take into account that they might want the application delivered
to their email inboxes, and that they might have a limit on the size of an
incoming message.
> what we have to realize and counter with is that building lots of tiny
> little programs in C is a very inefficient way to build an interactive
We have to be compatible with programs from other vendors, and such
programs tend to work together using well-known interfaces. One of the
most common of those interfaces is the "pipes and filters" interface. So
it's not feasible to just write that interface off as being too inefficient.
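A "filter" in this sense is just a program that reads stdin and writes stdout, so a shell can compose it with others via pipes. A minimal Python sketch (the upcasing is an arbitrary placeholder transformation):

```python
import io
import sys

def transform(line):
    # Placeholder transformation; a real filter would do its actual work here.
    return line.upper()

def run_filter(stdin=sys.stdin, stdout=sys.stdout):
    # Read a line, write a line: the classic Unix filter contract, which lets
    # a shell compose this program with others via pipes.
    for line in stdin:
        stdout.write(transform(line))

# Demonstration without a real pipe: feed a string buffer through the filter.
out = io.StringIO()
run_filter(stdin=io.StringIO("hello, world\n"), stdout=out)
print(out.getvalue())   # HELLO, WORLD
```

Any language can implement this contract; the argument in the thread is about the per-invocation cost of doing so, not about whether it can be done.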
As another example, suppose I'm a naive user who uses your program
from my text-editor, invoking it with a filter-region command, to capture
its output in my edit buffer. If I know someone else who has the same
program written in C++, and I've noticed that they can do the filter-region
thing in a tiny fraction of a second, but I always have to wait almost a full
second, I might start to envy them, and wish mine were written in C++
instead of Lisp.
In the real world, we have to keep the users happy. We can't expect
them to bend over backwards to be compatible with us. We have to
deliver what they want, and make them see high quality in what we
deliver. We have to understand their point of view and make the quality
high from their point of view.
Lisp has a lot of advantages and some disadvantages. To sell Lisp,
we have to show that the advantages outweigh the disadvantages. We
can't do that by evading questions about the disadvantages. We have
to instead say something like, "yes, a Lisp program does take 750 ms
to start running, but here are the ways you can mitigate that, and here
are the advantages you get for tolerating that."
If we evade the question, the users will assume the worst. They want
to be sure the program will always start running within a certain number
of seconds, and that the smallest program will be below a certain
number of megabytes. If we evade those questions, they will assume
the answers are bad news, and probably won't have time to give Lisp
any further consideration as they go down their long list of possible
development languages and environments for their next project.
* ·············@not.for.spam
| When you deliver an application, you have to take into account that the
| users might not have access to your Lisp environment.
well, this is the meaningless part. when people deliver applications,
they take for granted that you already have the relevant portions of the
environment the application needs in the shape of DLLs (or other forms of
shared libraries and resources) to make it run. if you don't, you're
expected to download it, or in the worst case, get the application with a
bunch of such libraries.
therefore, the question is: what's considered the application? the DLLs
and the whole shebang or _just_ the executable? in my view, it doesn't
make sense to separate them (neither in the case of C nor CL), but in the
minds of people who compare sizes of executables, the DLLs are somehow
irrelevant, but if they are made aware of them for some languages, like
some not-so-helpful Lisp people seem to force them into, they will also
count the runtime system. this is a very bad move. don't call attention
to these things, and they'll never notice them the exact same way they
never notice the multimegabyte DLLs they install for other packages.
| You also have to take into account that they might want the application
| delivered to their email inboxes, and that they might have a limit on the
| size of an incoming message.
sorry to say so, but this is a specious argument at best. people need to
install some form of runtime system library _once_, and can thereafter
accept any small "executable" = application. this is not a big deal.
what's necessary to ship for Common Lisp programs is usually much smaller
than you need to ship for other languages once you're past this point.
| As another example, suppose I'm a naive user who uses your program from
| my text-editor, invoking it with a filter-region command, to capture its
| output in my edit buffer. If I know someone else who has the same
| program written in C++, and I've noticed that they can do the
| filter-region thing in a tiny fraction of a second, but I always have to
| wait almost a full second, I might start to envy them, and wish mine were
| written in C++ instead of Lisp.
this would have been a useful piece of input if it were true. it isn't.
that is, it used to be true 20 years ago, and today it's stale myth.
| In the real world, we have to keep the users happy.
well, in the mythical world, the users aren't happy. in the real world,
they don't care what language is used as long as they get what they want,
and users put up with a _lot_ of compromises. speed is no longer an
issue, since the hardware new stuff is being deployed on is really fast.
(just trust me on this if you don't believe it.)
| We have to instead say something like, "yes, a Lisp program does take 750
| ms to start running, but here are the ways you can mitigate that, and
| here are the advantages you get for tolerating that."
do tell me just _why_ do we have to lie? this is so blatantly stupid I
get _sick_. on my system, the default Allegro CL starts up in about 20
ms and with one of my applications which has a lot of startup-time
computation, it takes about 35 ms on a bad day.
| If we evade the question, the users will assume the worst.
and some will think _we're_ lying when we tell them that the startup-time
of a C++ program (and certainly a Java program) is longer than that of a
full-blown Common Lisp system. do you know how we can deal with that,
considering your strong desire to perpetuate old myths?
you're welcome to the real world any time, but if you have nothing more
to contribute than trite old myths, you're part of the problem of the
mythical world Lisp _still_ has to fight, not part of any solution in the
real world.
#:Erik
On 28 Feb 2000 02:19:59 +0000, Erik Naggum <····@naggum.no> wrote:
> do tell me just _why_ do we have to lie? this is so blatantly stupid I
> get _sick_. on my system, the default Allegro CL starts up in about 20
> ms and with one of my applications which has a lot of startup-time
> computation, it takes about 35 ms on a bad day.
20 ms implies you can run it 50 times per second in a script loop.
What computer is that on? Have you actually tested it in a script
loop to verify that you can run it 50 times per second?
In the past I've seen people ask how fast Allegro executables
could start up, and the answers I saw were "less than one
second" and "you shouldn't do it that way." I've never had an
opportunity to measure it myself, and did not intend to give an
impression of having actual numbers. The number I mentioned
was part of my example of how the answer should be given. It
could have been any number and still meant the same thing.
* ·············@not.for.spam
| 20 ms implies you can run it 50 times per second in a script loop.
| What computer is that on? Have you actually tested it in a script
| loop to verify that you can run it 50 times per second?
not until now, but a typical run among many goes like this:
time (for x in 0 1 2 3 4; do for y in 0 1 2 3 4 5 6 7 8 9; do ./allegro -batch -kill & done; done; wait)
Allegro CL Enterprise Edition 5.0.1 [Linux/X86] (2/15/0 21:48)
Copyright (C) 1985-1999, Franz Inc., Berkeley, CA, USA. All Rights Reserved.
Allegro CL Enterprise Edition 5.0.1 [Linux/X86] (2/15/0 21:48)
Copyright (C) 1985-1999, Franz Inc., Berkeley, CA, USA. All Rights Reserved.
; Exiting Lisp
; Exiting Lisp
Allegro CL Enterprise Edition 5.0.1 [Linux/X86] (2/15/0 21:48)
Copyright (C) 1985-1999, Franz Inc., Berkeley, CA, USA. All Rights Reserved.
; Exiting Lisp
Allegro CL Enterprise Edition 5.0.1 [Linux/X86] (2/15/0 21:48)
Copyright (C) 1985-1999, Franz Inc., Berkeley, CA, USA. All Rights Reserved.
; Exiting Lisp
Allegro CL Enterprise Edition 5.0.1 [Linux/X86] (2/15/0 21:48)
Copyright (C) 1985-1999, Franz Inc., Berkeley, CA, USA. All Rights Reserved.
; Exiting Lisp
  [... remaining "Allegro CL" banner and "; Exiting Lisp" lines elided; the loop starts 50 copies in all ...]
real 0m0.986s
user 0m1.050s
sys 0m0.470s
the system is a dual 600MHz Pentium III with 512M RAM at 100MHz, which
runs Debian Linux 2.2 (potato) with kernel 2.2.14 SMP. no swap space,
but a bunch of really fast disks, which should be irrelevant since all of
the relevant blocks are in the disk cache, anyway.
| In the past I've seen people ask how fast Allegro executables could start
| up, and the answers I saw were "less than one second" and "you shouldn't
| do it that way." I've never had an opportunity to measure it myself, and
| did not intend to give an impression of having actual numbers. The
| number I mentioned was part of my example of how the answer should be
| given. It could have been any number and still meant the same thing.
French has an idiomatic number that means "a lot": 36, so you could argue
that startup-times are 36 ms, pardon your French, and be home safe. :)
#:Erik
On 28 Feb 2000 11:39:14 +0000, Erik Naggum <····@naggum.no> wrote:
>time (for x in 0 1 2 3 4; do for y in 0 1 2 3 4 5 6 7 8 9; do ./allegro -batch -kill & done; done; wait)
That looks like you're starting up 50 copies and waiting for them
all to exit. (If I understand what the '&' does.) What would the
results be if you waited for each to exit before starting the next?
Is Allegro written in Lisp?
Dear Mr. not.for.email,
·············@not.for.spam wrote:
> What would the results be if you waited for each to exit before starting the next?
Could you make it more explicit what tests you are looking for? It could look like you have already made your
mind up about how slow CL is, and are trying to refute all evidence to the contrary...
Best Regards,
:) will
* ·············@not.for.spam
| On 28 Feb 2000 11:39:14 +0000, Erik Naggum <····@naggum.no> wrote:
|
| >time (for x in 0 1 2 3 4; do for y in 0 1 2 3 4 5 6 7 8 9; do ./allegro -batch -kill & done; done; wait)
|
| That looks like you're starting up 50 copies and waiting for them all to
| exit. (If I understand what the '&' does.)
you don't. waiting for zombies doesn't take time, and processes aren't
started up _all_ in parallel. as you could have seen from the output if
you had cared to, the processes clearly start up and terminate fairly
sequentially. this is a pretty good indication that we are not starting
up 50 copies all in parallel, not to mention the fact that we manage to
squeeze 50 full instantiations within one second. regardless of what you
think, the fact that this loop terminates in less than 1 second actually
means that all of them started up and terminated within an average of 20
ms of real time each. and since this is a dual processor system, it
would be pretty stupid not to take advantage of it, so you got what you
asked for: 50 copies started and terminated in less than 1 second. now
is a good time to _believe_, heathen.
| What would the results be if you waited for each to exit before starting
| the next?
a miniscule waste of time in the looping construct. the user and system
time total 1.5 seconds. the real time is < 1 s. you do the math.
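The strictly sequential version of the measurement is easy to run for any command. A rough Python harness (`mean_startup_ms` is an illustrative name; absolute numbers will vary with machine, OS, and cache state):

```python
import subprocess
import sys
import time

def mean_startup_ms(cmd, runs=10):
    # Run cmd to completion `runs` times, strictly one after another,
    # and return the mean wall-clock time per run in milliseconds.
    t0 = time.perf_counter()
    for _ in range(runs):
        subprocess.run(cmd, check=True)
    return (time.perf_counter() - t0) * 1000.0 / runs

# A do-nothing Python process stands in for the Lisp image here.
print(mean_startup_ms([sys.executable, "-c", ""], runs=5))
```

Swapping in the actual executable under discussion (e.g. `["./allegro", "-batch", "-kill"]`) would answer the sequential-startup question directly.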
| Is Allegro written in Lisp?
yes. that is, more than 98% is written in Lisp. (writing a Common Lisp
system in anything else is _really_ painful.) the operating system
interface substrate is written in C because that's what the Unix
operating system requires for portable substrate code. if the Unix
operating system had been intelligently designed and hadn't outboarded so
much of the kernel interface to the C libraries, it would have been
easier to deal with the kernel through small, well-defined system calls
in assembly language, but that's just too much to maintain from version
to version, where the only stable link to the kernel is the C library.
I'm somewhat amazed by the resilience of your prejudices.
#:Erik
In article <················@naggum.no>, Erik Naggum wrote:
> ms of real time each. and since this is a dual processor system, it
> would be pretty stupid not to take advantage of it, so you got what you
> asked for: 50 copies started and terminated in less than 1 second. now
> is a good time to _believe_, heathen.
But doing things in this manner takes advantage of the fact that the OS
already has the code for the Lisp environment in memory, which
causes us to measure the process instantiation time of the environment, NOT
the load-time of the environment.
EITHER ONE can be considered the "Start Time" of the Lisp environment. So
on the one hand, the test is valid, but on the other, it is not. It's also
not realistic, because:
1) Not everyone has an SMP box. I sure don't -- I can't afford one.
2) The test you perform is guaranteed to be under "light load". If the
system is in real-world use, I can guarantee you that test would take longer
than a second (potentially up to a minute depending on the tasks being
performed).
> a miniscule waste of time in the looping construct. the user and system
> time total 1.5 seconds. the real time is < 1 s. you do the math.
Real time is irrelevant; the combined user and system time is what counts.
> yes. that is, more than 98% is written in Lisp. (writing a Common Lisp
> system in anything else is _really_ painful.) the operating system
Well, it's got to be bootstrapped somehow. :-)
> easier to deal with the kernel through small, well-defined system calls
> in assembly language, but that's just too much to maintain from version
> to version, where the only stable link to the kernel is the C library.
Umm...C compiles to assembly language. You can still call the C library
using assembly.
Besides, there's no point -- on many architectures, C produces assembly code
better than most humans do today anyway. You deride C as if it's the
antichrist.
Also, the going trend in kernel designs is to move more and more OUT of the
kernel and into user space, which is where it belongs anyway. The kernel's
job is to maintain the system -- what you do with that system is up to you.
> I'm somewhat amazed by the resilience of your prejudices.
I'm somewhat amazed by yours.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
From: Raymond Toy
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <4nhfepj91t.fsf@rtp.ericsson.se>
>>>>> "Samuel" == Samuel A Falvo <······@garnet.armored.net> writes:
Samuel> In article <················@naggum.no>, Erik Naggum wrote:
>> a miniscule waste of time in the looping construct. the user and system
>> time total 1.5 seconds. the real time is < 1 s. you do the math.
Samuel> Real time is irrelevant; the combined user and system time is what counts.
I don't know about you but I wouldn't care that user+system time is 1
sec if the real time were 1 year. Real time is what really matters.
Samuel> Umm...C compiles to assembly language. You can still call
Samuel> the C library using assembly.
GCC compiles to assembly, but nothing says C has to. That last
sentence has nothing to do with C compiling to assembly, so I don't
follow. Or do you mean compile to machine code?
Ray
* Raymond Toy wrote:
> GCC compiles to assembly, but nothing says C has to. That last
> sentence has nothing to do with C compiling to assembly, so I don't
> follow. Or do you mean compile to machine code?
The Symbolics C compiler compiled to Lisp...
--tim
Centuries ago, Nostradamus foresaw a time when Samuel A. Falvo II would say:
>In article <················@naggum.no>, Erik Naggum wrote:
>> ms of real time each. and since this is a dual processor system, it
>> would be pretty stupid not to take advantage of it, so you got what you
>> asked for: 50 copies started and terminated in less than 1 second. now
>> is a good time to _believe_, heathen.
>
>But doing things in this manner takes advantage of the fact that the OS
>already has the code for the Lisp environment in memory, which
>causes us to measure the process instantiation time of the environment, NOT
>the load-time of the environment.
The exact same thing is true for alternatives. If I've already got
Perl running, trying to spawn 50 instances in a second means that I'm
measuring the time required to instantiate Perl rather than to load it.
That seems to me to be a not unreasonable comparison of apples to apples.
>EITHER ONE can be considered the "Start Time" of the Lisp environment. So
>on the one hand, the test is valid, but on the other, it is not. It's also
>not realistic, because:
>
>1) Not everyone has an SMP box. I sure don't -- I can't afford one.
Oh, well. That probably means that you only get 25 copies spawned in a
second.
>2) The test you perform is guaranteed to be under "light load". If the
> system is in real-world use, I can guarantee you that test would take longer
> than a second (potentially up to a minute depending on the tasks being
> performed).
No, the test was a high load, as it was spawning processes as fast as it
could.
>> a minuscule waste of time in the looping construct. the user and system
>> time total 1.5 seconds. the real time is < 1 s. you do the math.
>
>Real time is irrelevant; the combined user and system time is what counts.
I ran something similar on my "wimpy" AMD 266, 96MB RAM.
I set up null scripts that invoke:
- CLISP
- Perl
- Python
- zsh
- CMUCL
- Guile
Elapsed times to invoke 50 null scripts in each language:
- CLISP: 5.31s
- Perl: 1.71s
- Python: 6.69s
- zsh: 2.06s
- CMUCL: 14.2s
- Guile: 44.4s
CLISP is neither fastest nor slowest. It probably has more functionality
built into the image being invoked than any of the other
languages. CMUCL was the slowest of the CL implementations; it's still
not *outrageously* slow to invoke at about 1/3s each.
Of course, this is a pretty worthless test; it shows how long it takes
to do *NOTHING* with all of these languages.
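For reference, the shape of such a null-invocation test can be sketched in
Python (a stand-in here; the actual tests above used shell loops, and the
command being timed is whatever you substitute):

```python
import subprocess
import sys
import time

def time_null_invocations(argv, n=50):
    """Spawn `argv` n times back to back and return total wall-clock seconds.
    Since the program does nothing, this measures start-up plus exit cost."""
    start = time.perf_counter()
    for _ in range(n):
        subprocess.run(argv, check=True,
                       stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

# The running Python interpreter serves as the "null script" here;
# substitute ["clisp", ...], ["perl", "-e", "0"], etc. to reproduce
# the table above on your own machine.
elapsed = time_null_invocations([sys.executable, "-c", ""], n=10)
print(f"10 invocations: {elapsed:.2f}s total, {elapsed / 10 * 1000:.1f} ms each")
```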
>> easier to deal with the kernel through small, well-defined system calls
>> in assembly language, but that's just too much to maintain from version
>> to version, where the only stable link to the kernel is the C library.
>
>Umm...C compiles to assembly language. You can still call the C library
>using assembly.
Which is to say that the only stable link to the kernel is the C library.
Accessing the Linux kernel via anything else than the C library gives
significant risk of nonportability with future versions of the Linux
kernel.
>Besides, there's no point -- on many architectures, C produces assembly code
>better than most humans do today anyway. You deride C as if it's the
>antichrist.
Most humans produce pretty poor code in whatever language they write in.
>Also, the going trend in kernel designs is to move more and more OUT of the
>kernel and into user space, which is where it belongs anyway. The kernel's
>job is to maintain the system -- what you do with that system is up to you.
I don't think that's at question.
>> I'm somewhat amazed by the resilience of your prejudices.
>
>I'm somewhat amazed by yours.
I'm surprised that you can't believe that someone could spawn 50
Lisp instances in a second. I was only able to do about 10 on *my*
system, but I was:
a) Not running the fastest speed-demon of Common Lisp implementations,
b) Running a fairly old K6-2 CPU.
--
There's no such thing as nonexistence.
········@hex.net - - <http://www.hex.net/~cbbrowne/lisp.html>
In article <······················@knuth.brownes.org>, Christopher Browne wrote:
>The exact same thing is true for alternatives. If I've already got
>Perl running, trying to spawn 50 instances in a second means that I'm
>measuring the time required to instantiate Perl rather than to load it.
Thank you for proving my point.
>That seems to me to be a not unreasonable comparison of apples to apples.
The original question was for load time, if memory serves me correctly. In
either case, I was merely pointing out that "Start Time" is an ambiguous
concept, as there are at least two ways to measure it. :)
>No, the test was a high load, as it was spawning processes as fast as it
>could.
With the computer doing absolutely nothing else in the process. Do you see
the difference here? In a real-world situation, a computer could be
handling network connections, performing back-ups, or otherwise running
other CPU-intensive tasks in the background. That all will definitely
influence (and lengthen) startup times.
>CLISP is neither fastest nor slowest. It probably has more functionality
>built-in in the image that is being invoked than any of the other
>languages. CMUCL was the slowest of the CL implementations; it's still
>not *outrageously* slow to invoke at about 1/3s each.
Certainly not. I wasn't arguing about that though. :)
I was just pointing out that the measurements performed could be misleading
due to the circumstances in which the measurements were made.
>I'm surprised that you can't believe that someone could spawn 50
>Lisp instances in a second. I was only able to do about 10 on *my*
What's there NOT to believe? Did I argue that he's NOT spawning 50 in a
second? Please quote where I stated that.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* Samuel A. Falvo II
| With the computer doing absolutely nothing else in the process. Do you
| see the difference here? In a real-world situation, a computer could be
| handling network connections, performing back-ups, or otherwise running
| other CPU-intensive tasks in the background. That all will definitely
| influence (and lengthen) startup times.
oh, geez, when will this _end_?
I don't have this powerful a machine just to boast about it and play
games. it's a work-horse for serious development, and it has a number of
non-trivial duties. at the time I ran those tests, it turns out that it
was servicing a few thousand FTP requests from local network machines
that were upgrading some software automatically over the span of the few
minutes I ran the tests, it ran a bunch of Netscape frames with animated
advertising GIFs, and it provided monitoring and backup services for 6
other computers on its local network, which involves network traffic and
low CPU consistency checking. it also received four e-mail messages, the
processing of which fires up an Emacs in batch mode to handle the
filtering and processing of the incoming messages. the only thing not
strictly normal about this is the FTP load. regardless, I have no idea
exactly how big this load was during each of the individual _seconds_
that I ran my tests. I have reason to suspect that it had very little
effect on anything because the machine is in fact able to perform the
vast majority of its duties in zero noticeable time -- which is why it is
this powerful to begin with. now, this _could_ explain the 5 ms extra
execution time I noticed, but that's just pure speculation on my part,
and I see little point in spending the time to figure it out.
| I was just pointing out that the measurements performed could be
| misleading due to the circumstances in which the measurements were made.
so let's assume the measurement errors were on the order of 20 vs 25 ms
per invocation. that's the difference between 40 and 50 invocations per
second. this bothers you a great deal, apparently. it doesn't bother me.
and you were "just pointing out" that it could take _minutes_, which is
nothing more than really bad fiction on your part. in _minutes_, this
machine has compiled GNU Emacs from scratch (2:30), built a new Linux
kernel (2:10), installed staroffice (1:20), built CD images for Debian
2.2 (3:10), or upgraded and installed 100 packages (2:50).
to suggest that this machine should suddenly only manage to start 50
Allegro CL processes because of other work it's doing is simply insane.
as long as any goddamn fool can cast doubt on anything anybody says, I
suggest a much more honest starting point: "I don't want to believe you!"
instead of trying to smear whoever is trying to answer their questions.
I'm getting sick of the rampant stupidity that comes with benchmarks and
any other myth-deflating devices. myths, apparently, are necessary for
the mental survival of some people. perhaps it is not a good idea to try
to destroy their misguided beliefs because they turn out to be lunatics
if they can't keep their myths alive and well.
#:Erik
On 03 Mar 2000 08:54:27 +0000, Erik Naggum <····@naggum.no> wrote:
> per invocation. that's the difference between 40 and 50 invocations per
40 vs 50 doesn't bother me at all in this case, but 40 vs 1000 might.
What I really want to know is how long a minimal executable built
by ACL would take to start on my machine. Someone else posted
a measurement that was many times yours, and I'm wondering if
there might be a lot of factors affecting speed besides just the raw
MHz. A better way to measure it might be to write a null program
in C++ and post the ratio between how many times per second it
can run vs how many times per second a null Lisp program can run.
Then I could run the same C++ program on my machine, and use
the same ratio to estimate how long ACL-built executables might
take to start. Or even better, Franz should post some ACL-built
executables on their web site, for just such purposes as this.
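The ratio arithmetic proposed here is simple enough to sketch; the numbers
plugged in below are Pierre Mai's ms-per-process figures from elsewhere in
this thread (G++ 2.95.1 dynamic at 7.91 ms, ACL 5.0 at 48.90 ms), used
purely as an illustration:

```python
def startup_ratio(ms_cpp, ms_lisp):
    """How many times slower the null Lisp program starts than the null
    C++ program on the same machine (per-invocation milliseconds)."""
    return ms_lisp / ms_cpp

def estimate_lisp_ms(local_cpp_ms, ratio):
    """Transfer the ratio to another machine: measure a null C++ program
    locally, multiply by the published ratio.  (Follow-up posts argue,
    with some force, that this transfer is unreliable in practice.)"""
    return local_cpp_ms * ratio

# Pierre Mai's figures: G++ 2.95.1 at 7.91 ms/proc, ACL 5.0 at 48.90 ms/proc.
r = startup_ratio(7.91, 48.90)
print(f"ratio: {r:.1f} to 1")   # roughly 6 to 1, nowhere near 1000 to 1
print(f"estimate where local C++ takes 20 ms: {estimate_lisp_ms(20.0, r):.0f} ms")
```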
Lisp encourages better paradigms than C++, but C++ programmers
aren't likely to adopt those better paradigms till after they have a lot
of experience with Lisp. To really meet their needs, it has to fit not
only the better paradigms but also the ones they already use, even
if it doesn't fit them as well as C++ does. The programmers know
they will be working towards something better, but they need a
foundation to stand on while they work, and that means being able
to do what they do now, and advance from there one step at a time.
If a particular program takes N% longer to run when built by ACL
than when built in C++, that doesn't mean C++ programmers are
going to reject Lisp. They know they get lots of advantages, and
know there is a lot of serious learning to do before they can make
good use of all those advantages, and they're probably willing to
make that tradeoff. But fear of other tradeoffs, such as a 1000 to 1
ratio of the above test, might be what keeps them from proceeding.
By posting the real numbers, even if they turn out to be 10 to 1 or
50 to 1, such fears can be mitigated, and more people are likely
to end up using Lisp. And posting real executable code, which
they can download and test for themselves, might mitigate their
fears even more.
Your post of your numbers was appreciated and surprising. I had
no idea ACL could start that fast on any machine. I'm a lot more
interested in the possibility of using it for a future project now than
I was before. I'm also wondering if LispWorks can start that fast,
and where I could find a good review of what advantages and
disadvantages they have vs each other.
·············@not.for.spam writes:
> 40 vs 50 doesn't bother me at all in this case, but 40 vs 1000 might.
> What I really want to know is how long a minimal executable built
> by ACL would take to start on my machine. Someone else posted
But why would you want to know such a thing? What if I can build a
minimal executable (language and implementation doesn't matter), that
starts 1000 times a second, but once I add a single line of code, it
only starts 10 times a second, or even less. What if nearly any
reasonable program you would want to write in that language/env would
need that line? And what if only 10% of the programs need that line?
Or only exactly your program?
What if something starts 40 times a second on an idle machine, but
once you get only a little load from other programs, some cache or
kernel algorithm starts thrashing, and this drops to 4 times a second?
What if something starts 500 times a second on my machine, but only
twice a second on your machine, although my machine is seemingly not
very different from your machine? What if my OS allows C++ programs
to start up very quickly, because its dynamic linker implementation
isn't brain-dead, whereas your OS's is? Or vice-versa? And what if
on the same OSes, ACL doesn't use your OS's dynamic linker, and thus
starts much much quicker in relation to your C++, as ACL does on my OS
in relation to my C++? Or vice-versa? Or nothing of the sort?
The world's a strange place, the hardware world doubly so, and it's
getting stranger all the time. We've long left that nice cozy world of
"simple" VAXen and PDPs. Today's CPU architectures, memory hierarchies,
MMUs, busses and OSs are strange beasts indeed, and very unpredictable.
Benchmarking is a very, very difficult business, even when you can
benchmark the code you're gonna use on the OS and hardware you're
gonna use, with a load-profile _you think_ will be realistic. It
gets nearly impossible to do right in other situations. Transferring
benchmark results from one platform to another is an exercise in
blind archery after a ride on a rollercoaster: yes, a few Zen masters
will probably hit every time, and some drunken stranger might even hit
out of pure luck, but then again he might hit you. Let's try to avoid
that...
> By posting the real numbers, even if they turn out to be 10 to 1 or
> 50 to 1, such fears can be mitigated, and more people are likely
> to end up using Lisp. And posting real executable code, which
As I'm trying to tell you, there will be no "real" numbers for any
useful definition of real.
> they can download and test for themselves, might mitigate their
> fears even more.
If you are still interested in how fast a "null program" will start up
on ACL on your machine, then download one of the ACL Trial Editions
(which you can get for free), and time this for yourself. If you use
Linux or FreeBSD, timing the following might give you some indication
(but then again it might not. I'm not the one suggesting this is a
useful benchmark, so I'm not going to worry about measurement errors
and the like):
time lisp -qq -Q -kill > /dev/null
or put the line "lisp -qq -Q -kill" 50 times in a shell script
(named time-acl50.sh), and do
time ./time-acl50.sh > /dev/null
Better yet, download the Trial Edition, start writing a useful, if
small, program, and time that instead. Or time anything of real
value, really.
> Your post of your numbers was appreciated and surprising. I had
> no idea ACL could start that fast on any machine. I'm a lot more
> interested in the possibility of using it for a future project now than
> I was before. I'm also wondering if LispWorks can start that fast,
> and where I could find a good review of what advantages and
> disadvantages they have vs each other.
I'd suggest you download the LispWorks Personal Edition, and time
that, except that the PE doesn't allow saving new images (IIRC), and
therefore you won't be able to get a non-GUI image (again AFAIK).
You _might_ run across the same problems trying to do the tests with
ACL on Windows using Franz' free demo version for Windows, but I
wouldn't know that.
Regs, Pierre.
PS: Since C++ and C have come up, let's add them to our senseless
benchmarking table, just to throw more nonsense out into the world:
Running 1000 null processes from a subshell takes:
Implementation         Real(s)  User(s)  Sys(s)  Proc/s  ms/Proc
- CMUCL 2.4.18a        103.598   42.500  60.420    9.652  103.60
- Python 1.5            51.963   43.360   8.260   19.244   51.96
- ACL 5.0               48.900   34.120  14.470   20.449   48.90
- Tclsh 8.0             29.340   22.070   7.110   34.083   29.34
- Python 1.5 (-S)       21.652   16.440   5.050   46.185   21.65
- CLISP 1997-12-06-1    19.034    8.840  10.060   52.537   19.03
- GCL 2.2.1             14.392    6.280   8.000   69.483   14.39
- Perl 5.005            10.191    6.150   3.980   98.125   10.19
- Perl 5.004             9.928    5.690   4.190  100.725    9.93
- BASH 2.02.1(1)         9.226    4.640   4.540  108.389    9.23
- G++ 2.95.1             7.911    4.650   3.210  126.406    7.91
- GCC 2.95.1             3.771    1.620   2.130  265.181    3.77
- ECL 0.27               3.673    0.730   2.940  272.257    3.67
- G++ 2.95.1 -static     2.076    0.490   1.570  481.695    2.08
- GCC 2.95.1 -static     2.040    0.510   1.490  490.196    2.04
See previous postings for test environment.
Non-serious conclusion: Dynamic vs. static linking on Linux 2.2
sometimes makes more difference than language choice. ;)
PPS: Yes the ECL binary in question is statically linked. :)
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
* ·············@not.for.spam
| Or even better, Franz should post some ACL-built executables on their web
| site, for just such purposes as this.
the Franz Inc sales staff and their engineers have related to me in the
past, and I'm sure I'm not misrepresenting them now, that they see
extremely little business value in catering to people who mainly execute
really tiny programs like the null program or "hello, world" programs.
rather, they have told me, and I have reason to believe them, that their
main customers use Common Lisp in large-scale applications. their
pricing model, licensing terms, and their Value Added Reseller programs
all work very well together to indicate to me that they regard themselves
somewhat like Oracle, which also provides a huge environment that people
mainly use to deploy Really Important Applications, not somewhat like
Larry Wall and the Perl team, who provide a large fuzzy toy environment
that people mainly use to deploy Really Unimportant Applications.
catering to the RUA people is antithetical to doing business well with
the RIA people. everybody in the computer business _knows_ this, except
the RUA people, but they don't _actually_ count, even though they think
they do. for some bizarre reason, RUA people think their RUAs grow into
RIAs when in fact they don't. vast networks of half-cooperating RUAs are
actually reimplemented by RIA people into a much smaller and leaner RIA
than the RUA people could ever hope to realize when push comes to shove.
RUA people can graduate into RIA people if they first learn to dispense
with the notion that RUAs _matter_. they don't. really. nobody is
interested in how many RUAs you have written when they are looking for
people to write RIAs. and I _mean_ nobody. RIA people need to show
their ability to deal with complexity by reducing problems by solving the
really big problems. RUA people show their ability to create complexity
by proliferating tiny solutions. if making something you yourself can
use takes 1 unit of time, making something somebody else can use takes 3
units of time, and making a system that somebody else can use to build
something that starts the whole scale all over again, takes 9 units of
time. most people are extremely unprepared to build such systems, yet
this is what it takes to grow an RIA programmer from an RUA programmer.
that's why we need RIAs so people who think they are worth something in
this here overheated industry can write RUAs on top of RIAs and make
their employers happy -- they should not _ever_ believe that because they
are using an RIA to write RUAs, they are somehow equipped to write RIAs.
| To really meet their needs, it has to fit not only the better paradigms
| but also the ones they already use, even if it doesn't fit them as well
| as C++ does.
for some reason, everybody realizes that civil engineering is different
from building a toy city in a sandbox. you can't become a civil engineer
by presenting however many pictures of beautiful sandbox cities. it
takes much more than that, different skills, realizing different needs,
different attitudes, different time lines, different economies. for one
thing, you can't tear up a real city like you can destroy your sandbox
city and you can't just start over if you realize that you failed. this
is the really big difference between RUAs and RIAs. an RUA can be torn
down and replaced on short notice. that's what makes it an RUA. an RIA
can't be torn down without jeopardizing really serious investments, such
as the entire existence of a company.
there is hope for RUA people who are bored of writing small things, but
there is no hope at all for RUA people who still think "hello, world" is
interesting in any way, shape, or form. RIA people think differently,
too -- most of them enjoy discussing large-scale philosophical issues,
and are usually very hostile to the really petty issues that most people
think are inordinately important in their own lives. RUA people are well
suited to deal with their own lives in all its detail. RIA people deal
with thousands and millions of lives in some particular respect.
| The programmers know they will be working towards something better, but
| they need a foundation to stand on while they work, and that means being
| able to do what they do now, and advance from there one step at a time.
this is almost entirely false. it is true in the sense that people need
to make one step at a time to make any serious changes to their lives,
but deciding to go from RUA to RIA is like going from playing doctor with
the kid next door (while yourself a kid -- we're not talking about Visual
Basic, here) to actually putting in the planning and all the effort to
_become_ a doctor some fifteen years later, during which time you don't
play doctor all that much, I can tell you. deciding to go from RUA to
RIA is a _complete_ replacement of your whole mind-set towards what
computers can and should do. (e.g., an RUA person may think it's OK for
a computer to crash. an RIA person thinks of a dying machine the same
way a doctor does about a patient, or a military leader about soldiers:
it should not happen without conscious effort to avoid it to the best of
one's ability.)
| But fear of other tradeoffs, such as a 1000 to 1 ratio of the above test,
| might be what keeps them from proceeding.
no, what keeps them at bay is fear of insufficiency in becoming an RIA
person. trust me on this -- I try every single day to find RIA material
among the hundreds and thousands of RUA people I brush against on the Net
and in real life. perhaps one in 200 people are suitable, and the best
way you can spot them is they are _not_ excited about trifling news and
hyped-up products or stale ideas in new packaging.
| Your post of your numbers was appreciated and surprising. I had no idea
| ACL could start that fast on any machine. I'm a lot more interested in
| the possibility of using it for a future project now than I was before.
I'm sort of glad you appreciate it, but to me, the whole point was to get
_rid_ of your false concerns, not help you validate them. I regret very
much if I did the latter. start-up time is _completely_ irrelevant. as
others have pointed out, if you need to perform a certain task often, you
investigate scaling issues and find that optimizing for scale is a very
different task from optimizing for individual execution. it's somewhat
like optimizing for having fun in your sandbox compared to saving a city
billions of dollars through excellence in civil engineering.
#:Erik
On 03 Mar 2000 22:58:49 +0000, Erik Naggum <····@naggum.no> wrote:
> extremely little business value in catering to people who mainly execute
> really tiny programs like the null program or "hello, world" programs.
That's silly. It should be obvious to you that people who want to
test "hello world" programs do not have such programs as their
main goal. The main purpose of such a program is to minimize
the complexity of a program to explore the issues of compiling,
installing, etc., independently of issues of program complexity.
My interest in null programs is because I happen to presently use
a lot of software in the "pipes and filters" paradigm, and I would
like to replace some of that software with my own versions, which
I might like to write in Lisp.
Note that I am not advocating using "pipes and filters" as a good
paradigm for any particular project. The reason I want to use it
is to be compatible with software I already have. I also want to
use Lisp or some such language for bigger projects, but would
rather use the same language and programming environment
for both types of projects.
Besides Lisp, I'm also investigating a number of other languages
and environments, such as Smalltalk, SML, OCaml, Eiffel, Dylan,
etc.
> like optimizing for having fun in your sandbox compared to saving a city
> billions of dollars through excellence in civil engineering.
That's not a good analogy because Lisp is a lot more like playing
than like doing civil engineering. Civil engineers rely on the
experience of thousands of years of civil engineers who came before
them. Programmers have to rely more on their own experience than
on any such long history. Civil engineers cause disasters that kill
people. Generations of civil engineers that follow them learn from
those disasters. Lisp programmers cause disasters that require them
to redo some work. Lisp programmers can play with their domain
objects to learn how to manage them in their programs. They can
very rapidly and very efficiently educate themselves in their domains
until they develop the knowledge and skills to develop and maintain
high quality software in those domains. Civil engineers are much less
efficiently educated in their domains. They can require years of
specialized study to learn and understand the same depth of domain
details a Lisp programmer can learn and understand in a few weeks
or months of interactive "playing" with the domain.
* ·············@not.for.spam
| On 03 Mar 2000 22:58:49 +0000, Erik Naggum <····@naggum.no> wrote:
|
| > extremely little business value in catering to people who mainly execute
| > really tiny programs like the null program or "hello, world" programs.
|
| That's silly.
then why do you argue that people spend time publishing results in that
area? clearly, your argument is that these things matter a great deal.
but I quite agree that it's silly to be concerned about such things, and
I'm delighted that you recognize silliness when properly framed -- you
might actually recognize that your core argument is indeed very silly.
| It should be obvious to you that people who want to test "hello world"
| programs do not have such programs as their main goal. The main purpose
| of such a program is to minimize the complexity of a program to explore
| the issues of compiling, installing, etc., independently of issues of
| program complexity.
if that _were_ the goal, I'd agree that it would be useful to help people
with such programs. however, it isn't, and you know it isn't. those who
argue for small executables do so on the basis of "overhead", which is
not a question of how much the language needs, but how well the operating
system is able to accommodate its needs. so small executable size is a
tribute to the operating system and the language, while large executable
overhead is a blemish on the operating system. oddly enough, people take
it out on the language. this is not just silly, it's idiotic.
| My interest in null programs is because I happen to presently use a lot
| of software in the "pipes and filters" paradigm, and I would like to
| replace some of that software with my own versions, which I might like to
| write in Lisp.
if you were truly interested, you would be willing to consider many ways
to accomplish your needs. "pipes and filters" does _not_ translate into
"small executable with short startup-up time" except to the permanently
braindamaged C victims. in particular, a good way to make use of Lisp is
to have a very heavy process that maintains a lot of state, but which
tiny C programs talk to via sockets, if this is hard to do directly from
whatever "scripts" are otherwise engaged in the "pipes and filters"
thing. (IMNSHO, the sorry fact that shells have not grown to be able to
make network connections instead of just pipes is _really_ pathetic.)
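The heavy-process-plus-tiny-client arrangement can be sketched as follows;
the version below is Python on both ends purely to keep it self-contained
(the setup described here would be a long-lived Lisp image and a small C
client), and the one-request protocol is invented for the example:

```python
import socket
import threading

def serve(srv):
    """Stand-in for the long-lived, state-carrying Lisp process: the
    counter survives across connections instead of being rebuilt on
    every invocation."""
    count = 0
    while True:
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            if data == b"quit":
                return
            count += 1
            conn.sendall(b"request %d: %s" % (count, data))

def ask(port, msg):
    """The tiny 'filter' end of the pipe: connect, send one request, read
    the reply, exit.  Its start-up cost is one connect(), not one image
    launch."""
    with socket.create_connection(("127.0.0.1", port)) as c:
        c.sendall(msg)
        return c.recv(1024).decode()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))            # pick any free port
srv.listen()
port = srv.getsockname()[1]
threading.Thread(target=serve, args=(srv,), daemon=True).start()

print(ask(port, b"hello"))            # request 1: hello
print(ask(port, b"again"))            # request 2: again
ask(port, b"quit")                    # shut the stand-in server down
```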
| Note that I am not advocating using "pipes and filters" as a good
| paradigm for any particular project. The reason I want to use it is to
| be compatible with software I already have. I also want to use Lisp or
| some such language for bigger projects, but would rather use the same
| language and programming environment for both types of projects.
you can, but you have to zoom out and _think_ about your problem. you
can't expect everything new to fit the same old mold. in this case, the
friggin obvious solution is to write a pipe-and-filter thingy in C that
talks to the Lisp process. that way, you reduce the start-up time to
that of C (which you seem to believe is short) plus the overhead of
connecting to the already running Lisp process, which is, like, _really_
short. if you have problems with this extra "layer" of code, yet observe
that you get dramatically improved performance, which you would if you
tried it instead of just rejecting any other solutions than "run the
program", I'd say you have a political agenda and not an engineering
problem, anymore.
it so happens that _every_ other person who has posted to this newsgroup
about his misgivings about startup times has had a political agenda and a
need to complain rather than get any real work done. you're not in good
company. if you don't like this, you need do nothing more than show that
you have worthy goals with your quest -- and that is best shown by simply
abandoning the bad solutions that you keep complaining about.
| That's not a good analogy because Lisp is a lot more like playing than
| like doing civil engineering.
I'm glad you show me I was right in judging you to be an RUA person, but
really, don't you think I spent all that time with a glimmer of hope that
you might recognize how RIA people _differ_ from yourself in what I
wrote?
time for the lament of the day: it is so often such a terrible _waste_ to
write anything non-mundane to this newsgroup it's truly _exasperating_.
the only thing you fucking dolts care about is whether people use nice
words or bad, and then if you get nice, approved words, your brains seal
shut with "oh, it's nothing dramatically new, so I'll just lull myself
into my cozy old stupidity and enjoy the peace and quiet from not having
to listen to anyone". I get _sick_ of such idiocy and stupidity! many
of you guys seem to want it more than anything else, and some even go out
of their way to _encourage_ nice and cozy, non-threatening stupidity.
you, in particular, don't know much about programming, Mr. anonymous not
for e-mail at not for spam dude, so it would help a lot if you didn't
pretend you did and that you didn't tell people who have outgrown your
childish approach to programming _decades_ ago about how you have _not_
understood that this here programming thing is _not_ about playing in a
sandbox. a few people have tried to share their experience with you, and
you just reject them because you refuse to believe that there's anything
beyond toy code (by our measures, not yours).
#:Erik, actually irritated, for once
On 04 Mar 2000 14:55:59 +0000, Erik Naggum <····@naggum.no> wrote:
> to have a very heavy process that maintains a lot of state, but which
> tiny C programs talk to via sockets, if this is hard to do directly from
Yes, I've done something like that. I used a named pipe and shared
memory, and the big program was not written in Lisp, but it's the same
idea. But in spite of that it's reasonable for me to ask about startup
time, because there is some value in not needing to do it that way, and
if some Lisps can start up a lot faster than others, that is one of many
factors to consider in choosing one vs another.
> talks to the Lisp process. that way, you reduce the start-up time to
> that of C (which you seem to believe is short) plus the overhead of
I've measured null program startup from a lot more than Lisp and C.
In my measurements, C was the fastest, Dylan and Eiffel were about
six times slower, SML was somewhat slower than those, and Lisp
and Smalltalk were about six times slower than SML. I haven't
measured OCaml yet because I don't have the right assembler on
my computer, but will have it soon. The particular implementations
I measured were not necessarily representative, but the measurements
continue. I'm presently downloading some more implementations of
Smalltalk, which became available for download in the past couple of
days.
I have no particular involvement in Lisp, and could just as easily choose
another language. The startup time is just one of many factors to take
into account. The possible need for a foreground/background solution
is a factor, not an obstacle.
> program", I'd say you have a political agenda and not an engineering
What kind of political agenda could I possibly have? Even if my point of
view seems like completely irrational engineering, that doesn't make it
political. I want a programming language and development environment
that meets several criteria, some of which may seem more rational to
you than others. I'm taking a lot of factors into account and probably
giving most of those factors different weights than you would. That
doesn't make me your political enemy.
* ·············@not.for.spam
| What kind of political agenda could I possibly have? Even if my point of
| view seems like completely irrational engineering, that doesn't make it
| political. I want a programming language and development environment
| that meets several criteria, some of which may seem more rational to you
| than others. I'm taking a lot of factors into account and probably
| giving most of those factors different weights than you would. That
| doesn't make me your political enemy.
it seems reasonable to assume that you failed to read the whole sentence
you just quoted a tiny little part of. let me try it again:
if you have problems with this extra "layer" of code, yet observe that
you get dramatically improved performance, which you would if you tried
it instead of just rejecting any other solutions than "run the program",
I'd say you have a political agenda and not an engineering problem,
anymore.
the keyword here is "rejecting any other solutions". being dead set on
exploring only a particular solution space _is_ a political decision on
your part. you can argue for its engineering _necessity_, but it is
still a political decision. believing otherwise does you no good.
you seem to be extraordinarily focused on not seeing your problems other
than in light of how you can solve them with technology you already know.
this is the really exasperating part of trying to tell you something new
that might change your perception of the _problem_, not the solutions.
and as with every other political decision where people get "stuck" in
their pet problems, we find that they don't really want any solutions,
but will go on and on and on and on about their problem. so there's no
telling when some benchmark-crazed doofus will be satisfied, because
there's nothing he actually wants to _know_. such unfocusedness is
rampant in bad engineering circles where political agendas are much more
important than solving problems. you find them here in comp.lang.lisp at
times, too, where someone comes up with something he _desperately_ wants
to do in only one particular way and any suggestions otherwise fall on deaf ears.
#:Erik
EN> ... (IMNSHO, the sorry fact
EN> that shells have not grown to be able to make network
EN> connections instead of just pipes is _really_ pathetic.)...
Actually, this can be remedied reasonably easily using little programs like
netcat. (goes by the name 'nc' usually).
BM
* Bulent Murtezaoglu <··@acm.org>
| Actually, this can be remedied reasonably easily using little programs like
| netcat. (goes by the name 'nc' usually).
... which is the little C program that starts up in no time, right?
if the shells could do their own network connections, there wouldn't be
any need to start up those little programs. after all, the shells don't
run small programs to do filename globbing anymore, and numerous other
common tasks have been incorporated into the shells, simply because it
makes a lot more sense to incorporate them than to run small programs all
the time, partly because start-up time for even small programs begins to
matter when you have to do it hundreds of times because everything you
_do_ is made up of a whole school of tiny little programs.
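The claim about shells absorbing common tasks is easy to check from a prompt; a small illustration, assuming a POSIX shell (the exact wording of the output varies by shell):

```shell
# Common operations were pulled into the shell itself as builtins, so they
# cost no process startup; rarer ones still fork+exec an external program.
type echo    # reported as a shell builtin: no startup cost at all
type ls      # reported as an external path: every use pays fork+exec
```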
in case it hasn't become obvious by now: the more people get good at
writing small programs that run in "barely noticeable time" each, the
more silly things like start-up time matter to them. the more they get
good at these silly things, the less intelligently they design their
software, and the less likely they are ever to produce software that
doesn't consist of tiny little fragments of code that never quite work
together.
when you reinvent serious programming languages in scripting languages,
which people have been doing in the Unix world for ages, what you get is
a lot of people who can do useful things in no time, and no people who
can figure out how to do stuff that obviates the need for tiny hacks or
at least that curbs their dramatic increase. the result is a never-ending
increase in the need for more tiny little programs, which costs
all parties involved in the processes a lot of money, and which drives up
the cost of hiring and doing business. the only people who profit from
this development are bad programmers.
I see no reason why Common Lisp should take part in that development.
instead, we should try to explain to people who think they have to hire
bad programmers that they don't have to -- they could hire a Common Lisp
programmer who knows how to change a mass of RUAs into a coherent system
that it takes far less effort to build and maintain than just to keep the
old system running. it's somewhat like the difference between a mass of
disorganized files and information strewn all over the place and a real
database system. and the funny thing is: some people _do_ get the idea.
#:Erik
[on netcat]
EN> ... which is the little C program that starts up in no time,
EN> right?
Well, I don't know how long it takes to start up. Usually whatever it
is you will be doing with the network dominates the time it takes to
use it, I suppose. But the specifics of netcat are not your point, anyway.
EN> if the shells could do their own network connections, there
EN> wouldn't be any need to start up those little programs.
Yes, though I for one don't think I would like it for shells to get
even more bloated than they are. Following your terminology, if the
task at hand is Really Unimportant, I don't particularly care what
little programs I would start up. If you are sitting at a Unix shell
prompt, needing to do something one-off or not terribly time critical,
you have already been dealt your hand and it is clear what you need to
do to play the game to a successful conclusion.
EN> after
EN> all, the shells don't run small programs to do filename
EN> globbing anymore, and numerous other common tasks have been
EN> incorporated into the shells, simply because it makes a lot
EN> more sense to incorporate them than to run small programs all
EN> the time, partly because start-up time for even small programs
EN> begins to matter when you have to do it hundreds of times
EN> because everything you _do_ is made up of a whole school of
EN> tiny little programs.
Yes, this is true from an elegance point of view. Olin Shivers makes a
similar argument in his scsh paper. I agree with him and I agree with
you. There are two semi-distinct arguments here, though. One concerns
the inelegance and inefficiency of the Unix way of doing things with
lots of little programs glued together by shell scripts and/or pipes.
This is mostly an aesthetic argument as far as I am concerned. These
things work just fine for Real Unimportant Tasks. I think the more
significant point, which is distinct from the first, is what you say
below:
EN> in case it hasn't become obvious by now: the more people get
EN> good at writing small programs that run in "barely noticeable
EN> time" each, the more silly things like start-up time matter to
EN> them. the more they get good at these silly things, the less
EN> intelligently they design their software, and the less likely
EN> they are ever to produce software that doesn't consist of tiny
EN> little fragments of code that never quite work together.
This is an important observation and precisely why people entering the
field by writing shell scripts need to somehow (at school? by mentors at
work?) be told that even though what they know how to do works and works
fine for Real Unimportant/Simple Tasks, it most certainly is NOT the one
true way of doing things. When this is not done,
EN> when you reinvent serious programming languages in scripting
EN> languages, which people have been doing in the Unix world for
EN> ages, what you get is a lot of people who can do useful things
EN> in no time, and no people who can figure out how to do stuff
EN> that obviates the need for tiny hacks or at least that curbs
EN> their dramatic increase.
Yes. So it gives rise to inefficiency, and a waste of _probable_ talent.
The silver lining, IMHO, is that most of these little hacks only eat up
human resources once and then they are shared.
EN> the result is a never-ending
EN> increase in the need for more tiny little programs, which
EN> costs all parties involved in the processes a lot of money,
EN> and which drives up the cost of hiring and doing business.
EN> the only people who profit from this development are bad
EN> programmers.
I am not sure _I_ have seen enough evidence for this conclusion.
Clearly, ignorance passing as expertise would be more likely to be costly
than not. I am not sure that cost is paid by businesses, it might be
spread out to society at large. But if we argue in this vein, then
we probably need to talk about non-monetary costs (eg the hypothetical
smart kid who could find a cure for cancer making a decent living as a bad
programmer hacking up HTML-generating Visual Basic for ipo.com.)
I am not willing to have this discussion in cll, though I would listen
to it elsewhere.
EN> I see no reason why Common Lisp should take part in that
EN> development. instead, we should try to explain to people who
EN> think they have to hire bad programmers that they don't have
EN> to -- they could hire a Common Lisp programmer who knows how
EN> to change a mass of RUAs into a coherent system that it takes
EN> far less effort to build and maintain than just to keep the
EN> old system running.
I agree that this would be possible if people could also be convinced
that Common Lisp programmers can be found by making a few phone calls.
They cannot be found that easily. If anyone pays me for my opinion on
anything like this, I am probably more likely to say get 5 Perl hacks
and a slave driver, because I know that can be done, than get Erik Naggum
and clone a spare. Depends on what the project is, of course. I am
assuming that a mass of RUAs can do a passable job.
EN> it's somewhat like the difference between
EN> a mass of disorganized files and information strewn all over
EN> the place and a real database system. and the funny thing is:
EN> some people _do_ get the idea.
I think you are being too optimistic here. In the case of databases,
they don't get the idea, they just follow the best practice as it is
widely known (which can be done sheepishly).
BM
On 05 Mar 2000 01:37:04 +0000, Erik Naggum <····@naggum.no> wrote:
> at least that curbs their dramatic increase. the result is a never-ending
> increase in the need for more tiny little programs, which costs
[...]
> bad programmers that they don't have to -- they could hire a Common Lisp
> programmer who knows how to change a mass of RUAs into a coherent system
Is the scsh design a step in the right direction? I would appreciate your
comments or opinions on this issue.
For those who are not familiar with scsh:
http://www-swiss.ai.mit.edu/ftpdir/scsh/
In particular, check the paper "A Universal Scripting Framework / or /
Lambda: the ultimate ``little language''" among the publications of Olin
Shivers. His site is:
http://www.ai.mit.edu/~shivers/
Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/
curd:
doctor Naggum <····@naggum.no> at 04 Mar 2000 14:55:59 +0000 lamented:
> time for the lament of the day: it is so often such a terrible _waste_ to
> write anything non-mundane to this newsgroup it's truly _exasperating_.
> the only thing you fucking dolts care about is whether people use nice
> words or bad, and then if you get nice, approved words, your brains seal
> shut with "oh, it's nothing dramatically new, so I'll just lull myself
> into my cozy old stupidity and enjoy the peace and quiet from not having
> to listen to anyone". I get _sick_ of such idiocy and stupidity! many
> of you guys seem to want it more than anything else, and some even go out
> of their way to _encourage_ nice and cozy, non-threatening stupidity.
and my darling doctor, YOU need to realize that interjecting shit into
communication not only shocks people to attention, but also reduces the
payload to the point of being redirected to a toilet. Such vehemence has the
following effects on the negative side:
* it becomes a drudging task for your fans to separate the husk from kernel.
* it divides your audience into the tolerant and the intolerant. The tolerant
are usually more learned and likely to become members of your fan club; but
the intolerant will write you off and ensure dramatic noise for this
newsgroup, which only serves as fodder for more of your scatology. (which
attracts gadflies like me: world's wardens & prigs aghast with horror.)
With your and Kent's many excuses for CL's warts by attribution to human
nature, why can you not appreciate the needy human nature enough to extend
the benefit of the doubt in confrontation? We are not machines that take
thought equations for input and output apologies or admissions. A minimal
civility in one-to-one communication is especially lubricative among the
above-average people in this group that are full of ego. Your rare apologies
and admissions of error are at best a terse slur amidst brimming taunts. By
not observing this human need satisfactorily, the result is proportionally
chaotic and silly. Not all the educated hold exactly the same views on the
quality criteria or manners of postings. You adamantly behave as if your
judgements are the only ones.
Too hard a clout on the head knocks people dead and wastes your energy. Why
do you have such a morbid fascination with summoning up tremendous energy to
clout people? You only wanted to slap them awake, didn't you? In any case you
are NOT the god of omniscience; besides, you at times happy-go-luckily clout
people with personal opinions that are not exactly kosher. (dissipation
besmirching the virtues of discipline at work here.) With your style, the
distinction between a crotchet and 24-karat gold is fuzzy or intentionally
blurred. This is a characteristic of a charlatan. Have you become aware of
such an attribution in your behavior? On the surface, the stated goal of
education; beneath, the smirk at fooling sophomores and insulting fools &
wise equally. In the middle, another wandering demented victim of knowledge
with blood that boils eggs.
Erik Naggum <····@naggum.no> sometimes in 02/18 +/- 5 days in this year at
probably somewhere in Norway of Planet Earth of Solar system wrote:
> it's funny how you guys have to take so vocal parts in what you seem to
> dislike that I do, and overdo it in so stupid ways, too -- I have yet to
> see one of you being able to interject any technical contents to your
> flames about me: they're all about how much you dislike me or what I do;
I'm quite incapable of interjection, only of ejaculation at your dictation.
But i ask you: please do not interject. Leave interjections to advertisement
agencies. Do like me in this message for example, where one part is the
curd, and what follows is the whey clearly separated. This way, people can
read your technical or sound philosophical offerings with perspicuity, awoke
to your wrath with certainty, appreciate your help with sincerity, and
banter with your expletives with jocularity.
---------------------------------------------------
whey:
you fucking stupid shit. You, YOU, fucking stupid. Tooo stupid because i
feel to call you so. So stupid and moronic. Let's see how many words that
describes stupidity? (pull out my thesaurus: 1. [n.] A person lacking in
good sense: fool, imbecile, idiot, blockhead, nincompoop, nitwit, dimwit,
simpleton, dullard, jackass, ninny, tomfool, twit, dope, goose, donkey,
bonehead (informal), ding-a-ling (informal), dingbat (slang), clod, boob
(informal), dummy, clodhopper, jerk (slang), nit (British), schmo, turkey,
loon, mooncalf, genius (antonym), expert (antonym), master (antonym), sage
(antonym), savant (antonym) , 2. [n.] A mentally deficient person, half-wit,
cretin, feeble-minded person, idiot, mental defective, dullard, imbecile,
mentally retarded person, simpleton.) No, i don't feel like editing it and
making it look nice. Be glad that i copied this from my thesaurus and pasted
it here for you to enjoy the myriad ways to address you properly. Be glad
that you don't know shit so that you can enjoy your blessing of idiocy and
not get offended at all. Live like a veggie and die like a veggie. Live
looong, like a veggie. You veggie. You vegan veggie. You veggie vegan. You
pussy. Veggie and pussy. Yes, you, YOU, Y O U.
I'm nothing but a parasite of the brainy. As long as you are alive and
_kicking_ on newsgroups, i'm alive and kicking you too. What a _splendid_
symphony of symbiosis in the desert of arid technicality.
> #:Erik, actually irritated, for once
ecstatically,
Xah
···@xahlee.org
http://xahlee.org/PageTwo_dir/more.html
* not for email wrote:
> Civil engineers cause disasters that kill
> people. Generations of civil engineers that follow them learn from
> those disasters. Lisp programmers cause disasters that require them
> to redo some work.
That is a stupid thing to say. Lisp programmers, just like other
programmers, can cause disasters which kill people.
--tim
On 04 Mar 2000 18:04:24 +0000, Tim Bradshaw <···@cley.com> wrote:
>That is a stupid thing to say. Lisp programmers, just like other
>programmers, can cause disasters which kill people.
My point is not about the fatality of the disaster but about the time
lines involved. Civil engineers can't rely on their own experience
because they can't get enough experience to do their jobs. They
have to rely on the experience of thousands of years of civil
engineering. Lisp programming is entirely different. You can see
what you're doing, and can see its effects, before you commit to
doing it that way. Lisp programming involves learning how to do
what you want to do while you do it. Civil engineering requires
learning everything before you do anything. Civil engineering
uses the waterfall paradigm. That paradigm has been shown to
be a failure in software development. Thus programming is not
at all like civil engineering. The point I was refuting was that Lisp
programming is like civil engineering. It's not.
* not for email wrote:
> My point is not about the fatality of the disaster but about the time
> lines involved. Civil engineers can't rely on their own experience
> because they can't get enough experience to do their jobs.
Neither can programmers. Look at old code (not just in Lisp, in any
language) if you don't believe that. Look at language design.
There's a reason things are now different (better, perhaps), and
that's because people are learning from others' experience.
> The point I was refuting was that Lisp programming is like civil
> engineering. It's not.
It's much more like it than most people think. If programmers (lisp
included) behaved a bit more like civil engineers we wouldn't have so
much fouled up and broken software to deal with, and we wouldn't spend
so much time repeating the same mistakes over and over.
--tim
* ·············@not.for.spam
| Lisp programming is entirely different. You can see what you're doing,
| and can see its effects, before you commit to doing it that way. Lisp
| programming involves learning how to do what you want to do while you do
| it. Civil engineering requires learning everything before you do
| anything. Civil engineering uses the waterfall paradigm.
your belief system is severely misguided, and also self-reinforcing in a
sense that will make it impossible for you ever to graduate into serious
software development of the Really Important Application kind.
| That paradigm has been shown to be a failure in software development.
| Thus programming is not at all like civil engineering. The point I was
| refuting was that Lisp programming is like civil engineering. It's not.
I'm sorry to burst your bubble, Mr. ·············@not.for.spam, but the
waterfall paradigm works just fine at the coarse development level.
since you apparently only build Really Unimportant Applications, where
there _is_ no coarse development level, only the details level that you
keep describing with very good accuracy, you're missing the point: that
there is _more_ than the nitty-gritty details level.
but I give up. people who aren't equipped to understand big pictures
will only get increasingly hostile and adamant that only their small
pictures exist when you try to force them to open their eyes.
#:Erik
Centuries ago, Nostradamus foresaw a time when
·············@not.for.spam would say:
>On 03 Mar 2000 22:58:49 +0000, Erik Naggum <····@naggum.no> wrote:
>
>> extremely little business value in catering to people who mainly execute
>> really tiny programs like the null program or "hello, world" programs.
>
>That's silly. It should be obvious to you that people who want to
>test "hello world" programs do not have such programs as their
>main goal.
That is not at all obvious. Anyone who makes up a test merely based
on performing "Hello, world!" is quite evidently oriented towards
creating tiny programs of trivial importance.
>The main purpose of such a program is to minimize
>the complexity of a program to explore the issues of compiling,
>installing, etc., independently of issues of program complexity.
Yes, the purpose of *creating* such a program is to explore how one
creates a minimal program, which is useful enough in puzzling through
the way the computing environment works.
But if you do a BENCHMARK based on this, and consider the statistics
to be of some value, it is stupid to do so unless the statistics are
supposed to be representative of what you're trying to do.
>My interest in null programs is because I happen to presently use
>a lot of software in the "pipes and filters" paradigm, and I would
>like to replace some of that software with my own versions, which
>I might like to write in Lisp.
... Which makes the error of thinking that "pipes and filters" involve
null programs ...
The time when the "UNIX filter" notion starts to get particularly
valuable is when there are nontrivial quantities of data getting
thrown at them.
>Note that I am not advocating using "pipes and filters" as a good
>paradigm for any particular project. The reason I want to use it
>is to be compatible with software I already have. I also want to
>use Lisp or some such language for bigger projects, but would
>rather use the same language and programming environment
>for both types of projects.
Here I'll take a rather different tack from the (possibly less-than
constructive) comments #erik made; there's no fundamental problem with
using "pipes and filters" in Common Lisp as it's got a reasonably rich
set of functions relating to "streams." You can create "streams" that
accept/produce data and that you then connect to functions that will
process them.
That is virtually the same thing that UNIX provides with
pipes/filters, except that Lisp makes them all into first-class
objects so that you may give them names and preserve them.
Thus, if you create a function, SCAT, that takes a set of input
streams as input, and outputs their contents to an output stream, you
could do:
;;; Stream Catenate
(defun scat (instreams outstream)
(dolist ;;; For each stream in the input list
#'(lambda (ins)
(with-open-stream (s ins) ;; read each line
(write s outstream)) ;; write to output stream
nil)
instreams))
;;; Declare some filenames...
(setf inp1n (make-pathname :directory '(:absolute "tmp") :name "file1")
      inp2n (make-pathname :directory '(:absolute "tmp") :name "file2")
      inp3n (make-pathname :directory '(:absolute "tmp") :name "file3")
      out4n (make-pathname :directory '(:absolute "tmp") :name "file4"))
;;; Open the files for the filenames...
(setf inp1 (open inp1n :direction :input)
      inp2 (open inp2n :direction :input)
      inp3 (open inp3n :direction :input)
      out4 (open out4n :direction :output))
(scat (list inp1 inp2 inp3) out4)
[Note: this code doesn't *quite* work... There is something just a
bit wrong with SCAT, and I'd not mind being illuminated as to what
brain-o I committed here...]
The critical thing that happens here is that you *don't* spawn a
separate Lisp session for every component of this that comes along,
just the same way that you don't spawn an extra instance of the Korn
shell for every line that you add to a shell script.
You merely evaluate (scat) again, which doesn't mandate restarting the
whole Lisp system.
--
"We all know Linux is great...it does infinite loops in 5 seconds."
(Linus Torvalds about the superiority of Linux at the Amsterdam Linux
Symposium)
········@ntlug.org- <http://www.ntlug.org/~cbbrowne/lsf.html>
········@news.hex.net (Christopher Browne) writes:
Skipping all the good reasoning, going directly to the probable brain-o:
> ;;; Stream Catenate
> (defun scat (instreams outstream)
> (dolist ;;; For each stream in the input list
^^^^^^
Didn't you intend this to be mapc?
> #'(lambda (ins)
> (with-open-stream (s ins) ;; read each line
If SCAT is supposed to read each line, you'll need a proper body
here.
> (write s outstream)) ;; write to output stream
> nil)
> instreams))
What you probably meant to write:
(defun scat (instreams outstream)
  (with-open-stream (out outstream)
    (dolist (stream instreams instreams)
      (with-open-stream (in stream)
        (loop for line = (read-line in nil nil)
              while line
              do (write-line line out))))))
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
Centuries ago, Nostradamus foresaw a time when Pierre R. Mai would say:
>········@news.hex.net (Christopher Browne) writes:
>
>Skipping all the good reasoning, going directly to the probable brain-o:
I'd not used with-open-stream yet, so I'm not surprised at all that it
blew up...
>> ;;; Stream Catenate
>> (defun scat (instreams outstream)
>> (dolist ;;; For each stream in the input list
> ^^^^^^
>Didn't you intend this to be mapc?
>
>> #'(lambda (ins)
>> (with-open-stream (s ins) ;; read each line
>
>If SCAT is supposed to read each line, you'll need a proper body
>here.
>
>> (write s outstream)) ;; write to output stream
>> nil)
>> instreams))
>
>What you probably meant to write:
>
>(defun scat (instreams outstream)
>  (with-open-stream (out outstream)
>    (dolist (stream instreams instreams)
>      (with-open-stream (in stream)
>        (loop for line = (read-line in nil nil)
>              while line
>              do (write-line line out))))))
Cool. Thanks!
In any case, the way that the gentle user *ought* to script up
things-vaguely-resembling-UNIX-scripts would be to create some
utilities and filters based on stuff like SCAT, and thereby have the
filters sit inside the Lisp instance.
Alternatively, if there are to be some "pipes" opened to UNIX
processes, it would make more sense to have Lisp invoke this, and
establish the streams from inside the Lisp environment than to build
ten-line Lisp scripts that are "duct-taped" together from the UNIX
side of the world.
--
Users should cultivate an ability to make the simplest molehill into a
mountain by finding controversial interpretations of innocuous
sounding statements that the sender never intended or imagined.
-- from the Symbolics Guidelines for Sending Mail
········@ntlug.org - - <http://www.ntlug.org/~cbbrowne/lsf.html>
In article <················@naggum.no>,
Erik Naggum <····@naggum.no> wrote:
> * ·············@not.for.spam
> | Or even better, Franz should post some ACL-built executables on their
> | web site, for just such purposes as this.
>
> the Franz Inc sales staff and their engineers have related to me in the
> past, and I'm sure I'm not misrepresenting them now, that they see
> extremely little business value in catering to people who mainly execute
> really tiny programs like the null program or "hello, world" programs.
> rather, they have told me, and I have reason to believe them, that their
> main customers use Common Lisp in large-scale applications. their
> pricing model, licensing terms, and their Value Added Reseller programs
> all work very well together to indicate to me that they regard
> themselves somewhat like Oracle, which also provides a huge environment
> that people mainly use to deploy Really Important Applications, not
> somewhat like Larry Wall and the Perl team, who provide a large fuzzy
> toy environment that people mainly use to deploy Really Unimportant
> Applications.
>
> catering to the RUA people is antithetical to doing business well with
> the RIA people. everybody in the computer business _knows_ this, except
> the RUA people, but they don't _actually_ count, even though they think
> they do. for some bizarre reason, RUA people think their RUAs grow into
> RIAs when in fact they don't. vast networks of half-cooperating RUAs
> are actually reimplemented by RIA people into a much smaller and leaner
> RIA than the RUA people could ever hope to realize when push comes to
> shove.
>
> RUA people can graduate into RIA people if they first learn to dispense
> with the notion that RUAs _matter_. they don't. really. nobody is
> interested in how many RUAs you have written when they are looking for
> people to write RIAs. and I _mean_ nobody. RIA people need to show
> their ability to deal with complexity by reducing problems by solving
> the really big problems. RUA people show their ability to create
> complexity by proliferating tiny solutions. if making something you
> yourself can use takes 1 unit of time, making something somebody else
> can use takes 3 units of time, and making a system that somebody else
> can use to build something that starts the whole scale all over again,
> takes 9 units of time. most people are extremely unprepared to build
> such systems, yet this is what it takes to grow an RIA programmer from
> an RUA programmer. that's why we need RIAs so people who think they
> are worth something in this here overheated industry can write RUAs on
> top of RIAs and make their employers happy -- they should not _ever_
> believe that because they are using an RIA to write RUAs, they are
> somehow equipped to write RIAs.
>
> | To really meet their needs, it has to fit not only the better
> | paradigms but also the ones they already use, even if it doesn't fit
> | them as well as C++ does.
>
> for some reason, everybody realizes that civil engineering is different
> from building a toy city in a sandbox. you can't become a civil
> engineer by presenting however many pictures of beautiful sandbox
> cities. it takes much more than that, different skills, realizing
> different needs, different attitudes, different time lines, different
> economies. for one thing, you can't tear up a real city like you can
> destroy your sandbox city and you can't just start over if you realize
> that you failed. this is the really big difference between RUAs and
> RIAs. an RUA can be torn down and replaced on short notice. that's
> what makes it an RUA. an RIA can't be torn down without jeopardizing
> really serious investments, such as the entire existence of a company.
>
> there is hope for RUA people who are bored of writing small things,
> but there is no hope at all for RUA people who still think "hello,
> world" is interesting in any way, shape, or form. RIA people think
> differently, too -- most of them enjoy discussing large-scale
> philosophical issues, and are usually very hostile to the really petty
> issues that most people think are inordinately important in their own
> lives. RUA people are well suited to deal with their own lives in all
> their detail. RIA people deal with thousands and millions of lives in
> some particular respect.
>
> | The programmers know they will be working towards something better,
> | but they need a foundation to stand on while they work, and that
> | means being able to do what they do now, and advance from there one
> | step at a time.
>
> this is almost entirely false. it is true in the sense that people
> need to make one step at a time to make any serious changes to their
> lives, but deciding to go from RUA to RIA is like going from playing
> doctor with the kid next door (while yourself a kid -- we're not
> talking about Visual Basic, here) to actually putting in the planning
> and all the effort to _become_ a doctor some fifteen years later,
> during which time you don't play doctor all that much, I can tell you.
> deciding to go from RUA to RIA is a _complete_ replacement of your
> whole mind-set towards what computers can and should do. (e.g., an
> RUA person may think it's OK for a computer to crash. an RIA person
> thinks of a dying machine the same way a doctor does about a patient,
> or a military leader about soldiers: it should not happen without
> conscious effort to avoid it to the best of one's ability.)
>
> | But fear of other tradeoffs, such as a 1000 to 1 ratio of the above
> | test, might be what keeps them from proceeding.
>
> no, what keeps them at bay is fear of insufficiency in becoming an RIA
> person. trust me on this -- I try every single day to find RIA
> material among the hundreds and thousands of RUA people I brush
> against on the Net and in real life. perhaps one in 200 people are
> suitable, and the best way you can spot them is they are _not_ excited
> about trifling news and hyped-up products or stale ideas in new
> packaging.
>
> | Your post of your numbers was appreciated and surprising. I had no
> | idea ACL could start that fast on any machine. I'm a lot more
> | interested in the possibility of using it for a future project now
> | than I was before.
>
> I'm sort of glad you appreciate it, but to me, the whole point was to
> get _rid_ of your false concerns, not help you validate them. I
> regret very much if I did the latter. start-up time is _completely_
> irrelevant. as others have pointed out, if you need to perform a
> certain task often, you investigate scaling issues and find that
> optimizing for scale is a very different task from optimizing for
> individual execution. it's somewhat like optimizing for having fun in
> your sandbox compared to saving a city
> billions of dollars through excellence in
civil engineering.
>
> #:Erik
>
what planet are you from? your generalized RUA name doesn't make sense.
Are you talking about OpenSource developers as a whole? Explain please.
I'll admit, I haven't written RIAs. But what do you mean by saying RIAs
are different from RUAs? Is the Linux OS an RIA or an RUA? sure, the
processes are different between corporations and open source, but I don't
see how that matters, as Linux is an excellent OS, and most commercial
ones are simply adequate.
I'll also admit there are craploads of apps under open source that are
RUAs, but that's a no-brainer; these are often dead-wood projects,
projects-in-embryo, or just flat-out failures that people started for
fun. But the funny thing is, these RUAs DO turn into RIAs often. Maybe
you have a chip on your shoulder? :^)
please do explain.
······@uswest.net
Sent via Deja.com http://www.deja.com/
Before you buy.
* ··········@my-deja.com
| what planet are you from?
| your generalized RUA name doesn't make sense.
| Maybe you have a chip on your shoulder? :^)
| please do explain.
really? you're a waste of space. go away.
#:Erik
In article <·················@news.earthlink.net>,
·············@not.for.spam wrote:
> 40 vs 50 doesn't bother me at all in this case, but 40 vs 1000 might.
Why?
I rarely care about the difference between 40 and 3000ms for C++
startup because my C++ programs run for several seconds during
almost all of their development and useful life. The only times
I notice C++ startup are when I'm testing "do nothing" and when I'm
piping a number of things with null inputs. Neither is common.
However, I probably wouldn't even notice a lisp startup that took
10-15 seconds.
The difference comes from the different way that one uses a lisp
image vs using a C++ image. During development, I reload my lisp
image far more often than most people, which means that I reload
it every 5-10 edits, or 2-3 times an hour, or easily an order of
magnitude less often than I reload C++. I stay in a lisp image for
quite a while, doing lots of stuff that would require C++ recompiles.
Everyone I know reloads far less often - some people don't reload for
days. During production, the image runs for quite a while, so again
startup time is almost irrelevant.
Hmm - startup time doesn't include compile-time. I'll bet that most
people spend more time waiting for a C++ compile than they do waiting
for startup. During some stages of development, I compile constantly,
and don't run at all. Most people using lisp tend to run far more
than they compile. (I'm using "compile" in the sense of "wait while
some tool pokes around your program", so it doesn't include JITs, which
merely affect run-time.)
I'm reminded of a comparison between the boot times of early Sun
boxes and Lisp machines. The Sun boxes then could reboot in well
under a minute, which was important, because rebooting them was
common - some people had to reboot dozens of times/day. The Lisp
machines in that era took 10-15 minutes to reboot, but people didn't
reboot for weeks at a time, so the reboot time was far less important.
Not only did people spend less time rebooting lisp machines, the
time lost while rebooting was less. (You can't wander off and do
something else during "I've got to figure out why it crashed again"
while you can during a "it's Monday, so I'll reboot while I
get coffee and see who's playing chess".)
BTW - There are granularity and context issues. For example, the
difference between 10ms and 500ms can be important in some cases
(like character processing) and irrelevant in others (like startup).
Meanwhile, the difference between 3 minutes and 5 minutes is almost
always irrelevant because 3 minutes is a huge break in concentration
and once a human has task-switched, there's no big penalty to staying
away longer.
-andy
·············@not.for.spam writes:
> Your post of your numbers was appreciated and surprising. I had
> no idea ACL could start that fast on any machine. I'm a lot more
> interested in the possibility of using it for a future project now than
> I was before. I'm also wondering if LispWorks can start that fast,
> and where I could find a good review of what advantages and
> disadvantages they have vs each other.
Such as? Obvious decision criteria I'd suggest include
- size as displayed by ls -l
- size(1) output
- default font height in IDE (if supplied)
- time taken to run an infinite loop
- maybe run "strings" on the executable and see how many obscene words
it contains
-dan "I hear `mauve' has the most RAM"
In article <················@naggum.no>, Erik Naggum wrote:
> oh, geez, when will this _end_?
When you give a more precise description of the testing environment, which
you then proceeded to do. I would like to thank you for completing the
picture for me.
> non-trivial duties. at the time I ran those tests, it turns out that it
> was servicing a few thousand FTP requests from local network machines
Not bad; this is about as heavy as a streaming backup would be. And with 6
other backups going concurrently, worst case, that would be the type of load
that I'm talking about.
> so let's assume the measurement errors were on the order of 20 vs 25 ms
> per invocation. that's the difference between 40 and 50 invocations per
> second. this bothers you a great deal, apparently. it doesn't bother me.
No it doesn't bother me. I was just pointing out that the conditions of the
test would affect the results. Upon further investigation, and a repost
from another reader of the newsgroup, I'd apparently glossed over where
you'd stated that already.
> as long as any goddamn fool can cast doubt on anything anybody says, I
> suggest a much more honest starting point: "I don't want to believe you!"
> instead of trying to smear whoever is trying to answer their questions.
I personally don't think my text is "smearing." If you feel that way, you
should have said this up front, instead of attempting to smear back at me.
I'm human -- ergo, I'm not perfect. And neither are you.
> I'm getting sick of the rampant stupidity that comes with benchmarks and
> any other myth-deflating devices. myths, apparently, are necessary for
> the mental survival of some people. perhaps it is not a good idea to try
> to destroy their misguided beliefs because they turn out to be lunatics
> if they can't keep their myths alive and well.
This, arguably, is itself a myth.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* Samuel A. Falvo II
| But doing things in this manner takes advantage of the fact that the OS
| already has the code for the Lisp environment already in memory, which
| causes us to measure the process instantiation time of the environment, NOT
| the load-time of the environment.
this is amazingly misguided. the load-time of the environment is related
to such tasks as mapping pages of memory, handling shared libraries
(which is _very_ expensive), and running a bunch of initialization code.
_none_ of this is magically remembered from one instantiation of the
system to the next. (and doing so would be a veritable _disaster_.)
| 2) The test you perform is guaranteed to be under "light load". If the
| system is in real-world use, I can guarantee you that test would take longer
| than a second (potentially up to a minute depending on the tasks being
| performed).
this is getting _really_ silly, but it goes to show that some people will
do just about anything to deny the fact that Allegro Common Lisp starts
up real fast on modern computers. which sort of proves my point...
| Umm...C compiles to assembly language. You can still call the C library
| using assembly.
look, are you for _real_? Common Lisp compiles to assembly, damnit!
| You deride C as if it's the antichrist.
oh, shut up and return to your home planet. I do no such thing.
#:Erik
In article <················@naggum.no>, Erik Naggum wrote:
> this is amazingly misguided. the load-time of the environment is related
> to such tasks as mapping pages of memory, handling shared libraries
> (which is _very_ expensive), and running a bunch of initialization code.
> _none_ of this is magically remembered from one instantiation of the
> system to the next. (and doing so would be a veritable _disaster_.)
But **ALL** of the above is insanely faster than loading a fresh copy of the
code from even today's fastest storage media. *PLEASE* re-read my text
carefully -- I quite clearly distinguish load time from process
instantiation time. I also stated that BOTH can be used as a metric for
"Start Time."
What part of this didn't you understand?
> this is getting _really_ silly, but it goes to show that some people will
> do just about anything to deny the fact that Allegro Common Lisp starts
> up real fast on modern computers. which sort of proves my point...
The only point I saw was that you can launch 2 copies of ACL 25 times a
second each.
> look, are you for _real_? Common Lisp compiles to assembly, damnit!
Are you for real? Why are you getting so upset? Why can't you conduct
yourself like an adult? What did I do to deserve the personal attacks on me
by you? Where have I attacked you? And what did I attack you with?
All I did was point out that there were ambiguities in the measurements made
due to certain "basic" assumptions. I'm not interested in the theoretical
performance of ACL. I'm interested in the real-world, down-to-Earth,
in-the-trenches performance of ACL.
> oh, shut up and return to your home planet. I do no such thing.
My home planet happens to be the one you live on, sir. If you don't like
it, that's not my problem; you'll just have to grin and bear it.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
······@garnet.armored.net (Samuel A. Falvo II) writes:
>
> In article <················@naggum.no>, Erik Naggum wrote:
> > this is amazingly misguided. the load-time of the environment is related
> > to such tasks as mapping pages of memory, handling shared libraries
> > (which is _very_ expensive), and running a bunch of initialization code.
> > _none_ of this is magically remembered from one instantiation of the
> > system to the next. (and doing so would be a veritable _disaster_.)
>
> But **ALL** of the above is insanely faster than loading a fresh copy of the
> code from even today's fastest storage media. *PLEASE* re-read my text
> carefully -- I quite clearly distinguish load time from process
> instantiation time. I also stated that BOTH can be used as a metric for
> "Start Time."
>
> [...]
>
> All I did was point out that there were ambiguities in the measurements made
> due to certain "basic" assumptions. I'm not interested in the theoretical
> performance of ACL. I'm interested in the real-world, down-to-Earth,
> in-the-trenches performance of ACL.
first you show two metrics, then you insist you are searching for one.
the man w/ two watches is wondering "when does lunch come?".
thi
In article <···············@netcom9.netcom.com>, thi wrote:
>first you show two metrics, then you insist you are searching for one.
>the man w/ two watches is wondering "when does lunch come?".
I show two metrics. I also mention which one of the two metrics matters
most to me. So what is your point? The fact that I do show two metrics
shows that I can see more than one point of view at a time. That doesn't
necessarily mean that I agree with all of them.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
Samuel A. Falvo II <······@garnet.armored.net> wrote in message
··························@garnet.armored.net...
> I show two metrics. I also mention which one of the two metrics matters
> most to me. So what is your point? The fact that I do show two metrics
> shows that I can see more than one point of view at a time. That doesn't
> necessarily mean that I agree with all of them.
So specify the metrics and the weightings. Specify a test load that you
want running on the machine at the time. Specify the machine for the
benchmark. If you CAN'T do that, you probably aren't intelligent enough to
interpret the results correctly anyway. If you WON'T do that, it's likely
that your mind is already made up and, as it is no use speaking to a
brick, we probably would do well to ignore you. And, if you can and will do
that, hopefully you'd be smart enough to acknowledge the results once you
have been proven incorrect (though I have my doubts).
faa
In article <····················@news.uswest.net>, Frank A. Adrian wrote:
>brick, we probably would do well to ignore you. And, if you can and will do
>that, hopefully you'd be smart enough to acknowledge the results once you
>have been proven incorrect (though I have my doubts).
I haven't been proven incorrect at all -- read the other posts. I pointed
out flaws in the metrics used. Period. No ifs, ands, or buts about it.
Unbeknownst to me, these flaws had already been pointed out in an earlier
message. And for the record, I did acknowledge my error, quite publicly.
Your comprehension of what I had written is in error, and your anger towards
me is unwarranted. If anything, it is YOU who should be ignored.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* Samuel A. Falvo II
| What part of this didn't you understand?
why you can't figure out that start-up time from storage media is utterly
and completely irrelevant when you start 50 processes within the same
second.
incidentally, I consider your question an instance of losing your temper.
control your own temper, you hypocrite, or shut up about that of others!
| The only point I saw was that you can launch 2 copies of ACL 25 times a
| second each.
this is obviously an unwarranted conclusion on your part, since it took
1.5 seconds user+system time and 1 second real time, and only user+system
matters. you have no data to support your conclusion, but you do have
data to support that I could fire up 33 instances a second on one CPU
from this data. so I just wish you could engage your brain before you
engage your agenda.
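The arithmetic behind the 33-instances figure can be checked directly; a minimal sketch, assuming only the numbers quoted in the thread (50 invocations consuming 1.5 seconds of combined user+system CPU time):

```shell
# Sustainable single-CPU rate from the quoted measurement:
# 50 start-ups / 1.5 s of CPU time = 33.3 start-ups per second,
# truncated to 33 by the %d conversion.
awk 'BEGIN { printf "%d\n", 50 / 1.5 }'
```

Real time is lower than user+system here only because the work was spread over two CPUs; dividing the invocation count by CPU time, not wall-clock time, is what gives the per-CPU rate.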
| Are you for real? Why are you getting so upset? Why can't you conduct
| yourself like an adult? What did I do to deserve the personal attacks on
| me by you? Where have I attacked you? And what did I attack you with?
you're being obnoxious, stupid, impenetrably dense, and behave like an
asshole with an irrelevant axe to grind. that's what I object to. and
now you can't even control your own temper. how sickeningly _pathetic_.
| All I did was point out that there were ambiguities in the measurements made
| due to certain "basic" assumptions. I'm not interested in the theoretical
| performance of ACL. I'm interested in the real-world, down-to-Earth,
| in-the-trenches performance of ACL.
and that's what you got, dude. now, will you _ever_ be satisfied?
#:Erik
······@garnet.armored.net (Samuel A. Falvo II) writes:
> In article <················@naggum.no>, Erik Naggum wrote:
> > ms of real time each. and since this is a dual processor system, it
> > would be pretty stupid not to take advantage of it, so you got what you
> > asked for: 50 copies started and terminated in less than 1 second. now
> > is a good time to _believe_, heathen.
>
> But doing things in this manner takes advantage of the fact that the OS
> already has the code for the Lisp environment already in memory, which
> causes us to measure the process instantiation time of the environment, NOT
> the load-time of the environment.
>
> EITHER ONE can be considered the "Start Time" of the Lisp environment. So
> on the one hand, the test is valid, but on the other, it is not. It's also
> not realistic, because:
Look, the question at hand was start time for filters, which are
spawned by some other process to do some work. You can have three
scenarios with this:
a) You only start the filter _very_ seldom, and therefore the
environment has to be loaded into memory from disk afresh each
time. Since you only do the thing very seldom, start-up speed by
definition doesn't matter as long as it's below some useful
threshold. Since we can all consider 5s to be a useful
threshold for this, I'd wager that any non-IDE implementation of
CL will start within this time frame on every reasonably current
(i.e. produced in the last 3-4 years) computer.
I just tested this on my old 5x86-133 (P90 int performance, quite
slow disks), and this is indeed true for all implementations
currently installed there, which includes CMUCL.
b) You do indeed depend on relative start-up times, because you
call the script very frequently. Nowadays a common example is
CGI scripts. By definition the environment will be in memory.
This is what Erik (and I, and others) measured. Note that in this
situation, it is usually advisable to do things very differently,
and the whole world is starting to do things differently. Why
should CL repeat the mistakes of others first?
c) You do b), but are stupid enough not to have enough memory, and
therefore start to thrash. If you consider this a real problem,
then you have a real problem. With any system.
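The warm-start case (scenario b) can be measured with a timing loop along these lines; a sketch, in which `/bin/true` stands in for the Lisp image since no particular CL binary can be assumed. Run once right after boot, before the binary is cached, the same loop approximates the cold-start case (scenario a) instead:

```shell
#!/bin/sh
# Time 50 back-to-back invocations of a short-lived process.
# After the first invocation, the executable's pages are resident,
# so the remaining iterations measure warm process-start cost only.
# Substitute the stand-in /bin/true with your CL implementation's
# batch invocation to measure a real environment.
time sh -c 'i=0; while [ "$i" -lt 50 ]; do /bin/true; i=$((i + 1)); done'
```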
> 1) Not everyone has an SMP box. I sure don't -- I can't afford one.
Well, then you should use ECL, and be thankful that it will outperform
_all other scripting solutions together_ by a factor of 3 and much
more, get much more invocations than Erik on his "high-end"
equipment. Happy now?
All benchmarks save your own are invalid by this measure, since if I
benchmark on an AMD K6-2 350 on my gear, the numbers will not be easy
to compare to the numbers you get on your PII-400, since the
differences in environment and system and CPU architecture make any
transfer of numbers highly doubtful.
If someone really needs 50 start-ups per second, they'll have the
money to get an SMP box, since paying for real hardware is always
cheaper than paying for programmers that struggle with lesser
languages.
> 2) The test you perform is guaranteed to be under "light load". If
> the system is in real-world use, I can guarantee you that test
> would take longer than a second (potentially up to a minute
> depending on the tasks being performed).
What nonsense is this? If I load up a machine with unrelated tasks I
can virtually guarantee any real-time non-performance I want,
regardless of what environment/program I benchmark. Does this make
any sense? When have you last seen performance benchmarks of
e.g. WebCaches, where they compare performance under "real-world" use,
i.e. while thrashing the machine with 400 unrelated memory-hogs?
And even if your argument were valid (it isn't, especially in our
discussion), then it would apply equally to both CL and all the
other tested language environments. And I'd still guarantee that
ECL "outperformed" them all. Oh dear.
> > a miniscule waste of time in the looping construct. the user and
> > system time total 1.5 seconds. the real time is < 1 s. you do
> > the math.
>
> Real time is irrelevant; the combined user and system time is what counts.
Only real-time is ever relevant. You even acknowledge that in the
paragraph above, because if you only consider CPU time, then any
amount of unrelated CPU load will not influence that figure. Start
reading a book on benchmarking.
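The real-time vs. CPU-time distinction is easy to demonstrate with `time` itself; a sketch, using `sleep` because it consumes wall-clock time while using almost no CPU:

```shell
#!/bin/sh
# `time` reports three separate figures:
#   real - elapsed wall-clock time; inflated by unrelated system load
#   user - CPU time spent executing the process's own code
#   sys  - CPU time the kernel spent on the process's behalf
# A sleeping process shows roughly 1s real against roughly 0s of
# user+sys, so CPU-time figures alone cannot tell you when the work
# actually finished under load.
time sleep 1
```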
> > yes. that is, more than 98% is written in Lisp. (writing a Common Lisp
> > system in anything else is _really_ painful.) the operating system
>
> Well, it's got to be bootstrapped somehow. :-)
?? You can bootstrap from the same language (albeit a different
compiler). See how your favourite C compiler got bootstrapped.
What has all of this got to do with "performance" benchmarks?
> Besides, there's no point -- on many architectures, C produces assembly code
> better than most humans do today anyway. You deride C as if it's the
> antichrist.
On many architectures, Fortran still produces better assembly than any C
compiler. But that wasn't the point of Erik's argument, which wasn't
about speed (or even C in itself) at all. It was about the misdesign of
Unix kernels, which don't have a language neutral, _stable_ interface to
the world, unlike many other operating systems.
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
In article <··············@orion.dent.isdn.cs.tu-berlin.de>, Pierre R. Mai
wrote:
>Look, the question at hand was start time for filters, which are
>spawned by some other process to do some work. You can have three
>scenarios with this:
Fine.
>> 1) Not everyone has an SMP box. I sure don't -- I can't afford one.
>
>Well, then you should use ECL, and be thankful that it will outperform
>_all other scripting solutions together_ by a factor of 3 and much
>more, get much more invocations than Erik on his "high-end"
>equipment. Happy now?
^^^^^^^^^^
Yet another person who indulges in personal attacks and can't control his
temper.
With people like this promoting the language, is it any wonder that LISP has
little press these days outside of academia?
>to compare to the numbers you get on your PII-400, since the
>differences in environment and system and CPU architecture make any
>transfer of numbers highly doubtful.
This is true of Erik's measurements as well. I fail to see the distinction.
>If someone really needs 50 start-ups per second, they'll have the
>money to get an SMP box, since paying for real hardware is always
>cheaper than paying for programmers that struggle with lesser
>languages.
While this is admittedly true, not all companies think this way.
Furthermore, I wasn't _arguing_ anything. I was pointing out that there are
ambiguities in the measurements performed because of the environment in
which the measurements were conducted.
>And even if your argument were valid (it isn't, especially in our
>discussion), then it would apply equally to both CL and all the
>other tested language environments. And I'd still guarantee that
>ECL "outperformed" them all. Oh dear.
Tone your voice down, will you? I did nothing to warrant your wrath. And
you'll notice that I'm *NOT* advocating any other language. You think I
don't know that this will also affect other languages? You're stating the
obvious, which doesn't need to be stated at all.
>about speed (or even C in itself) at all. It was about the misdesign of
>Unix kernels, which don't have a language neutral, _stable_ interface to
>the world, unlike many other operating systems.
Such as?
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* ······@garnet.armored.net (Samuel A. Falvo II)
| >Happy now?
| ^^^^^^^^^^
|
| Yet another person who indulges in personal attacks and can't control his
| temper.
how about you? if you can't control your reactions to a "happy now?"
maybe the simple fact of the matter is that the problem _is_ with you,
and not with anything else at all. you know, this happens a lot in real
life, so it's not particularly surprising that it happens on the Net.
| With people like this promoting the language, is it any wonder that LISP
| has little press these days outside of academia?
see, this is a fairly insulting personal attack, but fortunately, we know
from the history of just about everything that only clueless morons come
up with this shit, so nobody has to deal with it except to laugh at you.
in case you need spoonfeeding, too: we're responding to your "arguments"
and attitude problems, we're not promoting the language. if you think
you are engaging people in a "promote the language" debate with your
silly misgivings about the measurements, you're even more arrogant than
anybody here probably thinks right now.
| Tone your voice down, will you?
great idea! hey, maybe you could show us with your excellent example
behavior?
| I did nothing to warrant your wrath.
in your own eyes, obviously, but since when do people's reactions to you
depend (especially solely) on what _you_ think you did? come on, now,
show some _insight_ into human behavior if you want to be the critic.
matter of fact, if you see somebody angry at you, the first thing to do
is consider the question: "what did I do?", _not_ "I don't deserve this!"
and go self-defensive. however, if you actually react with a "I don't
deserve this!" what the hell were you thinking when you lashed out at
others -- they clearly didn't deserve it in their own eyes, either, and
they certainly did _nothing_ to warrant your idiotic comment about
promoters of the language.
I find it utterly fascinating that the people who make so much fuss about
other people's temper have no control over their own and have so much
less insight into these things than those who _can_ control their temper and
direct it specifically at targets that _do_ deserve it. watch these guys
when they defend themselves! it's _ridiculous_ how unfocused they become
and how willing they are to drift away from any topic at all. this leads
me to conclude that we're looking at people who lack the ability to deal
with conflict in general, and who spend their life avoiding conflict in
any form, only to be _really_ upset when others don't share their views,
and then they behave as rank amateurs in any conflict they get into.
#:Erik
Erik Naggum <····@naggum.no> writes:
> matter of fact, if you see somebody angry at you, the first thing to do
> is consider the question: "what did I do?", _not_ "I don't deserve this!"
> and go self-defensive.
Excellent advice.
Stig Hemmer,
Jack of a Few Trades.
······@garnet.armored.net (Samuel A. Falvo II) writes:
> >> 1) Not everyone has an SMP box. I sure don't -- I can't afford one.
> >
> >Well, then you should use ECL, and be thankful that it will outperform
> >_all other scripting solutions together_ by a factor of 3 and much
> >more, get much more invocations than Erik on his "high-end"
> >equipment. Happy now?
> ^^^^^^^^^^
>
> Yet another person who indulges in personal attacks and can't control his
> temper.
While I don't think that my posting contained in any way a personal
attack, I'd like to apologize for coming over as harsh and aggressive
as I did. Yes, my temper did run loose a bit, and I over-reacted.
Sorry.
> With people like this promoting the language, is it any wonder that
> LISP has little press these days outside of academia?
I think Lisp has more press outside of academia nowadays, than it has
within. Take a look at recent conference proceedings and journals.
Academia has found other languages to make them happy. BTW: Java is
quite remarkable in this way, in that it has captured academic interests
in a way that other hyped languages never did.
> >to compare to the numbers you get on your PII-400, since the
> >differences in environment and system and CPU architecture make any
> >transfer of numbers highly doubtful.
>
> This is true of Erik's measurements as well. I fail to see the distinction.
Exactly. This is why benchmarks have as limited a value as they do.
Complaining of the inherent limitations of benchmarks, when someone
posts numbers with the explicit statement that such numbers are indeed
useless measurements, seems to me not very useful. That's what I was
reacting against. See below...
> While this is admittedly true, not all companies think this way.
> Furthermore, I wasn't _arguing_ anything. I was pointing out that
> there are ambiguities in the measurements performed because of the
> environment in which the measurements were conducted.
See above... <g>: "Ambiguities" will result in any environment in
which you conduct the measurements, except for the environment
that the final production code will run in. Given the complexity
of todays computing environments, even slight differencs will
have non-linear impacts. Reacting differently to the use of an
SMP machine, than to the use of any other machine seems a bit
arbitrary. Especially when the "benchmark" in question was never
intended by anyone to be taken as any serious measurement of
anything seriously worth measuring.
> Tone your voice down, will you? I did nothing to warrant your wrath. And
Sorry again for the raised voice...
> you'll notice that I'm *NOT* advocating any other language. You think I
I didn't react to your advocating (or not) another language. I did
react to your pointing out things which were either obvious, fairly
irrelevant to interpreting the numbers at hand, wrong (=> CPU time
vs. real-time) or at least grossly misleading. And missing the point
that the numbers at hand shouldn't be taken seriously at all, anyway.
Anyone who bases any decisions on the numbers posted is obviously
missing the point.
> don't know that this will also affect other languages? You're stating the
> obvious, which doesn't need to be stated at all.
If I was stating the obvious in my previous posting, it was because I
was under the (mistaken) impression that you didn't have a firm grasp
of the obvious. I'm sorry if this caused you offense.
> >about speed (or even C in itself) at all. It was about the misdesign of
> >Unix kernels, which don't have a language neutral, _stable_ interface to
> >the world, unlike many other operating systems.
>
> Such as?
ProDOS ;)
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
In article <··············@orion.dent.isdn.cs.tu-berlin.de>, Pierre R. Mai
wrote:
>While I don't think that my posting contained in any way a personal
>attack, I'd like to apologize for coming over as harsh and aggressive
>as I did. Yes, my temper did run loose a bit, and I over-reacted.
>Sorry.
Apology accepted. I too must apologize as I thought my post contained
points which were patently obvious to me, yet too many people took as an
attack against ACL.
>Academia has found other languages to make them happy.
I don't know why, but this statement made me laugh so hard that I was
hunched into a ball on the floor. :)
>posts numbers with the explicit statement that such numbers are indeed
>useless measurements, seems to me not very useful. That's what I was
>reacting against. See below...
AH HA...I didn't see any reference to the prior knowledge that the
benchmarks were known to be questionable. That is entirely my fault.
>ProDOS ;)
In what way is it language independent?
As an author of an operating system myself, I'm beginning to increasingly
feel that the language the OS was written in primarily determines the
language that "everyone else uses" when writing software for the platform.
This is the impetus behind choosing C for my own OS. If I were writing the
OS strictly for myself, I'd be using Forth.
The way I see it, -how- the parameters to the kernel get passed isn't really
an issue, so much as it is the abstract capabilities of the kernel.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* Samuel A Falvo wrote:
> Also, the going trend in kernel designs is to move more and more OUT of the
> kernel and into user space, which is where it belongs anyway. The kernel's
> job is to maintain the system -- what you do with that system is up to you.
Not real-world ones. Look at NT sometime.
--tim
In article <···············@cley.com>, Tim Bradshaw wrote:
>> kernel and into user space, which is where it belongs anyway. The kernel's
>> job is to maintain the system -- what you do with that system is up to you.
>
>Not real-world ones. Look at NT sometime.
I administer NT and Linux boxes for a living. The vast majority of NT's
functionality is in user-space, in the form of COM components and normal
DLLs. And even Linux is starting to emphasize user-space components now.
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
* Samuel A Falvo wrote:
> In article <···············@cley.com>, Tim Bradshaw wrote:
>>> kernel and into user space, which is where it belongs anyway. The kernel's
>>> job is to maintain the system -- what you do with that system is up to you.
>>
>> Not real-world ones. Look at NT sometime.
> I administer NT and Linux boxes for a living. The vast majority of NT's
> functionality is in user-space, in the form of COM components and normal
> DLLs. And even Linux is starting to emphasize user-space components now.
Where the `vast majority' lives is irrelevant. The point is that NT
has moved (post v3) a whole load of user-level functionality (like the
graphics subsystem) into kernel space. Why do you think it's so
unreliable?
--tim
* not for email wrote:
> On 28 Feb 2000 11:39:14 +0000, Erik Naggum <····@naggum.no> wrote:
>> time (for x in 0 1 2 3 4; do for y in 0 1 2 3 4 5 6 7 8 9; do ./allegro -batch -kill & done; done; wait)
> That looks like you're starting up 50 copies and waiting for them
> all to exit. (If I understand what the '&' does.) What would the
> results be if you waited for each to exit before starting the next?
I tried this and it's basically the same (but my machine is a bunch
slower to start than Erik's for reasons I don't really know or care).
I'd assume (perhaps wrongly) that Erik will see a slowdown of 2 for
the serial one because he has a 2-processor machine so he ought to be
able to schedule one process on each cpu for the pll version whereas
the serial one can not do that. Of course that just makes the startup
time he's measuring be the effective time for a ~ 1.2GHz serial
machine. So maybe it's 40ms or something, who cares?
--tim
* Tim Bradshaw <···@cley.com>
| I tried this and it's basically the same (but my machine is a bunch
| slower to start than Erik's for reasons I don't really know or care).
| I'd assume (perhaps wrongly) that Erik will see a slowdown of 2 for the
| serial one because he has a 2-processor machine so he ought to be able to
| schedule one process on each cpu for the pll version whereas the serial
| one can not do that. Of course that just makes the startup time he's
| measuring be the effective time for a ~ 1.2GHz serial machine. So maybe
| it's 40ms or something, who cares?
nonono, while we're making wild conjectures about the behavior of
completely irrelevant tasks, we must not also make serious mistakes, or
the data might suddenly become statistically valid.
SMP systems don't behave the way people tend to think. like, we used up
<1 second of real time, and 1.5 seconds of CPU time, which is not at all
close to 40 ms, but more like 30 ms with all the overhead accounted for.
however, each process takes less than that, but I don't really know why
it takes more than 25 ms per process to fire it up in such a packed loop
when it takes close to 20 ms when run once, by itself, but it's fairly
consistent. it is not a measurement error -- the real time reported is
in 1 ms units, while CPU time is reported in 10 ms units. however, the
processor a process runs on doesn't grow any faster just because there's
one more of it that can run another process -- it's still a 600MHz
processor. it's only during heavy multitasking that such a system can
approach 1.2GHz combined performance. otherwise, the big win is only in
the interactive response time when one CPU is idle. not that I complain.
#:Erik
In article <················@naggum.no>, Erik Naggum wrote:
> however, each process takes less than that, but I don't really know why
> it takes more than 25 ms per process to fire it up in such a packed loop
> when it takes close to 20 ms when run once, by itself, but it's fairly
Paging.
> consistent. it is not a measurement error -- the real time reported is
> in 1 ms units, while CPU time is reported in 10 ms units. however, the
> processor a process runs on doesn't grow any faster just because there's
> one more of it that can run another process -- it's still a 600MHz
> processor. it's only during heavy multitasking that such a system can
But you're executing and retiring TWO environments concurrently when you use
the '&' operator on the command-line shell. Think superscalar here. If it
takes 1s to run a particular program, and you run 50 of them in a tight
loop, it can be expected to take roughly 50s to complete. If you run two
concurrently, it will take approximately 25s. So the amortized time per
execution drops to 0.5s per run.
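The arithmetic above can be sketched as follows; the function name is invented for illustration, and it assumes perfect overlap with no scheduling overhead:

```python
import math

def amortized_seconds(jobs, seconds_per_job, workers):
    # Wall-clock time when `workers` jobs run concurrently, divided by
    # the number of jobs: the amortized cost per execution.
    wall = math.ceil(jobs / workers) * seconds_per_job
    return wall / jobs

print(amortized_seconds(50, 1.0, 1))  # 1.0 s per run, serial
print(amortized_seconds(50, 1.0, 2))  # 0.5 s per run, two at a time
```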
--
KC5TJA/6, DM13, QRP-L #1447
Samuel A. Falvo II
Oceanside, CA
From: John Markus Bjorndalen
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <hv66v9b09f.fsf@johnmnb.cs.uit.no>
Tim Bradshaw <···@cley.com> writes:
> I'd assume (perhaps wrongly) that Erik will see a slowdown of 2 for
> the serial one because he has a 2-processor machine so he ought to be
Btw, I tried running acl as you suggested on a dual machine and it was
somewhere between a factor of 1.5 and 2 difference (the computer was
doing some other work at the time, though).
--
// John Markus Bjørndalen
From: John Markus Bjorndalen
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <hvaeklb0d7.fsf@johnmnb.cs.uit.no>
Tim Bradshaw <···@cley.com> writes:
> I tried this and it's basically the same (but my machine is a bunch
> slower to start than Erik's for reasons I don't really know or care).
I noticed the same thing. Then I did a small "bench" of various
tools[1] with the output redirected to /dev/null. This is the time
reported for all 50 iterations running on my notebook:
empty C program 0.07s user 0.05s system 0.12s real 99%
empty C++ 0.15s user 0.08s system 0.23s real 99%
bc -qlv 0.17s user 0.09s system 0.27s real 98%
perl -e ";" 0.18s user 0.08s system 0.27s real 97%
sawmill-client 0.47s user 0.03s system 0.50s real 99%
bash -c '' 0.48s user 0.28s system 0.76s real 100%
clisp -x '' 0.85s user 0.46s system 1.31s real 100%
python -c "" 1.13s user 0.16s system 1.29s real 99%
zsh -c '' 1.01s user 0.62s system 1.63s real 100%
acl -batch -kill 1.35s user 0.52s system 1.87s real 99%
cmucl -eval '(quit)' 0.89s user 1.78s system 2.68s real 99%
java test 5.77s user 1.43s system 7.20s real 100%
Apart from the Java startup time (maybe some other JVM would have been
better) almost all of them were within an order of magnitude from each
other. Not too bad when you consider the difference in these systems.
Actually, if someone worries whether it takes 4 or 40ms to start up a
program they should consider ditching the fork+exec model before they
do any further work. But that's just _my_ opinion :)
1. The C and C++ programs were tested with and without -O6
(egcs-1.1.2), and with and without the return statement:
#include <stdio.h>
int main()
{
return 0;
}
The Java program was:
public class test {
public static void main(String args[]) {}
}
compiled with javac -O test.java (jdk117_v3).
Of course, the compile time wasn't included in the test ;-)
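The post doesn't show the driver loop itself; a minimal reconstruction of such a harness (the command lists are purely illustrative) might look like:

```python
import subprocess, sys, time

def bench(cmd, runs=50):
    # Time `runs` sequential launches of `cmd`, output discarded,
    # mirroring the 50-iteration figures quoted above.
    start = time.perf_counter()
    for _ in range(runs):
        subprocess.run(cmd, stdout=subprocess.DEVNULL,
                       stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

# e.g. bench(["perl", "-e", ";"]) or bench([sys.executable, "-c", ""])
```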
--
// John Markus Bjørndalen
John Markus Bjørndalen wrote:
>I noticed the same thing. Then I did a small "bench" of various
>tools[1] with the output redirected to /dev/null. This is the time
>reported for all 50 iterations running on my notebook:
>
>empty C program 0.07s user 0.05s system 0.12s real 99%
>empty C++ 0.15s user 0.08s system 0.23s real 99%
>bc -qlv 0.17s user 0.09s system 0.27s real 98%
>perl -e ";" 0.18s user 0.08s system 0.27s real 97%
>sawmill-client 0.47s user 0.03s system 0.50s real 99%
>bash -c '' 0.48s user 0.28s system 0.76s real 100%
>clisp -x '' 0.85s user 0.46s system 1.31s real 100%
>python -c "" 1.13s user 0.16s system 1.29s real 99%
>zsh -c '' 1.01s user 0.62s system 1.63s real 100%
>acl -batch -kill 1.35s user 0.52s system 1.87s real 99%
>cmucl -eval '(quit)' 0.89s user 1.78s system 2.68s real 99%
>java test 5.77s user 1.43s system 7.20s real 100%
perl startup time is pretty impressive.
on real-world systems most of the time will go into dlopen (find-file)
calls I assume. that will be the java problem here.
--
Reini Urban
http://xarch.tu-graz.ac.at/autocad/news/faq/autolisp.html
·············@not.for.spam writes:
> On 28 Feb 2000 02:19:59 +0000, Erik Naggum <····@naggum.no> wrote:
>
> > do tell me just _why_ do we have to lie? this is so blatantly stupid I
> > get _sick_. on my system, the default Allegro CL starts up in about 20
> > ms and with one my applications which has a lot of startup-time
> > computation, it takes about 35 ms on a bad day.
>
> 20 ms implies you can run it 50 times per second in a script loop.
> What computer is that on? Have you actually tested it in a script
> loop to verify that you can run it 50 times per second?
Again if you run something 50 times per second, then the simple filter
approach will lose big, and even the Unix crowd has seen this: Look
at all the approaches to get away from the stupid basic CGI approach
in Apache, for example.
And it's very simple to behave more intelligently even if you have to
be a filter: Create a persistent process that keeps state, and
thereby eliminates both environment startup time as well as filter
startup time. Then create a small script that connects via a socket
to the running process, hands over any information needed, waits until
the process returns the result, and funnels it back. If you get really
intelligent about it, you might try to include the "script" in the
"client", e.g. as a module into Apache.
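A minimal sketch of that architecture: one long-lived worker holds the state, and a throwaway connector just relays a request over a socket and funnels the answer back. The names and the trivial line protocol here are invented for illustration:

```python
import socket, threading

def serve(host="127.0.0.1", port=0):
    # The persistent process: all expensive startup happens once, here.
    srv = socket.create_server((host, port))
    def loop():
        while True:
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(4096).decode()
                # Handling a request costs only this round-trip,
                # not a fresh environment startup.
                conn.sendall(("echo: " + request).encode())
    threading.Thread(target=loop, daemon=True).start()
    return srv.getsockname()[1]

def client(port, text):
    # The part that would live in the tiny connector script.
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(text.encode())
        return s.recv(4096).decode()
```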
Look at current developments in webserving to see that nobody believes
plain CGI is the solution for anything but trivial problems. But
instead of recognizing that CL has been there at least 20 years
earlier, most people will insist that CL start making the stupid
mistakes that others are currently trying hard to stop making.
> In the past I've seen people ask how fast Allegro executables
> could start up, and the answers I saw were "less than one
> second" and "you shouldn't do it that way." I've never had an
Well, it seems the CL community has the following problem then:
Instead of answering question like this:
Q: I'm doing this really stupid thing T in X, how fast can I do T in
Common Lisp?
A: Well, doing T is fundamentally stupid, and while doing it in CL is
fast enough anyway, you'd better start doing something intelligent,
regardless of whether you do it in X or CL, and here's how: ...
We should answer like this:
Q: I'm doing this really stupid thing T in X, how fast can I do T in
Common Lisp?
A: Sure thing CL is very good at doing stupid things, too, so doing T
in CL will be very cool and fast indeed. Don't bother to learn that
doing T is stupid.
Somehow I doubt that the world will be a better place for this,
though.
> opportunity to measure it myself, and did not intend to give an
> impression of having actual numbers. The number I mentioned
> was part of my example of how the answer should be given. It
> could have been any number and still meant the same thing.
BTW: On the lowly AMD K6-2 350 I'm writing this on, I can only start
~20 ACL processes per second in a loop. Oh my. But while we are
comparing stupid things, let's do it right:
Running 1000 null processes from a subshell takes:
Implementation Real(s) User(s) Sys(s) Proc/s ms/Proc
- CMUCL 2.4.18a 103.598 42.500 60.420 9.652 103.60
- Python 1.5 51.963 43.360 8.260 19.244 51.96
- ACL 5.0 48.900 34.120 14.470 20.449 48.90
- Tclsh 8.0 29.340 22.070 7.110 34.083 29.34
- Python 1.5 (-S) 21.652 16.440 5.050 46.185 21.65
- CLISP 1997-12-06-1 19.034 8.840 10.060 52.537 19.03
- GCL 2.2.1 14.392 6.280 8.000 69.483 14.39
- Perl 5.005 10.191 6.150 3.980 98.125 10.19
- Perl 5.004 9.928 5.690 4.190 100.725 9.93
- BASH 2.02.1(1) 9.226 4.640 4.540 108.389 9.23
- ECL 0.27 3.673 0.730 2.940 272.257 3.67
The above numbers were achieved on an unloaded AMD K6-2 350 with 128
MB SDRAM-100 and Linux 2.2.13. Of course these numbers mean shit, in
any real-world system, where caching issues, context-switches
etc. will have serious effects, and no one would do the above
anyway, so please don't take these numbers seriously at all, if you
can think right and know what you are doing. If not, then read the
following:
So, CL is the best choice even for doing very stupid things.
Satisfied now? I expect you to start using ECL for all your filters
immediately, of course. Oh, there are other factors involved, too?
Isn't life just very unfair...
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
In article <··················@news.earthlink.net>,
·············@not.for.spam writes:
> On 28 Feb 2000 02:19:59 +0000, Erik Naggum <····@naggum.no> wrote:
>
>> do tell me just _why_ do we have to lie? this is so blatantly stupid I
>> get _sick_. on my system, the default Allegro CL starts up in about 20
>> ms and with one my applications which has a lot of startup-time
>> computation, it takes about 35 ms on a bad day.
>
> 20 ms implies you can run it 50 times per second in a script loop.
> What computer is that on? Have you actually tested it in a script
> loop to verify that you can run it 50 times per second?
if there is a need to run it 50 times per second i am sure the
application deliverer will make sure that it can be done. a shell
script isn't the only way to start an application. you probably are
aware (at least you should be) that web servers apply a few methods to
speed up the start up of cgi-scripts (fast-cgi) for the languages that
you just put up as an example for fast startup, which implies that these
languages aren't always as fast to start up as you imply (the startup
time of a program obviously is more a function of the application than
the language it is written in)
--
Hartmann Schaffer
It is better to fill your days with life than your life with days
* not for email wrote:
> Lisp has a lot of advantages and some disadvantages. To sell Lisp,
> we have to show that the advantages outweigh the disadvantages. We
> can't do that by evading questions about the disadvantages. We have
> to instead say something like, "yes, a Lisp program does take 750 ms
> to start running, but here are the ways you can mitigate that, and here
> are the advantages you get for tolerating that."
Look, once and for all, let's get rid of this stupid bloody myth about
startup times. We've been through this *recently* here, so I'll just
quote the figures I produced last time through:
On a 333MHz ultrasparc with enough memory, gcl 2.3 is around 0.03
secs to run a null program
clisp (recent version) seems to be about 0.08
cmucl seems to be about 0.2
Your figures are out by more than a factor of 25 for gcl, almost 10
for clisp, a mere 4 for cmucl. On a slow machine even. Gcl is about
as fast as perl to start.
Yes I *know* lisps were way slow to start 15 years ago on vaxen, I was
there. But that was then and this is now.
--tim
On 28 Feb 2000 02:05:13 +0000, Tim Bradshaw <···@cley.com> wrote:
> On a 333MHz ultrasparc with enough memory, gcl 2.3 is around 0.03
> secs to run a null program
To clarify this, do you mean you compiled a null program into an
executable binary, and tested it with "time" or in a script loop?
That's very impressive. What's the URL of a document explaining
how to compile a Lisp program into an executable binary with gcl?
* not for email wrote:
> To clarify this, do you mean you compiled a null program into an
> executable binary, and tested it with "time" or in a script loop?
No, I ran the default gcl image but gave it a null program.
If you make a binary with a really null top level it's under 0.02 secs
on my machine.
> That's very impressive. What's the URL of a document explaining
> how to compile a Lisp program into an executable binary with gcl?
(I don't know if this is the official way, this works anyway):
(defun system:top-level ())
(save "/tmp/null")
--tim
Tunc Simsek wrote:
>On 25 Feb 2000, Jeff Dalton wrote:
>> ···@cs.cmu.edu (Scott E. Fahlman) writes:
>> > CL and Dylan are in slightly different parts of the design space. CL
>> > offers great runtime flexibility, but at the cost of carrying around
>> > substantial parts of its program development environment at runtime --
>> > rather like a tortoise carrying its house around. "Delivery modes"
>> > that produce compact CL applications arrived late in the game and are
>> > only a partial solution.
>
>Regarding this point, I was asked the following question: "how small of
>an executable can you get from a CL program that simply prints HELLO WORLD
>at the term".
>
>I don't know the answer, in fact I don't even know how to produce an
>executable from a Lisp program, I never had any need for it.
gail anderson from the edinburgh university presented at the berkeley
conference a very rich ACL application which fits onto a 1.4MB floppy.
it is a layout generator, has an interpreter for an internal geometric
rule language, is production quality, with graphical interface, produces
the yellow papers for the british telephone books. (sorry, lost the link
to the webpage)
what else do you want?
--
Reini Urban
http://xarch.tu-graz.ac.at/autocad/news/faq/autolisp.html
On Sun, 27 Feb 2000, Reini Urban wrote:
> Tunc Simsek wrote:
> >On 25 Feb 2000, Jeff Dalton wrote:
> >> ···@cs.cmu.edu (Scott E. Fahlman) writes:
> >> > CL and Dylan are in slightly different parts of the design space. CL
> >> > offers great runtime flexibility, but at the cost of carrying around
> >> > substantial parts of its program development environment at runtime --
> >> > rather like a tortoise carrying its house around. "Delivery modes"
> >> > that produce compact CL applications arrived late in the game and are
> >> > only a partial solution.
> >
> >Regarding this point, I was asked the following question: "how small of
> >an executable can you get from a CL program that simply prints HELLO WORLD
> >at the term".
> >
> >I don't know the answer, in fact I don't even know how to produce an
> >executable from a Lisp program, I never had any need for it.
>
> gail anderson from the edinburgh university presented at the berkeley
> conference a very rich ACL application which fits onto a 1.4MB floppy.
>
> it is a layout generator, has an interpreter for an internal geometric
> rule language, is production quality, with graphical interface, produces
> the yellow papers for the british telephone books. (sorry, lost the link
> to the webpage)
>
> what else do you want?
Good example, thanks. I'd also be interested in learning how small
executables are produced (for example in ACL or CMUCL).
* Tunc Simsek wrote:
> Good example, thanks. I'd also be interested in learning how small
> executables are produced (for example in ACL or CMUCL).
I believe that the layout app was done just using whatever tool
acl (this was the PC acl 3.x) used for dumping images. We will try
and put the paper up for www access in the next week or so (it is
probably available elsewhere but I don't remember where).
In general I don't really understand the stress on tiny applications
-- I mean my *calculator* has more than 1Mb of memory, and memory is
*so* cheap. And any substantial difference in size between Lisp and
C/Java/C++/blah is only going to exist for really small applications
anyway. Any SW you sell will likely be on a CDROM and really you have
enough space on one of those for a lisp image.
The place where I can see it mattering is applications which are
delivered on-the-fly over the network into a browser or something.
But that battle has been won for us -- everyone *knows* you need the
java VM to run java you fetch, or the flash plugin to do shockwave or
whatever (cause netscape to freeze in my case). So it's only
reasonable that you should need the lisp plugin to run fasls you
fetch. And .fasl files are easily competitive with random .o files
per unit of functionality.
So I'd like to turn this around: why does anyone *care* about a
sub-1Mb standalone application any more?
--tim
* Tim Bradshaw <···@cley.com>
| So I'd like to turn this around: why does anyone *care* about a sub-1Mb
| standalone application any more?
that's simple, Tim: because they want to keep their myths alive and well.
"Lisp is big and slow" is not a fact, it's a religious belief. you can't
turn people's religions around with facts. if people deep down trust
that this is the reason Lisp doesn't win, they'll return to it every time
Lisp doesn't win, true or not, supported by evidence or not -- it's just
how they "feel", anyway.
most people have only one _real_ desire in their life: to feel safe in
the correctness of their beliefs. the only way to make this happen is to
hand them something obviously better and correct to believe in while you
burn down whatever it was they considered safe thoroughly.
consider this a theory of the ecology of ideas, where the winning ideas
are like predators sneaking up on whoever feels smugly safe. there will
be a lot of screaming and shouting while the stale ideas are killed and
their proponents act like scared monkeys, but afterwards, you won't have
a problem with stale ideas resurfacing. sadly, the Lisp community has
not been willing to kill off the idiotic ideas with sufficient force, and
when I try, a whole bunch of monkeys scream so much it's time to call the
Discovery Channel crew and film them.
#:Erik
* Erik Naggum wrote:
> that's simple, Tim: because they want to keep their myths alive
> and well. "Lisp is big and slow" is not a fact, it's a religious
> belief. you can't turn people's religions around with facts. if
> people deep down trust that this is the reason Lisp doesn't win,
> they'll return to it every time Lisp doesn't win, true or not,
> supported by evidence or not -- it's just how they "feel", anyway.
It's interesting that at the moment there are two sorts of myth about
Lisp. The old traditional `big & slow' one is still alive and well,
but seems basically now to be people repeating the lies their parents
told them. But more interesting is the `lisp is too small' myth,
which people are beginning to believe -- more interesting because in
some sense it's true -- lisp certainly is small compared to
monstrosities like Java and C++...
Anyway you were at LUGM so you know my theory on this (someday I'll
get the HTML version done, so everyone will), but what is most
interesting is that people can believe *both these things at once*. I
live in hope that when I point this out to someone they will whirr and
click furiously for a few seconds, and then there will be a grinding
noise, smoke will pour from their ears and the top of their head will
come off and a multitude of cogs & gears will spill out, like
computers in 60s scifi films which have been forced to confront some
inconsistency. Sadly I've been disappointed so far... (perhaps we
should try this on Xah Lee, I'm fairly sure he's a computer).
--tim
Tim Bradshaw <···@cley.com> wrote in message
····················@cley.com...
> It's interesting that at the moment there are two sorts of myth about
> Lisp. The old traditional `big & slow' one is still alive and well,
> but seems basically now to be people repeating the lies their parents
> told them. But more interesting is the `lisp is too small' myth,
> which people are beginning to believe -- more interesting because in
> some sense it's true -- lisp certainly is small compared to
> monstrosities like Java and C++...
>
> Anyway you were at LUGM so you know my theory on this (someday I'll
> get the HTML version done, so everyone will), but what is most
> interesting is that people can believe *both these things at once*.
Just to point out that it is possible that both are true without
contradiction - Lisp could be very big but filled with the wrong stuff, so
it ends being not enough.
In conjunction with this, I want to point out again to those who wish to
evangelize Lisp that being right means squat, and that appearing to be in
the majority means everything if you want to actually be in the majority.
This applies as much to technical folk as it does it to the less nerdly
variety. Also, as a rule, it tends to alienate people when you tell them
they're stupid or that what they're doing is stupid.
-- Harley
Voice in the desert: Quiet, isn't it, Harley Davis?
[various good points skipped]
> Also, as a rule, it tends to alienate people when you tell them
> they're stupid or that what they're doing is stupid.
You've made some good points there, but this is the crux.
--
Email address intentially munged | You can never browse enough
will write code that writes code that writes code for food
Tim Bradshaw wrote:
>...
>The place where I can see it mattering is applications which are
>delivered on-the-fly over the network into a browser or something.
>But that battle has been won for us -- everyone *knows* you need the
>java VM to run java you fetch, or the flash plugin to do shockwave or
>whatever (cause netscape to freeze in my case). So it's only
>reasonable that you should need the lisp plugin to run fasls you
>fetch. And .fasl files are easily competitive with random .o files
>per unit of functionality.
>
>So I'd like to turn this around: why does anyone *care* about a
>sub-1Mb standalone application any more?
well, I DO care in practice.
I want easy packaging and end-user installation.
That's why I even try to use assembler only for Win32 GUIs. :)
=> http://www.eskimo.com/~htak/win95asm/win95asm.htm
e.g. a typical rich assembler Win32 EXE with lots of GUI (in fact just
kernel and gui dll calls) is about 10,000 bytes.
lisp vm's are not that common as system dll's, browser plugins or the
java vm. endusers are usually very impressed by small sized and
standalone executables.
hd space is a big myth: small == fast.
exe/dll packers are also VERY popular.
So you can cheat and provide a very small exe which dynaloads your lisp,
as MzScheme or CormanLisp do. Just a psychological trick.
But aggressively tree-shaken lisp apps with lots of dynamic foreign
function calls instead of lisp library calls might come close.
--
Reini Urban
http://xarch.tu-graz.ac.at/autocad/lisp/ffis.html
* Reini Urban wrote:
> hd space is a big myth: small == fast.
Wait a minute. We're talking sub-1Mb once-off executables (and
obviously a much smaller per-executable size if you expect to have a
lot of them & can amortise the space in a library).
Can you *get* a disk smaller than 1Gb now? That's a thousand of these
things. I don't know about you but I have 40 times that much disk (in
3 drives) and it's reasonably fast. Last time I bought any disk (a
while ago. Despite being a lisp shop our huge executables don't seem
to fill up the disk too fast...) I was told I had to have a 9Gb unit
because the 4Gb ones were no longer made...
The only place where small images matter is downloading stuff over the
net. Yet people happily fetch xyz plugin if they need it, and those
things are usually way bigger than 1Mb. The same trick would obviously
work for Lisp.
So whatever it is they care about, it ain't size: they won't use Lisp
*even if it's tiny*. We need to look somewhere else for the reasons,
and not harp on about the feeble excuses -- if lisp was `cool' they would
use it even if it was huge (which it isn't).
--tim
Voice in the desert: Quiet, isn't it, Tim Bradshaw?
> So whatever it is they care about, it ain't size: they won't use Lisp
> *even if it's tiny*. We need to look somewhere else for the reasons,
> and not harp on about the feeble excuses -- if lisp was `cool' they would
> use it even if it was huge (which it isn't).
I was asking myself questions like this a few years ago. I've no idea,
and neither does anyone else, it seems.
Even 3 years ago, size wasn't an issue. I recall asking a Franz
salesperson about executable size, and noticed that a statically
linked C++ program using MFC was the same size.
Even when VB coders were telling me that VB was slow, this didn't stop
them using it. So speed isn't an issue either.
People boggle when I tell them I use Lisp. Why? I tell someone that I
can write code that writes code that writes code, and they think I'm
giving them marketing BS. Why?
Paul Graham created Viaweb using Lisp and Yahoo bought it for $49M.
What do non-Lispers think when you tell them this?
--
Email address intentially munged | You can never browse enough
will write code that writes code that writes code for food
Martin Rodgers <···@thiswildcardaddressintentiallyleftmunged.demon.co.uk>
wrote in message ·······························@news.demon.co.uk...
> Voice in the desert: Quiet, isn't it, Tim Bradshaw?
>
> > So whatever it is they care about, it ain't size: they won't use Lisp
> > *even if it's tiny*. We need to look somewhere else for the reasons,
> > and not hark on at the feeble excuses -- if lisp was `cool' they would
> > use it even if it was huge (which it isn't).
>
> I was asking myself questions like this a few years ago. I've no idea,
> and neither does anyone else, it seems.
This is an odd attitude. Lots of people including Lispers have a fairly good
understanding of market dynamics. Most of the originators and promulgators
of Java were Lisp retreads. Once they had their killer app they knew how to
ride the wave fairly well.
If you yourself don't understand why the basic dynamics of the market will
work to prevent a niche technology platform from achieving widespread
success, there are several good basic books on marketing that will help you
out.
-- Harley
Voice in the desert: Quiet, isn't it, Harley Davis?
> Once they had their killer app they knew how to ride the wave fairly well.
Where's the "killer app" for Lisp today?
--
Email address intentionally munged | You can never browse enough
will write code that writes code that writes code for food
Martin Rodgers <···@thiswildcardaddressintentiallyleftmunged.demon.co.uk> writes:
> Voice in the desert: Quiet, isn't it, Harley Davis?
>
> > Once they had their killer app they knew how to ride the wave fairly well.
>
> Where's the "killer app" for Lisp today?
Paul Graham mentions Emacs, AutoCAD and Interleaf, but those
are all getting a bit long in the tooth. I'd like to suggest that the
Lisp "killer-apps" are actually the development environments...
I think this makes sense, as the only people who should be
concerned that an application is written in Lisp are those who write
it (ok, possibly also advanced users who are able to extend the
application, *easily*, because the use of Lisp as a platform may make
such extensions easier to provide than other platforms).
--
Raymond Wiker, Orion Systems AS
+47 370 61150
Voice in the desert: Quiet, isn't it, Raymond Wiker?
> > Where's the "killer app" for Lisp today?
>
> Paul Graham mentions Emacs, AutoCAD and Interleaf, but those
> are all getting a bit long in the tooth. I'd like to suggest that the
> Lisp "killer-apps" are actually the development environments...
I very nearly added the qualifier "apart from Emacs". ;) This is a
very hot religious topic, but if someone is already using Emacs then
perhaps Lisp is already half sold. Non Emacs users are another matter.
The trouble with Emacs is... it looks too much like an editor. Until
you look deeper, you can't appreciate what makes it so special. Hmm.
Development environments tightly integrated with a language are not
uncommon, especially these days. While Lisp environments go further than
environments for most other languages, this distinction suffers from
the same problems as Lisp itself: mass ignorance of Lisp.
_We_ know what makes Lisp special. Non-Lispers can't see it until they
look, and until they do Lisp will be just one more language they don't
know about. After all, everyone claims that _their_ tools are special.
Does anyone remember HotJava? There was a killer app. However, it
wasn't long before it was no longer so unique, so "killer". Now all
the top browsers have Java, and JavaScript too. So the emphasis has
moved to the server end, where Java _might_ still offer uniqueness.
I said "might" because I'm thinking of CL-HTTP.
Perhaps the "killer" factor of an app is inversely proportional
to the availability - and hype - of alternatives?
> I think this makes sense, as the only people who should be
> concerned that an application is written in Lisp are those who write
> it (ok, possibly also advanced users who are able to extend the
> application, *easily*, because the use of Lisp as a platform may make
> such extensions easier to provide than other platforms).
Emacs, for example. As I said above, to most people it'll just look
like an editor. Whatever power you or I may get from using Lisp isn't
easy to communicate to non-Lispers. As Paul Graham asked in "On Lisp",
what can you do in Lisp that you can't do in other languages?
On page 398 he made a point that is even more relevant to this thread,
"The lack of a distinct name for the concepts underlying Lisp may be
a serious barrier to the language's acceptance." He concludes with the
not too helpful observation that, "Perhaps we should resign ourselves
to the fact that the only accurate name for what Lisp offers is Lisp."
--
Email address intentionally munged | You can never browse enough
will write code that writes code that writes code for food
Tim wrote:
> if lisp was `cool' they would use it even if it was huge (which it
> isn't).
This then begs the question: how do you make lisp `cool'?
Best Regards,
:) will
William Deakin <·····@pindar.com> writes:
> Tim wrote:
>
> > if lisp was `cool' they would use it even if it was huge (which it
> > isn't).
>
> This then begs the question: how do you make lisp `cool'?
You don't _make_ lisp cool, because it already _is_ cool.
When I was young, I was always considered a nerd. I was laughed
at, and it bothered me, but I pressed on because I knew who I
was. Now I'm seeing nerd-mania and even seeing an article about how
many women find Dilbert-types sexy, and I think "OK, I've become
cool." But I didn't change, the world's perception changed.
--
Duane Rettig Franz Inc. http://www.franz.com/ (www)
1995 University Ave Suite 275 Berkeley, CA 94704
Phone: (510) 548-3600; FAX: (510) 548-8253 ·····@Franz.COM (internet)
Duane Rettig wrote:
> When I was young, I was always considered a nerd. I was laughed at, and
> it bothered me, but I pressed on because I knew who I was.
To be honest, I (along with a large number of the people who read this
newsgroup, i'm sure ;) find something in common with this experience.
> Will writes:
> > Tim wrote:
> >> if lisp was `cool' they would use it even if it was huge (which it
> >> isn't).
> > This then begs the question: how do you make lisp `cool'?
> You don't _make_ lisp cool, because it already _is_ cool.
I _know_ that, you _know_ that...but ask the development manager or the
Orion development team here or my brother. Common Lisp, urggh or who he?
If you compare the profile of CL with Linux say (an apples and oranges
comparison but...) I have read articles about Linux in the Times (that is,
the London Times), The Observer, La Repubblica &c. Not about CL.
> Now I'm seeing nerd-mania and even seeing an article about how many women
> find Dilbert-types sexy, and I think "OK, I've become cool." But I
> didn't change, the world's perception changed.
Yes, agreed. The question is still: how do you make lisp `cool'? But by
this I mean: how do you change the world's perception of cool to that of
more than just the illuminati[1]? Or is Erik right in his assertion that CL
is something you mature into[2]?
Best Regards,
:) will
[1] This makes me think of an episode of a BBC comedy called Black Adder.
In this the male protagonist finds out that he is attracted to men, which,
as this is the unreconstructed middle-ages, upsets him. In his desperation
to `cure' his problem he contacts a hideously ugly crone or `wisewoman.'
She then offers three ways of resolving his dilemma: he can kill himself,
ending his internal struggle; he can kill the object of his desire; or (as
neither of these choices is acceptable) kill everybody else in the world.
But I digress.
[2] I'm sure I've misattributed this and look forward to being put right ;)
* Duane Rettig wrote:
> William Deakin <·····@pindar.com> writes:
>> Tim wrote:
>>
>> > if lisp was `cool' they would use it even if it was huge (which it
>> > isn't).
>>
>> This then begs the question: how do you make lisp `cool'?
> You don't _make_ lisp cool, because it already _is_ cool.
The quote marks were very deliberate. I meant something like
`perceived to be cool by lots of people'.
--tim
Voice in the desert: Quiet, isn't it, Tim Bradshaw?
> > You don't _make_ lisp cool, because it already _is_ cool.
>
> The quote marks were very deliberate. I meant something like
> `perceived to be cool by lots of people'.
This is simple. Find the small group of people that everyone else
turns to for 'coolness'. With clothes and music, this tends to be the
very young. Later on the marketing people get hold of it and add some
hype, and we get something that is 'coo$' (e.g. MTV). By the time the
mass public start buying it, the 'cool' people have moved on to the
next Big Thing. Or so the experts on 'coolness' tell us.
Once upon a time, 'cool' programming tools were created at places like
MIT. In fact, MIT is a very good example - for Lisp. Alas, today they
use Java, so perhaps they've sold out and become 'coo$'.
The Internet version of MTV is Slashdot, but without the "voice of
corporate America". Slashdot may be a "killer app", or at least a
"killer" website, but Slashcode wasn't created using CL-HTTP. Still,
you could look to the next Big Thing, write it in Lisp, and then
promote it on Slashdot and make sure everyone knows you used Lisp.
Hmm. You could just sell your company to Yahoo, instead. I'm not sure
which strategy will work best. ;)
--
Email address intentionally munged | You can never browse enough
will write code that writes code that writes code for food
Centuries ago, Nostradamus foresaw a time when Tim Bradshaw would say:
>* Duane Rettig wrote:
>> William Deakin <·····@pindar.com> writes:
>>> Tim wrote:
>>>
>>> > if lisp was `cool' they would use it even if it was huge (which it
>>> > isn't).
>>>
>>> This then begs the question: how do you make lisp `cool'?
>
>> You don't _make_ lisp cool, because it already _is_ cool.
>
>The quote marks were very deliberate. I meant something like
>`perceived to be cool by lots of people'.
I certainly saw that implication; you forgot to also add `due to
some degree of buzz/hype not really attributable to the thing in
question.'
For instance, I got the latest issue of Acura's "magazine for
Acura owners," and apparently Linux is *so* cool
(crowd-shouts "How cool is it?"), *so* cool that Acura decided they
needed to have an interview with the CEO of TurboLinux Inc.
This is a *CAR* magazine.
They normally have articles on places that you might want to drive,
resorts, the new models coming out (which could be taken a *completely*
different way than is meant!), and on wine selection. And for some
reason, they decided that Linux was "cool enough" to be relevant to this.
I'm still having a hard time believing that they published the article.
Lisp is not, by that token, `cool.'
--
"We defeated the enemy with teamwork and the hammer of not bickering."
- The Shoveller, Mystery Men
········@hex.net - - <http://www.ntlug.org/~cbbrowne/lisp.html>
Christopher Browne wrote:
> For instance, I got the latest issue of Acura's "magazine for
> Acura owners," and apparently Linux is *so* cool
> (crowd-shouts "How cool is it?"), *so* cool that Acura decided they
> needed to have an interview with the CEO of TurboLinux Inc.
>
> This is a *CAR* magazine.
>
> They normally have articles on places that you might want to drive,
> resorts, the new models coming out (which could be taken a *completely*
> different way than is meant!), and on wine selection. And for some
> reason, they decided that Linux was "cool enough" to be relevant to this.
>
> I'm still having a hard time believing that they published the article.
So am I. I have included most of this message intact because of this
amazement, and because if I quote it, it may become more believable. How could
this be? How can I buy a copy? I think this is the point, like in `The
Forbidden Planet' when my id breaks in through the window and carries me off
screaming.
> Lisp is not, by that token, `cool.'
Uhuh. And I rest my case.
Best Regards,
:) will
Centuries ago, Nostradamus foresaw a time when William Deakin would say:
>Christopher Browne wrote:
>> For instance, I got the latest issue of Acura's "magazine for
>> Acura owners," and apparently Linux is *so* `cool'
>> (crowd-shouts "How cool is it?"), *so* cool that Acura decided they
>> needed to have an interview with the CEO of TurboLinux Inc.
>>
>> This is a *CAR* magazine.
>>
>> They normally have articles on places that you might want to drive,
>> resorts, the new models coming out (which could be taken a *completely*
>> different way than is meant!), and on wine selection. And for some
>> reason, they decided that Linux was "cool enough" to be relevant to this.
>>
>> I'm still having a hard time believing that they published the article.
>
>So am I. I have included most of this message intact because of this
>amazement, and because if I quote it, it may become more believable. How could
>this be? How can I buy a copy? I think this is the point, like in `The
>Forbidden Planet' when my id breaks in through the window and carries me off
>screaming.
Unfortunately, in order to obtain a copy of "Acura Style," it appears that
you need to be an owner of an Acura automobile.
I unfortunately seem to have left the magazine at the office; I'd be game
to take a digital photograph of it and distribute via whatever relevant
means, should that be meaningful as proof.
If you want to pass on word, feel free to quote me on it...
If you want a copy, you might contact a local Acura dealer. They
probably haven't the faintest clue what's *in* the magazine...
>> Lisp is not, by that token, `cool.'
>
>Uhuh. And I rest my case.
Remember, I said `cool,' not cool... :-)
--
The human race will decree from time to time: "There is something at
which it is absolutely forbidden to laugh."
-- Nietzsche on Common Lisp
········@ntlug.org - - <http://www.hex.net/~cbbrowne/lsf.html>
Christopher Browne wrote:
>
> Centuries ago, Nostradamus foresaw a time when William Deakin would say:
> >Christopher Browne wrote:
> >> For instance, I got the latest issue of Acura's "magazine for
> >> Acura owners," and apparently Linux is *so* `cool'
> >> (crowd-shouts "How cool is it?"), *so* cool that Acura decided they
> >> needed to have an interview with the CEO of TurboLinux Inc.
> >>
> >> This is a *CAR* magazine.
> >>
Would you believe a cdr magazine?
mlf
Christopher Browne wrote:
> Unfortunately, in order to obtain a copy of "Acura Style," it appears that
> you need to be an owner of an Acura automobile.
And I wouldn't know an Acura even if it ran me over.
> I unfortunately seem to have left the magazine at the office; I'd be game
> to take a digital photograph of it and distribute via whatever relevant
> means, should that be meaningful as proof.
>
> If you want to pass on word, feel free to quote me on it...
>
> If you want a copy, you might contact a local Acura dealer. They
> probably haven't the faintest clue what's *in* the magazine...
Thanks for this. I never knew that such a car existed. I'm not sure there's a lot of
call for the Acura in North Yorkshire (but I could be wrong ;).
> >> Lisp is not, by that token, `cool.'
> >
> >Uhuh. And I rest my case.
>
> Remember, I said `cool,' not cool... :-)
Yes, I *think* that is what I said. cool not `cool'. Thanks again,
:) will
William Deakin <·····@pindar.com> wrote in message
·····················@pindar.com...
> Christopher Browne wrote:
>
> > Unfortunately, in order to obtain a copy of "Acura Style," it appears
that
> > you need to be an owner of an Acura automobile.
>
> And I wouldn't know an Acura even if it ran me over.
In Europe the Acura cars are marketed under the Honda brand. Honda owns
them; Acura is their "luxury" brand in America. Toyota has an "Infiniti"
luxury brand here as well.
-- Harley
Harley Davis wrote:
> And I wouldn't know an Acura even if it ran me over.
>
> In Europe the Acura cars are marketed under the Honda brand. Honda owns
> them; Acura is their "luxury" brand in America. Toyota has an "Infiniti"
> luxury brand here as well.
Toyota is Lexus. Nissan is Infiniti.
dave
Christopher Browne <········@knuth.brownes.org> wrote in message
····························@knuth.brownes.org...
> For instance, I got the latest issue of Acura's "magazine for
> Acura owners," and apparently Linux is *so* cool
> (crowd-shouts "How cool is it?"), *so* cool that Acura decided they
> needed to have an interview with the CEO of TurboLinux Inc.
Linux is cool to a non-techie audience because there's huge amounts of money
involved now. Drivers of Acuras think this is important. Also, Acura has
always promoted a high-tech image - hence the VTEC and the NSX.
-- Harley
Tim Bradshaw <···@cley.com> wrote in message
····················@cley.com...
> * Duane Rettig wrote:
>
> The quote marks were very deliberate. I meant something like
> `perceived to be cool by lots of people'.
>
> --tim
>
How about a nifty Mascot? Linux has Tux, BSD has their little daemon, and
Java has the whole coffee allusion :)
Actually, I think just keeping in the 'public eye' is a big plus. Snazzy
web pages, fun public gatherings, developers announcing release on
high-profile sites (and doing it often)....
It may seem "Shallow" but things like this give the Lisp community a sense
of vibrance and direction...making it appealing to other developers.
-Drew
Andrew McDowell wrote:
> How about a nifty Mascot? Linux has Tux, BSD has their little daemon, and
> Java has the whole coffee allusion :)
Could I suggest a wav file of Michael Palin struggling with the letters t and
r (I would suggest Arkwright from `Open All Hours' but that would lay me open
to accusations of being too Euro- (or is that UK) centric, and anyway he
stutters. ;)
A more serious suggestion would be a piglet (very intelligent animals, pigs)
or a sand-piper (they hang around beaches and run with their legs whirring
about but with their body hardly moving up-and-down, a sort of GC analogy
perhaps ;)
Loath as I am to say it, this all sounds like a job for a marketing company.
Or at least a competition.
Cheers,
;) will
> A more serious suggestion would be a piglet (very intelligent animals, pigs)
> or a sand-piper (they hang around beaches and run with their legs whirring
> about but with their body hardly moving up-and-down, a sort of GC analogy
> perhaps ;)
What about snakes? They lithp and kinda look like braces...
*looking at Linux's success*
And make it cute.
*looking at Tomb Raider's success*
...and make it a big-breasted snake. With guns.
...Michael...
Tim Bradshaw wrote:
>
> * Duane Rettig wrote:
>
> > You don't _make_ lisp cool, because it already _is_ cool.
>
> The quote marks were very deliberate. I meant something like
> `perceived to be cool by lots of people'.
>
But what kind of people? Investors, managers, engineers, painters, writers,
system architects, coders, or just plain people?
You are right in that ordinary users do not worry about 1-Mb plugins, or
1-minute vs 5-minute download times. Heck, they don't worry about these things
at all. They just retrieve the stuff and use it, and when the disk becomes
crowded, they buy a new one. Bandwidth? A well-known company is doing its utmost
to reduce that problem ... hrrm.
Lisp aficionados are not unlike HiFi enthusiasts, or lovers of old cars. But a
HiFi freak owns 17 loudspeakers and one test record, and Hot-Rod people want
only the rear suspension and axle of a Jaguar XJ6. This kind of person worries
about 1-Mb disk space and execution speeds. I think.
Raymond Wiker suggested that developer environments are the real killer-apps. I
agree, because who else but the developer will be attracted anyway? I don't know
anything about the full-fledged commercial CL systems, but I suspect that much
of the interest in a <whisper> lisp-os came from disappointment at the
environment support tools in the non-commercial versions of CL.
Regards,
Lars
On 01 Mar 2000 12:18:37 +0000, Tim Bradshaw <···@cley.com> wrote:
>`perceived to be cool by lots of people'.
The reason Lisp is not perceived to be cool by the masses might
just be because the Lisp community does not perceive the masses
to be cool.
> >`perceived to be cool by lots of people'.
>
> The reason Lisp is not perceived to be cool by the masses might
> just be because the Lisp community does not perceive the masses
> to be cool.
Aren't we an elitist bunch?
To get on the more serious side, which computer language is
considered cool at all? Especially the most often used
languages (erm, like COBOL, C, Perl...) are often critiqued,
whereas Lisp is often cited for elegance. It seems that Lisp
_is_ cool, but many people are afraid to touch its icy
skin... ;)
Languages are too remote for the normal user, so they'll
never be cool like operating systems, applications or
games. Cool for developers? Well, if every developer could
choose the language they program with...
Lisp is like ABBA. The person you're talking to might like
them as much as you do, but none of you will admit it. Ever.
...Michael...
On Wed, 01 Mar 2000 17:26:22 +0100, Michael Dingler <········@mindless.com>
wrote:
>To get on the more serious side, which computer language is
>considered cool at all? Especially the most often used
>languages (erm, like COBOL, C, Perl...) are often critiqued,
Java. The hype of Java is not just from the vendors, but also from
the programmers who use it. It's very common for programmers
who use C++ etc. to actually be envious of Java programmers, and
to spend a lot of time and effort trying to persuade their employers
to use Java.
·············@not.for.spam writes:
> On Wed, 01 Mar 2000 17:26:22 +0100, Michael Dingler <········@mindless.com>
> wrote:
>
> >To get on the more serious side, which computer language is
> >considered cool at all? Especially the most often used
> >languages (erm, like COBOL, C, Perl...) are often critiqued,
>
> Java. The hype of Java is not just from the vendors, but also from
> the programmers who use it. It's very common for programmers
> who use C++ etc. to actually be envious of Java programmers, and
> to spend a lot of time and effort trying to persuade their employers
> to use Java.
And when C++ was hyped, many programmers using such "yucky" languages
as Ada, C, COBOL, etc., were trying to persuade their employers to use
C++. And many employers fell for it, given that all the trade-press
was also jumping on the OOP (C++/Smalltalk) bandwagon.
Now one might conjecture, that for language X to get hyped, you need
to have:
a) Some kind of hook to hang the new language on, be that AI, OOP or
the Internet,
b) No stable standard, and better yet no really mature implementations,
c) Lots of money somewhere in the picture.
Given that CL will not get b) or c) anytime soon, I rather think that
CL won't get hyped in this way for some time to come.
And that's A Good Thing (TM), IMHO, since one should bear in mind that
hyping of this sort will always lead to a serious backlash some time
in the future, when the hyped thing is dropped for the new Hype
Language of the Decade.
OTOH there are other ways to gain developer mind-share, which are much
more stable, and have less down-sides. Taking a look at the
relationship between Zope and the Python community, or Emacs and the
Lisp community might be useful.
Regs, Pierre.
--
Pierre Mai <····@acm.org> PGP and GPG keys at your nearest Keyserver
"One smaller motivation which, in part, stems from altruism is Microsoft-
bashing." [Microsoft memo, see http://www.opensource.org/halloween1.html]
In article <··············@orion.dent.isdn.cs.tu-berlin.de>,
····@acm.org (Pierre R. Mai) writes:
> ...
> Now one might conjecture, that for language X to get hyped, you need
> to have:
>
> a) Some kind of hook to hang the new language on, be that AI, OOP or
> the Internet,
it seems that the main things in all the hype cases i remember were:
1. the language is an improvement over what is commonly available
2. a cheap implementation is widely available for a commonly used
platform
3. lots of publicity
neither pascal nor c++ had a hook to hang on (i doubt that oop counts)
> b) No stable standard, and better yet no really mature implementations,
pascal started the publicity quite early (when the first implementation
was ready); c and c++ gained popularity at universities (unix) and
spread with graduates to the industry. it really took off in the dos
world with borland.
> c) Lots of money somewhere in the picture.
there definitely was not much money involved in the popularisation of
pascal, and i don't think that c or c++ had much money behind them
> ...
--
Hartmann Schaffer
··@inferno.nirvananet (Hartmann Schaffer) writes:
> it seems that the main two things were in all the hype cases i remember:
I have been working at an ISP since before java was launched, and as I
see it, the java marketing was extremely well timed: Java was the ultimate
"thing that goes bing" at that time: It made the web "alive" and
"interactive" in a manner not seen before, and the idea of the rise of
the network and the death of the pc was born.
Unfortunately, this didn't all work out according to the dreams of L. Ellison
et al., so what we see now is that the kind of "NC" that really works and
is used a lot is a pc running citrix or some other TS client, and only
occasionally java.
Some java fans actually admit that "java is too slow for gui, servlets are
the future", but as I see it, common lisp (still) has tremendous advantages
over java on the server side. And since java hasn't been such a great
success on the "applet" side either, there is still a chance for lisp to
grab that niche too. Both on traditional pc (or "NC") platforms and on
small devices, mobile phones etc.
--
(espen)
On 02 Mar 2000 09:00:03 +0100, Espen Vestre
<·····@*do-not-spam-me*.vestre.net> wrote:
>the future", but as I see it, common lisp (still) has tremendous advantages
>over java on the server side. And since java hasn't been such a great
>success on the "applet" side either, there is still a chance for lisp to
>grab that niche too. Both on traditional pc (or "NC") platforms and on
>small devices, mobile phones etc.
Yes, I think this is THE niche for CL. Do you know if there's
any lisp app server for Apache? :-? Sometimes, using CLHTTP isn't
possible...
//-----------------------------------------------
// Fernando Rodriguez Romero
//
// frr at mindless dot com
//------------------------------------------------
Fernando <·······@must.die> writes:
> >the future", but as I see it, common lisp (still) has tremendous advantages
> >over java on the server side. And since java hasn't been such a great
> >success on the "applet" side either, there is still a chance for lisp to
> >grab that niche too. Both on traditional pc (or "NC") platforms and on
> >small devices, mobile phones etc.
>
> Yes, I think this is THE niche for CL. Do you know if there's
> any lisp app server for Apache? :-? Sometimes, using CLHTTP isn't
> possible...
>
I completely agree with you (in that lisp is *the* servlet language,
if we keep with Sun's Newspeak), but I might have been somewhat
unclear in the paragraph you quote, because I actually wanted to say
that lisp could have a chance as an *applet* language as well.
--
(espen)
On 1 Mar 2000 18:35:00 -0500, ··@inferno.nirvananet (Hartmann Schaffer)
wrote:
> neither pascal nor c++ had a hook to hang on (i doubt that oop counts)
Maybe structured programming and object-oriented programming?
Paolo
--
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://cvs2.cons.org:8000/cmucl/doc/EncyCMUCLopedia/
On 1 Mar 2000 18:35:00 -0500,
Hartmann Schaffer <··@inferno.nirvananet> wrote:
[snip]
>> b) No stable standard, and better yet no really mature implementations,
>
>pascal started the publicity quite early (when the first implementation
>was ready), c, and c++ gained popularity at universities (unix) and
>spread with graduates to the industry. it really took off in the dos
>world with borland
You seem to be confusing C and C++. They are separate languages.
>> c) Lots of money somewhere in the picture.
>
>there definitely was not much money involved in the popularisation of
>pascal, and i don't think that c or c++ had much money behind them
Hmm. C++ didn't have money behind it?
--
Kenneth P. Turvey <······@SprocketShop.com>
--------------------------------------------
I wake up each morning determined to change the World... and also to
have one hell of a good time. Sometimes that makes planning the day
a little difficult. -- E.B. White
In article <·····················@pug1.sprocketshop.com>,
Kenneth P. Turvey <······@SprocketShop.com> wrote:
>On 1 Mar 2000 18:35:00 -0500,
>Hartmann Schaffer <··@inferno.nirvananet> wrote:
>[snip]
>>> b) No stable standard, and better yet no really mature implementations,
>>
>>pascal started the publicity quite early (when the first implementation
>>was ready), c, and c++ gained popularity at universities (unix) and
>>spread with graduates to the industry. it really took off in the dos
>>world with borland
>
>You seem to be confusing C and C++. They are separate languages.
>
More or less. One of the selling points of C++ has been that, up to
a point, you don't have to know what language you're writing in,
whether it's C or C++. You could be 'cool' in C++ without knowing
the first thing about OOP, as long as you remembered to use function
prototypes.
>>> c) Lots of money somewhere in the picture.
>>
>>there definitely was not much money involved in the popularisation of
>>pascal, and i don't think that c or c++ had much money behind them
>
>Hmm. C++ didn't have money behind it?
>
By the time it was hyped, it was already surprisingly popular. Read
Stroustrup's "Design and Evolution of C++". (Do it. It's a very
good book.)
It's important to have some sort of development environment (it doesn't
have to be good, since vi and cc seem to have been adequate) freely
available. Obviously, that's not sufficient. It's easy to get a free
CL environment going on many Unices, particularly if you don't insist
on native code compilation (and hence use CLISP).
However, I have no frippin' idea why C got to be so popular, unless it's
that Pascal is really that bad.
--
David H. Thornley | If you want my opinion, ask.
·····@thornley.net | If you don't, flee.
http://www.thornley.net/~thornley/david/ | O-
* David Thornley wrote:
> However, I have no frippin' idea why C got to be so popular, unless
> it's that Pascal is really that bad.
There was this operating system...
--tim
In article <···············@cley.com>, Tim Bradshaw <···@cley.com> wrote:
>* David Thornley wrote:
>
>> However, I have no frippin' idea why C got to be so popular, unless
>> it's that Pascal is really that bad.
>
>There was this operating system...
>
Yup. C was going to have an edge on Unix systems. It was going to have
an advantage in academia, which is useful (imprinting lots of students
on C, frightening thought).
What I don't understand is what happened on the non-Unix desktops.
The Macintosh OS interfaces were documented in Pascal, and intended to
be called from existing Pascal implementations. MS-DOS was not
obviously favoring one or the other. (Yeah, there's solid reasons why
Lisp couldn't take over. Pity, that.)
One possibility, which is a bit depressing for this group, is that a
new hot language can't be all that different from the last hot language.
Pascal wasn't that different from Fortran or Basic, C not that different
from Pascal, C++ not that different from C, and Java copied far too much
of the C syntax and semantics for my taste. There's been continuity
as to how to write a given five lines of code or so, and the difference
has been in structuring these fragments.
--
David H. Thornley | If you want my opinion, ask.
·····@thornley.net | If you don't, flee.
http://www.thornley.net/~thornley/david/ | O-
········@visi.com (David Thornley) writes:
> One possibility, which is a bit depressing for this group, is that a
> new hot language can't be all that different from the last hot language.
> Pascal wasn't that different from Fortran or Basic, C not that different
> from Pascal, C++ not that different from C, and Java copied far too much
> of the C syntax and semantics for my taste. There's been continuity
> as to how to write a given five lines of code or so, and the difference
> has been in structuring these fragments.
One plus point about Java is that it is garbage collected; it makes it
very likely that the next language du jour will also have garbage
collection - and that has to be a Good Thing.
Small steps.
Cheers,
Michael
--
very few people approach me in real life and insist on proving they are
drooling idiots. -- Erik Naggum, comp.lang.lisp
In article <··················@news.earthlink.net>,
·············@not.for.spam writes:
> On Wed, 01 Mar 2000 17:26:22 +0100, Michael Dingler <········@mindless.com>
> wrote:
>
>>To get on the more serious side, which computer language is
>>considered cool at all? Especially the most often used
>>languages (erm, like COBOL, C, Perl...) are often critiqued,
>
> Java. The hype of Java is not just from the vendors, but also from
> the programmers who use it. It's very common for programmers
> who use C++ etc. to actually be envious of Java programmers, and
> to spend a lot of time and effort trying to persuade their employers
> to use Java.
reminds me of the time when all the basic and fortran programmers wanted
to upgrade to pascal
--
Hartmann Schaffer
* Michael Dingler wrote:
> To get on the more serious side, which computer language is
> considered cool at all?
Java. Whatever Sun did to make Java so fashionable is worth studying.
--tim
> > To get on the more serious side, which computer language is
> > considered cool at all?
>
> Java. Whatever Sun did to make Java so fashionable is worth studying.
AFAIK Java was the first (and so far the only) language that
actually got any marketing at all. So more and more managers
almost demand that you use Java. Believe me, I've seen
projects where Java just doesn't make any sense at all...
And faced with a choice between C++ and Java, the decision is quite
easy for the programmer. I seriously doubt that the hype
comes from (or is intended for) programmers.
On the other hand, a lot of people just don't know any
languages besides (Visual/GW) Basic, C and Pascal, so
Java might seem quite revolutionary to them. About the
same group who dislikes Lisp because they've done some
undergraduate courses with some ancient interpreter
(or Scheme).
...Michael...
Michael Dingler <········@mindless.com> wrote in message
······················@mindless.com...
> > > To get on the more serious side, which computer language is
> > > considered cool at all?
> >
> > Java. Whatever Sun did to make Java so fashionable is worth studying.
>
> AFAIK Java was the first (and so far the only) language that
> actually got any marketing at all.
This is not completely true. What is true is that computing in general is
being more heavily marketed now than at any time in the past, so the
percentage that goes to some popular language of the day has gotten more
visible as well.
But let's not forget our history. To some extent the Lisp community has
only itself to blame for its current relative obscurity. For the standards
of the time, Lisp was intensely marketed in the mid-to-late 80's and got
quite a bit of press - companies like Symbolics (well, especially Symbolics
but some others were along for the ride) were among the most highly-profiled
in the public eye. AI winter and all that, remember?
Now maybe it's been long enough since then (but it's only been 10 years)
that the "public" has forgotten the last debacle with Lisp and might be
willing to try again, but I doubt that the term Lisp has lost its negative
connotations so quickly.
-- Harley
Harley Davis wrote:
> Now maybe it's been long enough since then (but it's only been 10 years)
> that the "public" has forgotten the last debacle with Lisp and might be
> willing to try again, but I doubt that the term Lisp has lost its negative
> connotations so quickly.
Well, just to throw in my newbie "two-cents" perspective:
I'm not quite sure what debacle you're referring to, but I'm still
working to overcome the strange association of Lisp with the mental
image of the "dried up hack in the corner working on an obscure chess
program" that seems to be embedded in the collective consciousness.
You just don't _hear_ about Lisp being used in 'cutting-edge' or
'technically sexy' projects....and I'm not quite certain that this isn't
by design. Does the Lisp community _want_ Lisp advocacy? Or do they
want to keep Lisp in the hush to preserve their competitive advantage?
-Drew
* Andrew McDowell <····@getaway.net>
| You just don't _hear_ about Lisp being used in 'cutting-edge' or
| 'technically sexy' projects....and I'm not quite certain that this isn't
| by design.
it depends on whether your ears are open or shut. it is a well-known
fact in the marketing industry that a very large number of people don't
actually hear about anything that doesn't already fit their mental models
of the world, even if you shout into their ears with hundreds of millions
of advertising money.
it amazes me somewhat that this actually needs explaining during U.S.
presidential nominations, but maybe you haven't heard about it. :)
#:Erik
Erik Naggum <····@naggum.no> wrote in message
·····················@naggum.no...
> it depends on whether your ears are open or shut. it is a well-known
> fact in the marketing industry that a very large number of people don't
> actually hear about anything that doesn't already fit their mental models
> of the world, even if you shout into their ears with hundreds of millions
> of advertising money.
So just give up without trying?
* "Andrew McDowell" <·············@msfc.nasa.gov>
| So just give up without trying?
no. the only alternative to naive optimism is not depressed cynicism, OK?
#:Erik
Erik Naggum wrote:
> no. the only alternative to naive optimism is not depressed cynicism, OK?
Which was sorta my point....
I'm not trying to start an argument or anything...I'm just trying to
find a way to "Open Ears" :)
-Drew
From: David J. Cooper
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <38BE8ECC.680E617C@genworks.com>
Andrew McDowell wrote:
>
> You just don't _hear_ about Lisp being used in 'cutting-edge' or
> 'technically sexy' projects....
>
Would it be considered technically "sexy" to generate fully detailed
geometry for the inner panel assembly of an automobile hood ("bonnet"),
send the geometry to a CAE app for automated stress analysis, and
optimize the orientation of the resulting part in a standard shipping
rack to save hundreds of thousands or millions of dollars in rack and
shipping costs over the life of a vehicle program, all in about 12 minutes,
versus the traditional process which requires about 12 weeks of manual
error-prone nonrepeatable CAD work by a highly trained CAD operator?
This afternoon I will be onsite at a major auto manufacturer here in
Detroit, doing a code freeze and preparing a "dxl" image of exactly
such an application for distribution to half a dozen sites around the
world.
A couple reasons that this kind of CL-based project is perhaps not
more widely known:
o large auto and aero manufacturers do in fact tend to keep rather
quiet about exactly what they are doing in their engineering and
design activities, for obvious reasons of perceived competitive
advantage;
o the KBE technologies to which I am referring pose a very real threat
to a large, entrenched empire involving CAD and associated traditional
(C/C++ -based) software development organizations within large companies
and from outside consulting houses. It is analogous to some alternative
engine coming along and threatening the internal-combustion engine --
how long do you think it would take to displace the current empire
in engineering, manufacturing, etc. which is based around the internal
combustion engine, regardless of how technically superior some
alternative might be? It would happen, but it would take time.
The Lisp-based activities which increasingly are delivering much more
meaningful results with far fewer developers, lines of code, etc., tend
to be drowned out politically by the sheer numbers of individuals involved
in these "traditional" camps. Realize that, very often, compensation for
managers and supervisors in large companies depends largely on the number
of people they have reporting to them. They are not necessarily interested
in hearing about or promoting a technology which allows them to achieve
greater results with a fraction of the number of people they currently
have working in their C/C++ sweatshops. Sometimes it takes a top-down force
from higher-level management for them to wake up (for example, having
one's 2000 budget slashed to 40% of its 1999 level, when one had 45+ C/C++
programmers slaving away in a sweatshop on "ancillary" CAD applications
which were delivering questionable results at best when compared with
CL-based projects like the one I cited above).
Yours,
-dave
--
David J. Cooper Jr, Chief Engineer Genworks International
·······@genworks.com 5777 West Maple, Suite 130
(248) 932-2512 (Genworks HQ/voicemail) West Bloomfield, MI 48322-2268
(248) 407-0633 (pager) http://www.genworks.com
"David J. Cooper" wrote:
> Sometimes it takes a top-down force
> from higher-level management for them to wake up (for example, having
> one's 2000 budget slashed to 40% of its 1999 level, when one had 45+ C/C++
> programmers slaving away in a sweatshop on "ancillary" CAD applications
> which were delivering questionable results at best when compared with
> CL-based projects like the one I cited above).
(borrowing on Erik Naggum's terminology..)
Hrmm... now if only we could get those kinds of stories into the ears at
the top.
Or better yet...how do we get those kinds of stories into the ears of
those looking to _replace_ those at the top?
From: Jon S Anthony
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <38BD81BD.719@synquiry.com>
Tim Bradshaw wrote:
>
> * Michael Dingler wrote:
>
> > To get on the more serious side, which computer language is
> > considered cool at all?
>
> Java. Whatever Sun did to make Java so fashionable is worth studying.
A _lot_ of $$$$$$$$
/Jon
--
Jon Anthony
Synquiry Technologies, Ltd. Belmont, MA 02478, 617.484.3383
"Nightmares - Ha! The way my life's been going lately,
Who'd notice?" -- Londo Mollari
In article <·················@mindless.com>, Michael Dingler
<········@mindless.com> wrote:
>To get on the more serious side, which computer language is
>considered cool at all?
i harbor a secret envy for those who appear to know apl, intercal,
algol, pdp assembler, and how to decipher the menus reserved for
natives at chinese restaurants.
>Lisp is like ABBA. The person you're talking to might like
>them as much as you do, but none of you will admit it. Ever.
the mind reels. you mean i can confess my love of barry manilow
to you guys without fear of ridicule?
sashank
From: Frank A. Adrian
Subject: Re: [executables] was: why Haskell hasn't replaced CL yet?
Date:
Message-ID: <Rifv4.28$Y_.1637@news.uswest.net>
Sashank Varma <·······@vuse.vanderbilt.edu> wrote in message
·····························@129.59.212.53...
> the mind reels. you mean i can confess my love of barry manilow
> to you guys without fear of ridicule?
I'm sorry, but there ARE still SOME boundaries which should not, on the
basis of good taste, be crossed. I hope that we are not succumbing to one
of these Jerry Springer-like confessional things here.
faa
In article <················@news.uswest.net>, "Frank A. Adrian"
<·······@uswest.net> wrote:
>Sashank Varma <·······@vuse.vanderbilt.edu> wrote in message
>·····························@129.59.212.53...
>> the mind reels. you mean i can confess my love of barry manilow
>> to you guys without fear of ridicule?
>
>I'm sorry, but there ARE still SOME boundaries which should not, on the
>basis of good taste, be crossed. I hope that we are not succumbing to one
>of these Jerry Springer-like confessional things here.
>
>faa
announcer: "this week, trailer park lisp programmers."
me: "yeah, i used the foreign function interface to call...c++ code."
audience: "oooooooohhhh."
sashank
<·············@not.for.spam> wrote in message
·······················@news.earthlink.net...
> On 01 Mar 2000 12:18:37 +0000, Tim Bradshaw <···@cley.com> wrote:
>
> >`perceived to be cool by lots of people'.
>
> The reason Lisp is not perceived to be cool by the masses might
> just be because the Lisp community does not perceive the masses
> to be cool.
I quite agree with this.
I love to be in competition with people writing in
C++/Java/VB/Perl/JavaScript/etc.
Lisp gives me an unfair advantage and those guys don't even realise it....
Marc Battyani
·············@not.for.spam writes:
>
> On 01 Mar 2000 12:18:37 +0000, Tim Bradshaw <···@cley.com> wrote:
>
> >`perceived to be cool by lots of people'.
>
> The reason Lisp is not perceived to be cool by the masses might
> just be because the Lisp community does not perceive the masses
> to be cool.
>
coolness or uncoolness lies at the moment of perception.
thi
Duane Rettig <·····@franz.com> wrote in message
··················@beta.franz.com...
> William Deakin <·····@pindar.com> writes:
>
> > Tim wrote:
> >
> > > if lisp was `cool' they would use it even if it was huge (which it
> > > isn't).
> >
> > This then begs the question: how do you make lisp `cool'?
>
> You don't _make_ lisp cool, because it already _is_ cool.
>
> When I was young, I was always considered a nerd. I was laughed
> at, and it bothered me, but I pressed on because I knew who I
> was. Now I'm seeing nerd-mania and even seeing an article about how
> many women find Dilbert-types sexy, and I think "OK, I've become
> cool." But I didn't change, the world's perception changed.
This also begs the question of how the perception changed, which is
everything after all if you're after popularity...
Anyway, in this case it's pretty straightforward that nerds became "popular"
(if indeed they are anywhere but in the media) because they were getting
rich, and this fact was noticed by the popular press. The perception of
having the dough still brings in the chicks, as Rich Rockwell may testify.
-- Harley
"Harley Davis" <·············@nospam.museprime.com> writes:
> Duane Rettig <·····@franz.com> wrote in message
> ··················@beta.franz.com...
> > William Deakin <·····@pindar.com> writes:
> >
> > > Tim wrote:
> > >
> > > > if lisp was `cool' they would use it even if it was huge (which it
> > > > isn't).
> > >
> > > This then begs the question: how do you make lisp `cool'?
> >
> > You don't _make_ lisp cool, because it already _is_ cool.
> >
> > When I was young, I was always considered a nerd. I was laughed
> > at, and it bothered me, but I pressed on because I knew who I
> > was. Now I'm seeing nerd-mania and even seeing an article about how
> > many women find Dilbert-types sexy, and I think "OK, I've become
> > cool." But I didn't change, the world's perception changed.
>
> This also begs the question of how the perception changed,
It started with me. I had to become comfortable with who I
am and not try to make myself into what I am not. Reading that
article evoked a "That's cute; I'll have to remember that ..."
out of me, but did not dramatically change my life.
> which is everything after all if you're after popularity...
If you equate "coolness" and "popularity" (and "success", even more
importantly), then your argument is tenable. But I, for one,
draw distinctions from the meanings of all three words.
In case anyone thinks this conversation is digressing, let me assure
you that it still has very much to do with Lisp...
--
Duane Rettig Franz Inc. http://www.franz.com/ (www)
1995 University Ave Suite 275 Berkeley, CA 94704
Phone: (510) 548-3600; FAX: (510) 548-8253 ·····@Franz.COM (internet)
Tim Bradshaw wrote:
> So whatever it is they care about, it ain't size: they won't use Lisp
> *even if it's tiny*. We need to look somewhere else for the reasons,
> and not hark on at the feeble excuses -- if lisp was `cool' they would
> use it even if it was huge (which it isn't).
I think a lot of it is having a good story to tell.
Languages tend to take off by having a reputation for being good at a
particular thing. Once it becomes popular, a language may then start
being used for a lot of other things, but the key initial asset is a
reputation that "this language is good if you want X", for some X that's
in widespread demand.
Here are some traditional X's for various languages:
Fortran: Number crunching
Lisp: AI
Cobol: Business software
C: Systems programming, anything where you need speed, low-level
control, native system access
Forth: Tiny size, malleability
Basic: Ease of learning
Pascal: Teaching structured programming
Smalltalk: Pure object-orientation, GUI stuff
C++: C with objects
Ada: US DoD, other embedded applications
Visual Basic: Ease of learning, rapid GUI development
Perl: Scripting
Python: Like Perl except with a nicer syntax
Java: Cross-platform, Internet stuff, also for those who think C++ is
nice except for being too complicated
Each of these contains or did at one time contain some truth; none of
them reflects the whole truth. But the point is, this is how these
languages are widely *perceived*, true or not.
Now look at Lisp's entry: AI. Indeed, it's probably the dominant
language in that domain. From a mass market point of view though it's
not a very good story to have, simply because only a tiny minority of
programmers work in AI.
So if you want to sell Lisp to the mass market then, IMO the first thing
to do is come up with another story, one with wider appeal.
--
"To summarize the summary of the summary: people are a problem."
Russell Wallace
···············@iol.ie
As I know has been pointed out a number of times, the argument about executable
size is a phoney. Any UNIX (or M$) box has a large set of c-libraries that
allow the OS to load and unload large amounts of stuff dynamically,
dramatically reducing the size of executables.
Reini wrote:
> it is a layout generator, has an interpreter for an internal geometric rule
> language, is production quality, with graphical interface, produces the
> yellow papers for the british telephone books.
Just to be pedantic, CL is used to paginate the Yellow Pages and Business Pages
for all of the UK (except for Kingston-upon-Hull aka Hull which has their own
arrangement, which AFAIK uses Quark). Currently British Telecom Phone Books are
produced using a C-based pagination system, which causes a lot of problems.
However, moves are afoot... Anyway, I wish I had seen Gail's talk.
;) will
Will wrote:
> As I know has been pointed out a number of times, the argument about executable
> size is a phoney. Any UNIX (or M$) box has a large set of c-libraries that
> allow the OS to load and unload large amounts of stuff dynamically,
> dramatically reducing the size of executables.
Erik, please accept my apologies. You said this, just better than me.
:( will
Reini Urban <······@x-ray.at> wrote:
> Tunc Simsek wrote:
> >Regarding this point, I was asked the following question: "how small of
> >an executable can you get from a CL program that simply prints HELLO WORLD
> >at the term".
Gail Anderson from Edinburgh University presented at the Berkeley
conference a very rich ACL application which fits onto a 1.4MB floppy.
[snip]
> what else do you want?
Ability to create a small executable that runs on any of the major Linux
distributions out of the box.
Yes, I realize that this isn't fair at all. Using Perl, for instance,
requires installing a massive amount of supporting software, too, and
no-one complains. Nowadays, on a Unix/Linux system, one more or less
expects the availability of Perl, Python, Tcl beside C.
The trick seems to be to create applications that make installing the
required runtime support seem worthwhile to a lot of people.
Michael
--
Michael Schuerig
···············@acm.org
http://www.schuerig.de/michael/
········@acm.org (Michael Schuerig) writes:
> Ability to create a small executable that runs on any of the major Linux
> distributions out of the box.
I'm not sure how simple this is even using C, frankly.
libc version? curses version? termcap or terminfo? dbm vendor and version?
I'll settle for source compatibility, to be honest. I have multiple
architectures at home anyway.
-dan
Anyone who REALLY wants this can use clicc:
snapdragon:~ > cat hw.lisp
(princ "Hello, world.")
(terpri)
snapdragon:~ > clicc hw.lisp
snapdragon:~ > cliccl hw.c
gcc -I/homes/gilham/clicc/lib -O -c hw.c -o hw.o
gcc -I/homes/gilham/clicc/lib -O -o hw /homes/gilham/clicc/lib/main.o hw.o \
-L/homes/gilham/clicc/lib -lrtl -lrtc -lrtl -lm
snapdragon:~ > strip hw
snapdragon:~ > ll hw
-rwxrwxr-x 1 gilham gilham 215712 Feb 28 08:34 hw
snapdragon:~ > ./hw
Hello, world.
Caveats:
It uses a carefully defined SUBSET of Common Lisp (CLTL1) and CLOS.
Eval is definitely missing. I don't know how limited a programming
environment you'd wind up with---though it can compile itself, and
that's saying something. I haven't tried to do anything significant
with it. I had to make some fixes to clicc to get it to compile under
the current version of CMUCL. I don't know whether it could be made
to work using other lisps.
--
Fred Gilham ······@csl.sri.com
I have over the years been viewed as a man of the left and a man of
the right, and the truth is that I've never put much stake in such
labels. But this I have learned: the left patrols its borders and
checks membership credentials ever so much more scrupulously, even
ruthlessly, than does the right. -- Richard John Neuhaus
OK I had to do this.
I re-did clicc so that it used shared libraries.
Here's what I get:
snapdragon:~ > ./hello
Hello, world.
snapdragon:~ > ll hello
-rwxrwxr-x 1 gilham gilham 4624 Feb 28 09:21 hello
snapdragon:~ > file hello
hello: ELF 32-bit LSB executable, Intel 80386, version 1 (FreeBSD), dynamically linked, stripped
snapdragon:~ >
Small enough for ya?
A minimal clicc environment, consisting of the clicc compiler, runtime
libraries and other misc. stuff, is < 3 Mb.
I'm not advocating that anyone use clicc, but if you REALLY CARE about
this kind of thing there is a lisp option.
--
Fred Gilham ······@csl.sri.com