From: Xah Lee
Subject: the necessity of Lisp's Objects?
Date: 2008-01-22
Message-ID: <4b91e167-0570-4f38-abd1-c5b5bdfb56ce@n22g2000prh.googlegroups.com>

I've been programming Mathematica since 1993, and worked as an intern
at Wolfram Research Inc in 1995. I coded Mathematica daily up to 1998,
and on and off since then.

I started to learn Emacs around 1997, and have used it daily, about 8
hours a day, ever since.

I started to learn Lisp in about 1997. Mostly, I read, word for word,
3 chapters of the book Structure And Interpretation Of Computer
Programs by Harold Abelson et al. (out of 5 chapters total). And I
have read maybe half of the Scheme spec “Revised(4) Report on the
Algorithmic Language Scheme”. This was all during 1997-1998. However,
at the time I never really wrote any programs in Lisp other than a toy
factorial function.

In 2005, I started to casually learn Emacs Lisp. Because I have used
Emacs extensively and daily since 1998, Emacs Lisp is quite practical
for me, and I have kept learning it and have in fact written quite a
few small-scale, but nevertheless practical, programs (typically less
than a thousand lines of code, for text processing tasks).

--------------

The Mathematica language is very similar to Lisp. From a Mathematica
expert's point of view (mine), the first thing that seems odd about
Lisp is the low-level-ness. Perhaps in order of the low-level
impression:

1. The cons business. (Lists are made up of cons cells, and it is
necessary for the programmer to understand this to write any non-
trivial Lisp program.)

2. No high-level treatment of lists. (In Mathematica, there is a
system of functions to extract or manipulate (nested) lists considered
as trees, e.g. getting all elements at a level, or an arbitrary set of
nodes of a tree. These are all based on a tree-index concept (i.e. the
first branch's second branch's third branch will be 1,2,3). In Lisp,
programmers use cons, car, cdr, caadr, etc. That is bizarre and
downright fucking stupid.)

3. The syntax is a bit quirky. In particular, a Mathematica programmer
sees that sometimes a list is written as “(list a b c)”, but sometimes
there's this oddity “'(a b c)” (which is syntactically equivalent to
“(quote a b c)”). And, when mapping a function, sometimes the
programmer also needs to put the apostrophe in front. (A Mathematica
programmer would think: if a function (e.g. “map”) takes another
function as argument, why would it require the programmer to put the
apostrophe? Why couldn't the function itself be designed not to
evaluate that argument?)

4. A behind-the-scenes model of computation. Namely, Lisp the language
deals with the concept of “lisp objects”, and there's a “print syntax”
that represents these objects (and a “read syntax” that reads code
into these objects).

I think these are probably the oddest things in Lisp for a Mathematica
programmer on first exposure.
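To make the tree-index idea in point 2 concrete, here is a minimal
sketch (Python used purely for illustration; the function name “part”
and the sample data are made up, not taken from either language):

```python
def part(tree, *path):
    # Follow a path of 1-based branch indices into a nested list,
    # in the style of Mathematica's Part: part(t, 1, 2, 3) means
    # t's first branch's second branch's third branch.
    for i in path:
        tree = tree[i - 1]
    return tree

nested = [["a", "b"], ["c", ["d", "e"]]]
print(part(nested, 2, 2, 1))  # branch 2, then 2, then 1 -> prints d
```

The same path notation addresses any node of the tree, with no
car/cdr chains in sight.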

       *       *       *

I was trying to write an Emacs major mode, and also a tutorial about
it, in the past week. In the process, I found it quite hard. Writing a
complete, robust major mode for public consumption requires knowledge
of:

    * mode hooks (basic concept)
    * font lock system (some detail)
    * font and faces infrastructure (basic concept)
    * buffer local variables (detail)
    * regex (detail)
    * keymaps (detail)
    * syntax table (detail)
    * abbreviation system (basics)
    * Emacs mode conventions (a lot of elisp coding experience)

(The list gives a rough idea; it is not meant to be complete or exact
in every way.)

Some of the above are related to an editing/programming system (such
as hooks, the font system, buffer-local variables, the abbrev system).
Some are more related to the language. For example, keymaps and syntax
tables are themselves editing-system concepts, but the data ultimately
calls for an understanding of Lisp's char-table data type. Then,
writing a mode also requires understanding “defvar”, “defcustom” and
others, which ultimately involves a good understanding of Lisp's
symbols, macros, forms, and evaluation model ...

In short, the study of writing a major mode forced me to start reading
some of the Elisp reference chapters that deal with the technical
details of the Lisp language, such as the chapters on “Symbols”,
“Variables”, and “Functions”.

What prompted me to write this post is that yesterday I thought a bit
about Lisp's objects, and why Mathematica doesn't have them, and
realized:

Mathematica, being high-level as it is, essentially merged the
concepts of read syntax, print syntax, and objects into just one
concept of “What you see is what you get” _Expressions_ (i.e. the code
the programmer types). That is, the code is the object, and the object
is the code. In other words, there is no such “internal” concept as
objects. What you type is the code, as well as the meaning.

For example, take a list:

In lisp:
  (list 1 2 3)

Mathematica:
  List[1,2,3]

In Lisp, we say that the text “(list 1 2 3)” is the read syntax for
the Lisp list object. So, the Lisp interpreter reads this text and
turns it into a list object. When the Lisp system prints a list
object, it prints “(list 1 2 3)” too, since the list object's read
syntax and print syntax are the same.
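The read-syntax/print-syntax round trip can be illustrated outside
Lisp too; in rough Python terms, repr plays the role of print syntax
and ast.literal_eval the role of read syntax (a loose analogy for
illustration, not a claim about Lisp's internals):

```python
import ast

obj = [1, 2, 3]
text = repr(obj)               # "print syntax": object -> text
back = ast.literal_eval(text)  # "read syntax": text -> object
print(text)                    # [1, 2, 3]
print(back == obj)             # True: the round trip yields an equal object
```

When print and read syntax agree like this, printed output can always
be read back in as an equal object.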

In Mathematica, there are no such steps of concepts. The list
“List[1,2,3]” is just a list, as is. There is no intermediate
concept of objects. Everything in Mathematica is just an “expression”,
meaning the textual code you type.

This is a high-level beauty of Mathematica.

Now, one might think further: for a modern language to be high level
(such as Mathematica, or to a lesser degree PHP, Javascript, Lisp),
but also to handle general programming tasks such as writing a
compiler, operating system, or device drivers (such as those usually
done with “compiled langs” like C, Java, Haskell, and Lisp too), is
Lisp's internal concept of “object” necessary in a language such as
Lisp?

Does anyone know?

(Note: typically every compiled lang has some “behind-the-scenes”
“model” of sorts, almost always idiosyncratic to itself. For example,
Java has “objects” (which are very different from Lisp's) and
references. Another good example to illustrate the point is the set of
terms “list”, “array”, “vector”, “sequence”, which engenders a huge
amount of confusion. Typically, in online language forums, each
language's proponents see and understand only their own version of
these terms.)

See also:

★ What is Expressiveness in a Computer Language
 http://xahlee.org/perl-python/what_is_expresiveness.html

★ The Concepts and Confusions of Prefix, Infix, Postfix and Fully
Functional Notations
 http://xahlee.org/UnixResource_dir/writ/notations.html

★ Lisp's List Problem
 http://xahlee.org/emacs/lisp_list_problem.html

★ Jargons And High Level Languages
 http://xahlee.org/emacs/jargons_high_level_lang.html

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

From: Pillsy
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <73c8ba44-7d0e-4716-9787-b449d747d21e@k39g2000hsf.googlegroups.com>
On Jan 22, 4:50 am, Xah Lee <····@xahlee.org> wrote:
[...]
> The Mathematica language is very similar to lisp. From a Mathematica
> expert's point of view (me), the first thing that seems odd of lisp is
> the low-level-ness. Perhaps in order of the low level impression:

> 1. The cons business. (lists are made up of cons cells and it is
> necessary for the programer to understand this to write any non-
> trivial lisp program)

This is a *strength* of Lisp over Mathematica. M'ca's underlying list
representation is actually a vector, which means that pushing items
onto the front of a list is quite expensive. A lot of the time M'ca
programs end up using two-element "lists" as conses, giving you
results like:

{a[1], {a[2], {a[3], ...}}}

and you have to gratuitously flatten everything when you're done. It's
just plain annoying.

> 4. A behind-the-scenes model of computation. Namely, lisp the language
> deals with the concept of "lisp objects", and there's a "print syntax"
> that represent these objects. (and a "read syntax" that reads a code
> into these objects)

Um, M'ca has a much more elaborate set-up for printing and reading
syntax, and appropriately so, because it attempts to allow you to use
notation that mimics standard mathematical notation in an extensible
way. But it definitely has its own behind-the-scenes model, and one
that's not so different from Lisp's.  That's what TreeForm and
FullForm are for, right?

Cheers,
Pillsy
From: Steve Harris
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <32e9aa5e-6e56-4150-83bf-159ead78fc37@i29g2000prf.googlegroups.com>
> if a function (e.g. "map") that takes another
> function as argument, why would the function require the programer to
> put the apostrophy, why the function itself could be designed to not
> evaluate that argument.)

Because usually the argument is not a literal, but an s-expression
that needs to be evaluated to produce the list.
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87bq7e84mn.fsf@zeekat.nl>
Please note up front: I'm talking about common lisp only here, but most
is probably applicable to other lisps too.

Xah Lee <···@xahlee.org> writes:
> The Mathematica language is very similar to lisp. From a Mathematica
> expert's point of view (me), the first thing that seems odd of lisp is
> the low-level-ness. Perhaps in order of the low level impression:
>
> 1. The cons business. (lists are made up of cons cells and it is
> necessary for the programer to understand this to write any non-
> trivial lisp program)

True, but even though I'm quite new to Lisp and cons cells, I didn't
have a hard time understanding them. Lisp's proper lists are just singly
linked lists. And making cons cells explicit means you can use them for
more than just nested lists / trees. You can build alists, circular
lists, binary trees etc all using the same small set of functions and
types.
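A minimal sketch of that point, modeling cons cells as two-element
tuples (Python for illustration only; real Lisp cons cells are a
primitive, mutable type):

```python
def cons(a, d): return (a, d)
def car(c): return c[0]
def cdr(c): return c[1]

# A proper list is a chain of cons cells ending in None (Lisp's nil):
lst = cons(1, cons(2, cons(3, None)))
# The very same cells also build an alist (a list of key/value pairs):
alist = cons(cons("x", 1), cons(cons("y", 2), None))

print(car(cdr(lst)))         # second element of the list: 2
print(car(car(cdr(alist))))  # key of the second pair: y
```

One pair type plus three operations covers lists, alists, and trees
alike, which is the power being described above.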

> 2. No highlevel treatment of lists. (In Mathematica, there are a
> system of functions to extract or manipulate (nested) lists considered
> as trees. e.g. getting all elements at a level, or arbitrary set of
> nodes of a tree. All these based on a tree index concept. (i.e. first
> branch's second branch's third's branch will be 1,2,3.) In lisp,
> programers uses cons, car, cdr, caadr, etc. That is bizarre and
> downright fucking stupid.)

copy-tree, tree-equal, subst, etc. are not tree functions? And as I
said above, you can use the same 3 functions cons, car and cdr
(plus the combinations of car and cdr) to do a lot already /without/
having to learn new functions just because something built out of cons
cells is now a tree.

I think that's one of the stated reasons to keep car and cdr in arc, too
(instead of replacing them outright with head and tail, for example):
car and cdr are applicable to more than just plain lists.

What I'm saying is that car, cons and cdr may be confusing at first,
but they make up for it by giving you a lot of power. Note that all
Lisp expressions except atoms are themselves parsed into structures
built out of cons cells.

> 3. The syntax is a bit quirky. In particular, a mathematica programer
> sees that sometimes list is written as “(list a b c)”, but sometimes
> there's this oddity “'(a b c)” (which is syntactically equivalent to
> “(quote a b c)”). And, when mapping a function, sometimes the
> programer also needs to put the apostrophy in front. (A Mathematica
> programer would think, if a function (e.g. “map”) that takes another
> function as argument, why would the function require the programer to
> put the apostrophy, why the function itself could be designed to not
> evaluate that argument.)

Lisp functions /always/ evaluate their arguments. Also, I use mapcar on
function calls a lot, which means I /want/ the arguments to be evaluated
at least some of the time, and then having to use quote / list
explicitly when I don't isn't a big deal to me.

I don't know how mathematica deals with that, though.

 [ ... ]

> Mathematica, being high-level as it is, essentially merged the concept
> of read-syntax, print-syntax, and objects, into just one concept of
> “What you see is what you get” _Expressions_ (i.e. the code programer
> types). That is, the code is the object, the object is the code. In
> other words, there is no such “internal” concepts as objects. What you
> type is the code, as well as the meaning.

[A] I'm confused.  Do you mean Mathematica has no higher level concept
of the code than the plain text? Surely it parses the code into /some/
structure to manipulate / analyze / run?

> For example, let's say a list:
>
> In lisp:
>   (list 1 2 3)
>
> Mathematica:
>   List[1,2,3]
>
> In lisp, we say that the text “(list 1 2 3)” is the read syntax for
> the lisp's list object. So, the lisp interpreter reads this text, and
> turns it into a list object. When the lisp system prints a list
> object, it prints “(list 1 2 3)” too, since list object's read syntax
> and print syntax is the same.

It does not. Try it out.

> In Mathematica, there's no such steps of concept. The list
> “List[1,2,3]” is just a list as is. There is no such intermediate
> concept of objects. Everything in Mathematica is just a “expression”,
> meaning the texual code you type.

See my question [A] above.

> This is a high-level beauty of Mathematica.
>
> Now, one might think further, that if for modern language to be high
> level (such as Mathematica, or to a lesser degree PHP, Javascript,
> Lisp), but if the language also needs to deal with general programing
> with tasks such as writing a compiler, operating system, device
> drivers (such as those usually done with “compiled langs” such as C,
> Java, Haskell and lisp too), then is lisp's internal concept “object”
> necessary in a language such as lisp?
>
> Does anyone know?

If you mean: does having integer, cons cell, symbol, etc. types get
you anything? Then the answer is yes, of course. I'm not even sure how
you'd work with a high level language that doesn't distinguish *at all*
between functions and numbers, for instance. And as you note below, most
languages have at least some types "internally" if only for performance
reasons.

In any case, having an explicit and predictable representation for
each expression makes writing macros a lot easier.

I've started reading the Lisp in Small Pieces book, and I guess it
gives better answers to these questions, in far more detail than I can
give you. You may want to check it out.

> (note: typically all compiled lang has some “behind-the-scene” “model”
> of sorts and almost always idiosyncratic to itself. For example, Java
> has “objects” (which is very different from lisp's) and references.

Java's "true" objects are almost but not quite entirely unlike
user-defined CLOS objects.

> Another good example to illustrate the point are the terms “list”,
> “array”, “vector”, “sequence”, which engenders huge amount of
> confusion. Typically, in online lang forums, each language proponent
> sees and understands only their own version of these terms.)

Yes. Lists in Perl, for example, are very different from lists in
Lisp, if only because lists in Perl code can only be literals and
can't be assigned to variables (you have to use Perl's arrays for
that).

I think that sort of confusion between languages is probably
unavoidable, though arrays and vectors are usually quite comparable
between languages that have them.

General comment: Common Lisp is a pretty high level language when you
want it to be, but it's also pretty obviously designed "from the low
level up", with programmer access to almost all the low-level details
that you need for performance, flexibility and building your own
high-level prettiness. From my limited experience, it's the stack of
lower-level to high-level functionality that gives it an advantage
over "purely high level" or "(relatively) restricted domain" languages
like Perl.

Joost.
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pg0k8pledhp75@corp.supernews.com>
Joost Diepenmaat wrote:
> General comment: common lisp is a pretty high level language...

You might also like to look at OCaml, Haskell and F#.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <8763xhwuy9.fsf@zeekat.nl>
Jon Harrop <······@jdh30.plus.com> writes:

> Joost Diepenmaat wrote:
>> General comment: common lisp is a pretty high level language...
>
> You might also like to look at OCaml, Haskell and F#.

Once I finish this Erlang book, I might. Can't do everything at once.

Joost.
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pkj2oa8rs68b7@corp.supernews.com>
Joost Diepenmaat wrote:
> Jon Harrop <······@jdh30.plus.com> writes:
> 
>> Joost Diepenmaat wrote:
>>> General comment: common lisp is a pretty high level language...
>>
>> You might also like to look at OCaml, Haskell and F#.
> 
> Once I finish this Erlang book, I might. Can't do everything at once.

Erlang is on my list... :-)

What Erlang book are you reading and how good is it?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87abmtd2y0.fsf@zeekat.nl>
Jon Harrop <······@jdh30.plus.com> writes:

> Joost Diepenmaat wrote:
>> Jon Harrop <······@jdh30.plus.com> writes:
>> 
>>> Joost Diepenmaat wrote:
>>>> General comment: common lisp is a pretty high level language...
>>>
>>> You might also like to look at OCaml, Haskell and F#.
>> 
>> Once I finish this Erlang book, I might. Can't do everything at once.
>
> Erlang is on my list... :-)
>
> What Erlang book are you reading and how good is it?

"Programming Erlang" by Joe Armstrong (pragmatic programmers). Like the
PP books I've read on Ruby it's pretty easy to read and as far as I can
see now it seems like it gives a fairly good overview of what Erlang is
like. There's also a lot of example code that's interesting. But also
like the Ruby books it's sometimes a bit skimpy on the details, at least
that's how it looks to me.

All in all, not a bad introduction, I think. And relatively cheap at 30
euros for 500+ pages.

Joost.
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <873asld2cu.fsf@zeekat.nl>
Jon Harrop <······@jdh30.plus.com> writes:

> Joost Diepenmaat wrote:
>> General comment: common lisp is a pretty high level language...
>
> You might also like to look at OCaml, Haskell and F#.

I forgot to ask: which one would you recommend I'd take a look at first,
and why?

The only thing I think I know about any of those is that Haskell is
apparently "purely functional" and I seem to recall it does some kind of
pattern matching. Oh, and there's a preliminary Perl 6 implementation
written in Haskell, which made my head hurt when I took a look at it
about 2 years ago.

Joost.
From: tim Josling
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pkmda10fgks2e@corp.supernews.com>
On Fri, 25 Jan 2008 22:25:37 +0100, Joost Diepenmaat wrote:

> Jon Harrop <······@jdh30.plus.com> writes:
> 
>> Joost Diepenmaat wrote:
>>> General comment: common lisp is a pretty high level language...
>>
>> You might also like to look at OCaml, Haskell and F#.
> 
> I forgot to ask: which one would you recommend I'd take a look at first,
> and why?
> 
> The only thing I think I know about any of those is that Haskell is
> apparently "purely functional" and I seem to recall it does some kind of
> pattern matching. Oh, and there's a preliminary Perl 6 implementation
> written in Haskell, which made my head hurt when I took a look at it
> about 2 years ago.
> 
> Joost.

Yes, it is purely functional, in theory. In practice they have monads,
which allow non-purely-functional things to happen, like IO and
sequencing events.

The purely functional nature of Haskell made implementing
efficient multi-threading very simple.

Another interesting thing about Haskell is the use of lazy evaluation.
This allows some interesting programming techniques.

A truly mind-bending language to use. Well worth learning. 

Tim Josling
From: Rainer Joswig
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <joswig-E1EE87.22360925012008@news-europe.giganews.com>
In article <··············@zeekat.nl>,
 Joost Diepenmaat <·····@zeekat.nl> wrote:

> Jon Harrop <······@jdh30.plus.com> writes:
> 
> > Joost Diepenmaat wrote:
> >> General comment: common lisp is a pretty high level language...
> >
> > You might also like to look at OCaml, Haskell and F#.
> 
> I forgot to ask: which one would you recommend I'd take a look at first,
> and why?
> 
> The only thing I think I know about any of those is that Haskell is
> apparently "purely functional" and I seem to recall it does some kind of
> pattern matching. Oh, and there's a preliminary Perl 6 implementation
> written in Haskell, which made my head hurt when I took a look at it
> about 2 years ago.
> 
> Joost.

Could you move this to email or comp.lang.functional ?
Thanks.
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pl6vvcsa5l40c@corp.supernews.com>
Joost Diepenmaat wrote:
> Jon Harrop <······@jdh30.plus.com> writes:
>> Joost Diepenmaat wrote:
>>> General comment: common lisp is a pretty high level language...
>>
>> You might also like to look at OCaml, Haskell and F#.
> 
> I forgot to ask: which one would you recommend I'd take a look at first,
> and why?

Depends what you're after and what platform you're using. Haskell doesn't
have great support on any platform but it is arguably a more mind expanding
toy. OCaml and F# have bigger user bases and industrial-strength
implementations. OCaml is much better on Linux and Mac OS X (very high
performance, particularly for numerics, lots of high quality libraries
etc.) and F# is great on Windows with Visual Studio. Both have much better
development environments and tools than Haskell.

They also all provide features not found in the others. OCaml provides
polymorphic variants and structurally-subtyped OOP, which close the gap
between static and dynamic typing. Haskell provides laziness with
strictness annotations, which makes it possible to write in a completely
declarative style but brings its own problems. F# provides active patterns
and asynchronous workflows, which close the gap between FP <-> OOP and
provide Erlang-style concurrency, respectively.

> The only thing I think I know about any of those is that Haskell is
> apparently "purely functional"

Yes. OCaml and F# are impure like Lisp so you can use mutation whenever you
need to but they encourage immutability and provide lots of immutable data
structures in their standard libraries.

> and I seem to recall it does some kind of pattern matching.

Yes. Pattern matching is a hugely beneficial feature found in all of these
languages. Think: destructuring-bind & cond on steroids and no more
car/cdr...

> Oh, and there's a preliminary Perl 6 implementation 
> written in Haskell, which made my head hurt when I took a look at it
> about 2 years ago.

You might enjoy some of our publications on OCaml:

http://www.ffconsultancy.com/products/ocaml_for_scientists/chapter1.html
http://www.ffconsultancy.com/products/ocaml_journal/free/introduction.html
http://www.ffconsultancy.com/ocaml/
http://www.ffconsultancy.com/ocaml/benefits/

and F#:

http://www.ffconsultancy.com/products/fsharp_journal/free/introduction.html
http://www.ffconsultancy.com/dotnet/fsharp/

I've also collated information about the current use of these (and other)
functional languages:

http://ocamlnews.blogspot.com/2007/12/top-10-most-popular-ocaml-programs.html
http://flyingfrogblog.blogspot.com/2007/11/most-popular-functional-languages-on.html

Like Erlang, OCaml has seen tremendous uptake in industry over the past four
years which is why Microsoft are now productizing F# as their own variant
for their own platform. Consequently, F# is set to become one of the most
successful functional programming languages in history.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Pascal Costanza
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <5vm428F1nci6rU1@mid.individual.net>
Xah Lee wrote:

> In Mathematica, there's no such steps of concept. The list
> “List[1,2,3]” is just a list as is. There is no such intermediate
> concept of objects. Everything in Mathematica is just a “expression”,
> meaning the texual code you type.

No, "List[1,2,3]" is just a couple of black pixels on the screen.


Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Joshua Taylor
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <d3080241-56bb-49f7-9d64-f109962de948@e10g2000prf.googlegroups.com>
On Jan 22, 6:58 am, Pascal Costanza <····@p-cos.net> wrote:
> Xah Lee wrote:
> > In Mathematica, there's no such steps of concept. The list
> > "List[1,2,3]" is just a list as is. There is no such intermediate
> > concept of objects. Everything in Mathematica is just a "expression",
> > meaning the texual code you type.
>
> No, "List[1,2,3]" is just a couple of black pixels on the screen.
>
> Pascal

While I won't make a stand for or against constructor style output of
objects, their use is important in the /design/ of programs. Consider,
for instance, the typical Fibonacci definition:

fib(0) = 0
fib(1) = 1
fib(n) = fib(n-1) + fib(n-2)

A key to understanding the inefficiency of the naive recursive
implementation is seeing the repeated computation:

fib(n) = fib(n-1) + fib(n-2)
         = [fib(n-1-1) + fib(n-1-2)] + [fib(n-2-1) + fib(n-2-2)]
         = fib(n-2) + [fib(n-3) + fib(n-3)] + fib(n-4)

(You could get it by tracing, too, I suppose, but I'd hope that this
is at least as clear to most programmers.) That's not really an
argument for any programming language practice, but it's clear that
working with expressions this way /is/ something that people actually
do.
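The repeated computation being pointed out is easy to make visible by
counting calls (a throwaway sketch, not from the post; Python for
illustration):

```python
from functools import lru_cache

calls = 0

def fib(n):
    # Naive doubly-recursive definition, counting every invocation.
    global calls
    calls += 1
    return n if n < 2 else fib(n - 1) + fib(n - 2)

fib(20)
print(calls)  # 21891 calls, because subproblems are recomputed

@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

fib_memo(20)
print(fib_memo.cache_info().misses)  # only 21 distinct subproblems
```

The expansion above is exactly why the naive count explodes: fib(n-3)
and friends appear more than once in the expression tree.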

One of DrScheme's language options is constructor style output, and I
wouldn't be surprised if more than a few students (if they know about
the feature) have found it helpful for understanding that "pairs can
make all sorts of things!"

(I thought that that feature could print (1 2 3 4) as (cons 1 (cons 2
(cons 3 (cons 4 ())))), but I can only seem to get (list 1 2 3 4) now.
(1 . 3) prints as (cons 1 3), though.)

//J
From: Patrick May
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <m2k5m26met.fsf@spe.com>
Pascal Costanza <··@p-cos.net> writes:
> Xah Lee wrote:
>> In Mathematica, there's no such steps of concept. The list
>> “List[1,2,3]” is just a list as is. There is no such intermediate
>> concept of objects. Everything in Mathematica is just a
>> “expression”, meaning the texual code you type.
>
> No, "List[1,2,3]" is just a couple of black pixels on the screen.

     Are you sure?  They appear in a subdued white on mine.

Regards,

Patrick

------------------------------------------------------------------------
S P Engineering, Inc.  | Large scale, mission-critical, distributed OO
                       | systems design and implementation.
          ···@spe.com  | (C++, Java, Common Lisp, Jini, middleware, SOA)
From: Rainer Joswig
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <joswig-90EF8F.14260522012008@news-europe.giganews.com>
In article 
<····································@n22g2000prh.googlegroups.com>,
 Xah Lee <···@xahlee.org> wrote:

...

> 3. The syntax is a bit quirky. In particular, a mathematica programer
> sees that sometimes list is written as “(list a b c)”, but sometimes
> there's this oddity “'(a b c)” (which is syntactically equivalent to
> “(quote a b c)”

No, it is not.

CL-USER 1 > (read-from-string "'(a b c)")
(QUOTE (A B C))
8

...

> I was trying to write a emacs major mode and also writing a tutorial
> about it, in the past week. In this process, i find it quite hard.
> Writing a complete, robust, major mode for public consumption will
> require knowledge of:
> 
>     * mode hooks (basic concept)
>     * font lock system (some detail)
>     * font and faces intrastructure (basic concept)
>     * buffer local variables (datil)
>     * regex (detail)
>     * keymaps (detail)
>     * syntax table (detail)
>     * abbreviation system (basics)
>     * Emacs mode conventions (a lot elisp coding experience)
> 
> (the list gives a rough idea, not meant to be complete or exact in
> some way)

Right. There is quite a lot to learn to write a good
mode. For some modes there seem to be lots of
people involved and/or some experts spend significant
time improving them.

> Mathematica, being high-level as it is, essentially merged the concept
> of read-syntax, print-syntax, and objects, into just one concept of
> “What you see is what you get” _Expressions_ (i.e. the code programer
> types). That is, the code is the object, the object is the code. In
> other words, there is no such “internal” concepts as objects. What you
> type is the code, as well as the meaning.
> 
> For example, let's say a list:
> 
> In lisp:
>   (list 1 2 3)
> 
> Mathematica:
>   List[1,2,3]
> 
> In lisp, we say that the text “(list 1 2 3)” is the read syntax for
> the lisp's list object.

No. (LIST 1 2 3) is a textual representation of a function call
to LIST with data 1 2 3. The list itself is just written as (1 2 3).
Lists have a data syntax that can be printed and read.
Like arrays, strings, vectors, numbers, characters, ...

> So, the lisp interpreter reads this text, and
> turns it into a list object. When the lisp system prints a list
> object, it prints “(list 1 2 3)” too, since list object's read syntax
> and print syntax is the same.


? (defun test ()
       (print '(1 2 3))
       (print (list 1 2 3))
       'done)
TEST
? (test)

(1 2 3) 
(1 2 3) 
DONE

> 
> In Mathematica, there's no such steps of concept. The list
> “List[1,2,3]” is just a list as is. There is no such intermediate
> concept of objects.

What are two lists then, internally, if you read them?
Define two variables. Set both to some lists. What are these lists
now? Say you pass one of these lists to a function.
Does the function get a copy? The same list? Don't know?

If you print the Mathematica list and then read it back,
what do you get? A new list (why?)? The old list? Don't know?
What about symbols? Print a symbol. Read the symbol back.
Is it a different symbol? The same? Don't know?

We talk of 'objects' as the data in a running (Lisp) system.
See below. Objects, because we don't see chunks of bits.

> Everything in Mathematica is just a “expression”,
> meaning the texual code you type.

Even Mathematica has the idea of data 'objects':
symbols, numbers, lists, strings, expressions, ...
The documentation of Mathematica is a huge mess when
it comes to describing its basic data types.
Mathematica might be less precisely described than
Common Lisp. Common Lisp has a spec (with its
own problems), which allows us to have different and
independent implementations of the programming language
Common Lisp. Mathematica is not described in a way
that one could 'easily' implement the thing independently.
Plus Mathematica is focused on symbolic computation
with lots of stuff added to it somehow. For a Lisp user
it is similar to an in-house, closed implementation of a
hacked special-purpose Lisp dialect. Common Lisp is a
general-purpose standardized programming language
whose standard lays down how a lot of things
should work (printing, reading, interpreter, compiler,
data types, ...). I'm not saying that it is necessarily
beautiful, but Common Lisp describes the language
in much more detail, with more basic facilities.

> 
> This is a high-level beauty of Mathematica.

With an implementation that is partly in some low-level language.
You see the beauty of the language. The Lisp programmer
sees the details and the implementation, too.

Part of it comes from its 'narrow' main domain: symbolic
mathematics (with lots of stuff added somehow). Lisp has
a different purpose. Symbolic computation
is just one. Lisp would let you implement Mathematica
with all of its functionality: windows, notebooks, language,
3d graphics, OpenGL interface and so on. This has
been done for systems like Macsyma or Axiom,
where Lisp is the implementation language for a system
like Mathematica (including all the UI stuff).

> 
> Now, one might think further: if a modern language is to be high
> level (such as Mathematica, or to a lesser degree PHP, Javascript,
> Lisp), but also needs to deal with general programing
> tasks such as writing a compiler, operating system, or device
> drivers (such as those usually done with “compiled langs” like C,
> Java, Haskell, and lisp too), then is lisp's internal concept of “object”
> necessary in a language such as lisp?
> 
> Does anyone know?

First: a notation is always an external representation.

What we know about languages (syntax, parsers, ...) determines
how our programming languages look. The Lisp model
of a programming language is a bit odd, in that it has been
shaped by the needs of working interactively with lists and
other data.

Second: there is a model of data for the running Lisp system.
We also have a model in our minds of how the data is organized
in a Lisp system. We 'see' how the internal data gets
written to the output, and we can also use functions
to manipulate the data and observe the effects.

So here comes the first complication: there are 'objects'.
What are objects? Objects have an identity
(so you can have two arrays and you can observe that
they are different arrays - are they EQ or not?), a type
(you can ask every object for its type), some
properties (say, the array dimensions and the element-type
of an array) and the data ( the array contents).

So, objects have:

* identity
* a type
* may have some internal information
* data 

CONS
* you can check if two cons cells are EQ or not
* type is cons
* no internal information
* CAR and CDR cells are the data

But there are other data types like that: strings, characters, hashtables,
streams, pathnames, ...
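The identity/type/data distinction above can be sketched in a short listener session (a sketch; the result comments follow the Common Lisp standard, printed forms may vary by implementation):

```lisp
;; Two cons cells with equal contents are still distinct objects.
(defvar *a* (cons 'foo 'bar))
(defvar *b* (cons 'foo 'bar))

(eq *a* *a*)      ; => T    ; an object is EQ to itself (identity)
(eq *a* *b*)      ; => NIL  ; two distinct cons cells, despite equal contents
(equal *a* *b*)   ; => T    ; EQUAL digs into the CAR and CDR data
(type-of *a*)     ; => CONS ; every object can be asked for its type
```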

Second complication: there is some data that is not really
an object like above: numbers are an example.

Third complication: there are objects where you don't get access
to the data. An example is a compiled function. It is
a real data object, but there is no function to set or get
the machine instructions of that function. You can
create function objects and call them later. But you can't
access the machine code. You also cannot print them in a
way that lets you read them back. You would need to
be able to write and read the machine instructions - which
does not make much sense for text output.

So, now we have an idea about the internal data and about
internal objects. Let's print them:

a number: prints based on its type
  integer: 64568405
  ratio:  1/10
  complex: How do we print a complex number?
  Floats: how do we print a float number? That's not that easy.

symbol:
  usually we want:  FOO
  but symbols can be arbitrary strings, so sometimes we need:
    |this sentence is REALLY a symbol|

CONS:
  we print:  (foo . bar)
  what we don't print: the identity and the type.

LISTs, that's odd
  there are no real lists. Lists are conses or NIL.
  We could print them as conses:
     (A . (B . (C . NIL)))
  But for convenience we print (A B C)

PATHNAMES?
  tough. Pathnames are different on various platforms.
  One could give up and say they are strings.
  Common Lisp has pathnames as objects with data
  (host, directory, name, version, ...).
  #P"/foo/bar/test.lisp"

FUNCTIONs ? How would you print the result of:
  (compile nil (lambda (a) (+ a 10))) ??
  It is probably compiled to some machine code.
  so we do just this: #<Function 15 2009528A>

and more...
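The cases above can be sketched as a listener session (a sketch; exact output, especially for the compiled function, is implementation-dependent):

```lisp
(prin1 64568405)                        ; 64568405
(prin1 1/10)                            ; 1/10
(prin1 #c(1 2))                         ; #C(1 2)
(prin1 'foo)                            ; FOO
(prin1 '|this sentence is REALLY a symbol|)
(prin1 (cons 'foo 'bar))                ; (FOO . BAR)
(prin1 '(a . (b . (c . nil))))          ; (A B C)  ; list notation, for convenience
(prin1 #p"/foo/bar/test.lisp")          ; #P"/foo/bar/test.lisp"
(prin1 (compile nil '(lambda (a) (+ a 10))))
                                        ; #<Function ...>  ; unreadable object
```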

So the whole business of printing is quite complex and
we have to make all kinds of decisions how different
data gets printed. For some data we don't really
know how to print it (example: functions). For some data
we don't want to print all the detail. For some data we want
to optimize printing (lists, ...) based on some decisions.
So, Common Lisp provides a machinery for all this and
more.

Decisions the user has to make:

 do you want to print the data in a way that can be read back or not?
   for a string this means: double quotes around the string or not?

 do you really want to print the data at full length?
   do you really want to print a 1000 x 1000 x 1000 array to
   your text console?

do you really want to print the data in full depth?
  say, you have a tree that has a depth of 100, do you want to print
  all levels?

and more...
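Common Lisp exposes exactly these user decisions as printer control variables; a sketch:

```lisp
;; Readable or not: WRITE's :escape argument (cf. PRIN1 vs PRINC).
(write "hello" :escape t)     ; prints "hello"  (with double quotes, readable)
(write "hello" :escape nil)   ; prints hello    (no quotes)

;; Length: how many elements to print before eliding with ...
(let ((*print-length* 3))
  (prin1 '(1 2 3 4 5 6)))     ; prints (1 2 3 ...)

;; Depth: how many nesting levels to print before eliding with #
(let ((*print-level* 2))
  (prin1 '(1 (2 (3 (4))))))   ; prints (1 (2 #))
```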

Now we come to reading.

How do we read data? READing means turning an external
representation into an internal one.

423423 -> will be read as an integer number
(A B C) -> will be read as a list. READ will create the conses and symbols (if necessary).
 |this REALLY is a symbol| : we get a fresh new symbol if there isn't one;
   if there is one already, we get that one
"this surely is a string":  we get a fresh new string object
#p"/foo/bar.lisp":  we get a pathname
#<Function 15 2009528A> : what now? we can't read that.

Complication 1: we can't read everything.
  We can only read what has enough information in the textual
  form, so that we can create data from it.
Complication 2: we don't get the original data objects back.
Complication 3: for symbols we get a unique object.
Complication 4: there is often not enough information in the printed
   data to get a data object back, which has all the properties of
   the object we printed.
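Complications 2 and 3 can be sketched with READ-FROM-STRING (a sketch; results follow the standard semantics of reading and interning):

```lisp
;; Reading creates fresh conses, but symbols are interned (unique).
(let* ((x '(a b c))
       (y (read-from-string (prin1-to-string x))))
  (list (eq x y)                     ; NIL -- complication 2: a new list object
        (equal x y)                  ; T   -- same structure, though
        (eq (first x) (first y))))   ; T   -- complication 3: the symbol A is unique
;; => (NIL T T)
```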


All of the above (PRINTing, internal representation, READing)
has to fit together in some form of machinery that is:
* easy to use
* extensible
* understandable

You can see that the above is REALLY difficult and that
lots of design choices, made over a period of
evolution, are involved. A huge hack? Maybe. But there is
no easy and obvious solution. It is also not possible
to see each dimension in isolation. One can't just focus
on designing the PRINT functionality - one needs to
know what data types there are and one also needs to
know what kind of READ functionality one wants to provide.
The internal data representation is also not independent
from the other subsystems (say I/O). Lisp's model
is already higher-level than what you get from your
operating system (chunks of memory and interfaces
that use C data conventions (mostly)).
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fn52kf$47e$2@news.net.uni-c.dk>
Den Tue, 22 Jan 2008 14:26:05 +0100 skrev Rainer Joswig:

> In article
> <····································@n22g2000prh.googlegroups.com>,
>  Xah Lee <···@xahlee.org> wrote:
>> 3. The syntax is a bit quirky. In particular, a mathematica programer
>> sees that sometimes list is written as “(list a b c)”,
>> but sometimes there's this oddity “'(a b c)” (which is
>> syntactically equivalent to “(quote a b c)”

I wouldn't mind or notice if you just ignored Xah Lee as is just and 
proper, but your newsreader is broken. It doesn't set Content-type: 
properly, yet it quotes !ASCII characters as is, and Xah happens to use 
ISO-2022-JP and quote characters 「」, so your reader is sending malformed 
posts. Please fix that.

Cheers,
Maciej
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <4796ddff$0$11586$607ed4bc@cv.net>
Maciej Katafiasz wrote:
> Den Tue, 22 Jan 2008 14:26:05 +0100 skrev Rainer Joswig:
> 
> 
>>In article
>><····································@n22g2000prh.googlegroups.com>,
>> Xah Lee <···@xahlee.org> wrote:
>>
>>>3. The syntax is a bit quirky. In particular, a mathematica programer
>>>sees that sometimes list is written as “(list a b c)”,
>>>but sometimes there's this oddity “'(a b c)” (which is
>>>syntactically equivalent to “(quote a b c)”
> 
> 
> I wouldn't mind or notice if you just ignored Xah Lee as is just and 
> proper, but your newsreader is broken. It doesn't set Content-type: 
> properly, yet it quotes !ASCII characters as is, and Xah happens to use 
> ISO-2022-JP and quote characters 「」, so your reader is sending malformed 
> posts. Please fix that.

You seem to have appointed yourself the Style Editor of comp.lang.lisp. 
Thanks! I am greatly enjoying your performance. As an encore, ... 
checking ... OK, the tide comes in at 5:43am, I'd love to see you stop it!

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: Peter Hildebrandt
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t5fuszylx6i8pv@babyfoot>
On Tue, 22 Jan 2008 16:37:51 +0100, Maciej Katafiasz <········@gmail.com>  
wrote:

> Den Tue, 22 Jan 2008 14:26:05 +0100 skrev Rainer Joswig:
>
>> In article
>> <····································@n22g2000prh.googlegroups.com>,
>>  Xah Lee <···@xahlee.org> wrote:
>>> 3. The syntax is a bit quirky. In particular, a mathematica programer
>>> sees that sometimes list is written as “(list a b c)”,
>>> but sometimes there's this oddity “'(a b c)” (which is
>>> syntactically equivalent to “(quote a b c)”
>
> I wouldn't mind or notice if you just ignored Xah Lee as is just and
> proper, but your newsreader is broken. It doesn't set Content-type:
> properly, yet it quotes !ASCII characters as is, and Xah happens to use
> ISO-2022-JP and quote characters 「」, so your reader is sending  
> malformed
> posts. Please fix that.

Funny thing:  In my reader (Opera M2) Rainer's response looks just fine.   
Yours OTOH seems to contain some weird stuff ...

Peter

> Cheers,
> Maciej



-- 
Using Opera's revolutionary e-mail client: http://www.opera.com/mail/
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fnb9u8$qg2$1@news.net.uni-c.dk>
Den Thu, 24 Jan 2008 15:35:13 +0100 skrev Peter Hildebrandt:

> Funny thing:  In my reader (Opera M2) Rainer's response looks just fine.
> Yours OTOH seems to contain some weird stuff ...

It's probably sniffing when there's no Content-type given, so it comes 
out alright. Whereas my post has the header forcing a particular 
encoding, so you see what I see, ie. garbage.

Cheers,
Maciej
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <da297d6d-0115-4c19-83dd-c316a2449fbd@v46g2000hsv.googlegroups.com>
2008-01-22

sometimes it's quite frustrating when discussing things in newsgroups.

As most know, newsgroups discussions are characterized by ignorance,
massive confusion, one-upmanship, and in general a showcase of the
loneliness of the tech geekers in technological societies ...

but in this particular thread, a contributing factor is that most
people don't know Mathematica, to the degree that many have never seen a
single line of Mathematica code, and of the very few who have
actually used Mathematica (maybe in their school days in the computer
lab), most don't have a basic knowledge of programing it.
(Mathematica 6 sells for over two thousand dollars i think (check their
web to be sure), and before that version, it was over one thousand
dollars, and i guess it has sold perhaps ten or hundreds of times more
than all the lisps sold by all lisp companies combined)

This doesn't mean that all newsgroup posters are morons, or that all
messages are of low quality, or that they are not valuable. (many,
perhaps as much as half of posts in comp.lang.* groups, are highly
technical, and highly valuable to the questioner or participants) But
what i'm trying to say here, is that a significant portion of
messages, esp in debate or opinion oriented discussions (which
comprise perhaps 50% of posts), miss the mark
completely. (as in this thread) Often this results in digressions and
debates over technical details and facts.

In this thread, my question is about the necessity of lisp's object
concept. Most simply don't understand this question. This question
certainly isn't a typical computer language design question. (the
bag of problems typical language designers have, esp compiler
experts, is exemplified by a recent post of Ray Dillinger
(comp.lang.lisp, 2008-01-09, "Re: Java as a first language "considered
harmful"" (http://groups.google.com/group/comp.lang.lisp/msg/
365cb58ce1b65914)))

What do i mean by "necessity of lisp's object concept"? I thought i
gave sufficient explanation in my post, but so far nobody actually
addressed it squarely. The closest is Rainer Joswig, who gave a
valuable and technical examination of lisp's objects and why it is
necessary. (i still need to digest its technical details) Though
in my opinion it does not address this problem head on, because it is
addressed from a lisp expert's point of view with not much
consideration of computer languages in general, and who is also an
ardent lisp advocate.

To help understand the problem and related issues, perhaps we can ask
a series of concrete questions that are either a paraphrase of the
original question or very much related to it.

* Do lisp variants or descendants, such as Logo, Dylan, (and perhaps
tcl), all intimately require the programer to have the notion of
objects? And how close to, or remote from, lisp's notion of object is
theirs? Do they actually use the term "object"? (note here
that the word "object" is not to be confused with the term "object" in
a lang's Object-Oriented Programing facility. The "object" in our
context is a certain "thing" or "entity" type of concept that the
programer is required to understand to be proficient in using the
language. (almost all langs have certain "internal" or "behind-the-
scenes" concepts that are exposed to the programer, particularly
"compiled languages", but as far as i know they do not
all require the programer to understand a notion of an "entity" that
represents all things/objects/items/living-units of the language. (some
will just speak of data types)))

* Consider one of emacs lisp, common lisp, or scheme lisp. Suppose we
make purely documentation changes, such that all mention of objects
is deleted. In their place, if necessary, the manual talks about
expressions instead, or uses wordings so that all discussion of lisp
objects is replaced by speaking about particular instances of their print
syntax. (in other words, do it like Mathematica does) What is the
practical effect of this? Is this reasonably doable?

* Do languages like Haskell, Concurrent Clean, ML, erlang, or OCaml/
F# have a concept of a language's running unit a la lisp's objects?

-------------

A few notes about the idea of a "behind the scenes model" of a
language.

First of all, all computer languages as we know them necessarily have a
textual representation (for simplicity of discussion, we exclude here
GUI based languages (e.g. Prograph), databases, spreadsheets, cellular
automata, etc.) Now, since all these languages run on silicon chip
based hardware (we exclude exotic or theoretical hardware such as
quantum, DNA, optical etc), inevitably there is a concept of "process"
when the program runs, and this process may be further broken down
as being constructed out of abstract entities that correspond to the
textual elements of the language the programer types. So, it is very
silly to answer to the effect that "yes lisp's object is necessary
because there is a non-textual hardware it runs on".

Another point, perhaps obvious, to note is that many languages,
especially high-level ones such as Mathematica, PHP, Javascript, do
not have such concepts of entities (a la lisp's objects). Also note,
as stated in my first post, that many other languages do have a
certain "behind the scenes" language model that the programer needs to
understand before becoming proficient with the language. For example,
Java has its own "objects" and "references". Perl has "references" and
"file handles". C and C++ have pointers... (i don't know C or C++, so
i may be wrong here on terminology or technicality)

I think another important thing we need to realize is what
constitutes the word "necessity" in the phrase "the necessity of lisp's
object concept". For example, in almost all languages, a programer
can have a working knowledge, perhaps be employed as a junior programer,
without understanding the "behind the scenes" concepts of these
languages. In fact, i assert that one can program in lisp and write
large scale or a significant amount of software, without actually having
a good understanding that lisp's expressions are compiled into
entities called lisp objects. (certainly i have programed rather a lot
of useful functions in emacs lisp before getting a clear idea of the
terminology "lisp object" used throughout its manual) For another
example, in Java, i venture to say that the majority of professional java
programers ("majority" means greater than 50%; "professional" means
those who make a living primarily by coding) do NOT understand any
of the behind-the-scenes java language model. (specifically, what i
mean by java's "behind-the-scenes language model" is what you
acquire when you read the Java Language Specification; many of its
terminologies and concepts are also used throughout the Java manual (aka
the Java API) without much explanation)

Given a computer language, to what extent does it have "behind the scenes
language model" concepts, and to what degree is understanding those
concepts necessary to be practically proficient in the language?
I think these are all relevant questions.

(I think this is an extremely important realization: modern
languages are progressively getting higher level, and they
have less and less of any "behind the scenes language model" concept.
Or, put in a somewhat equivalent way, a programer can be highly
proficient without needing to understand the language's technical
specification. This "behind the scenes language model" concept is
intimately tied to compiler/speed concerns that are exposed to the
programer in the language. (this i've detailed somewhat in my essay
"jargons and high level languages"))

-------

Thanks Rainer and others for correcting some errors about my statement
of lisp's "list" and "quote" in relation to the read/print syntax.
(and thanks Rainer for the technical exposition on lisp's objects)

Rainer wrote:
「What are two lists then internally if you read them?  Define two
variables. Set both to some lists. What are these lists now?  Say you
pass one of these lists to a function.  Does the function get a copy?
The same? Don't know?  If you print the Mathematica list and then read
it back.  What do you get? A new list (Why?)? The old list? Don't
know?  What with symbols. Print a symbol. Read the symbol?  Is it a
different symbol? The same? Don't know?」

I don't quite understand your questions.

One way to understand Mathematica is via the so-called term-rewriting
and/or graph-rewriting of computer "scientists". When you think of
these computing models, you'll see that there's no "behind the
scenes" language model like Lisp's Objects, or Java's Objects and
References and other types, or C's pointers... etc.

「Say you pass one of these lists to a function.  Does the function get
a copy?」

Think of term-rewriting, then it's easy to see that in Mathematica, with
respect to the so-called "getting a copy" "behind-the-scenes" concepts
in most computer languages, Mathematica does get a copy, and
always.

I do not really know what's so-called "denotational semantics" by
computer "scientists". However, for the past decade, i always inferred
that the term denotational semantics describes Mathematica. Basically,
the gist of the idea to me is that there's no "behind the scenes
language model", which exists for just about every other language,
each idiosyncratic and totally incompatible with the others. The word
"denotational", means to me that the language works by notations,
denote its meaning by notations, which fits my concept of what
Mathematica is. In other words, a language that has denotational
semantics just means to me that it is like a live, computable system
of mathematics as a _formal system_. (for those of you don't know, a
"formal system" in the context of mathematics, and in our context of
programing languages, is a language whose computation model is based
on string transformation or similar notational model, and all meanings
of the language are completely represented by just the symbol
sequences (no "behind the scenes language model" stuff). (another way
to easily understand this, for people who have never touched term-rewriting
based systems, is to think of programing in a language that understands
strings and transformations of strings but nothing else. (no numbers,
for example)))

(i'd have answered many of my questions if i knew Haskell well... but
never did)

This concludes my rambling for now. I spent a couple of hours writing
this (as with almost all my postings) and aint no gonna do major
editing work on it for the moment. I will, however, probably do so
when i put this on my website. (fucking 1709 words now, wee!)

  Xah
  ···@xahlee.org
∑ http://xahlee.org/
From: Rainer Joswig
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <joswig-05A25F.09373323012008@news-europe.giganews.com>
In article 
<····································@v46g2000hsv.googlegroups.com>,
 Xah Lee <···@xahlee.org> wrote:

...

I think you are seriously confused when it comes to programming
language models. Don't take that as offensive, just as a recommendation
to clear up basic things like: 'expression', 'object', 'process',
'interpretation'. Part of that may come from the fact that Mathematica only
has a weak specification of its core programming language.

> Thanks Rainer and others for correcting some errors about my statement
> of lisp's "list" and "quote" in relation to the read/print syntax.
> (and thanks Rainer for the technical exposition on lisp's objects)
> 
> Rainer wrote:
> 「What are two lists then internally if you read them?  Define two
> variables. Set both to some lists. What are these lists now?  Say you
> pass one of these lists to a function.  Does the function get a copy?
> The same? Don't know?  If you print the Mathematica list and then read
> it back.  What do you get? A new list (Why?)? The old list? Don't
> know?  What with symbols. Print a symbol. Read the symbol?  Is it a
> different symbol? The same? Don't know?」
> 
> I don't quite understand your questions.
> 
> One way to understand Mathematica is via the so-called term-rewriting
> and/or graph-rewriting of computer "scientists". When you think of
> these computing models, you'll see that there's no "behind the
> scenes" language model like Lisp's Objects, or Java's Objects and
> References and other types, or C's pointers... etc.

Of course there is. Term rewriting is only part of the story
in Mathematica. Mathematica has control constructs,
functions, and all kinds of weird stuff.
What is a term? A term is recursively
defined as either an object or a construct with subterms.
Objects are symbols, strings, numbers, lists, and so on.
There are also functions and 'pure functions' (like
anonymous functions in Lisp). You can create
lists of functions, just like in Lisp.

A Mathematica runtime system is quite similar
to what a Lisp system does.

Example:

Find the equivalent
in Mathematica (which has functions, global variables,
local variables, data constructors, mutable data structures,
Set, Equal, SameQ, List, ...) for the following:

You can set global variables with values:

(setf foo1 'bar)

You can define functions with parameters and code:

(defun foo (a)
   (list a a))

You can call functions:

(foo foo1)

You get results from these functions.

(foo foo1)  ->  (bar bar)

see below for more

> 
> 「Say you pass one of these lists to a function.  Does the function get
> a copy? 」
> 
> Think of term-rewriting, then it's easy to see that in Mathematica, with
> respect to the so-called "getting a copy" "behind-the-scenes" concepts
> in most computer languages, Mathematica does get a copy, and
> always.

That's not true. If you set the nth element of a list,
the original list will be modified. Not a copy.

If you pass that list around and change it, the original
list will be modified. Not a copy.

See above. The list (bar bar). What are the symbols?
Are they copies of the symbol, or is there just one
symbol that the list points to from two places?

Do the equivalent to the following in Mathematica:

(setf bar1 (list 'bar-symbol))

(setf bar2 (foo bar1))

(print bar2)

(setf (first bar1) 'new-bar-symbol)

(print bar1)

(print bar2)

Now, what does bar2 print? Is bar2 changed or not?


See also:

http://reference.wolfram.com/mathematica/tutorial/MakingListsOfObjects.html
http://reference.wolfram.com/mathematica/tutorial/PrinciplesOfEvaluation.html

Also note that above uses the word 'object'.
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t5dvg6oput4oq5@pandora.alfanett.no>
På Wed, 23 Jan 2008 06:45:51 +0100, skrev Xah Lee <···@xahlee.org>:

I think you are mixing up objects and object oriented programming.
Calling atomic entities in a language objects predates object oriented
programming.
For the first 30 years or so there was no support for object oriented
programming in lisp.
Object oriented programs are data driven. That is, the
functions/procedures/methods/messages (whatever) all relate to the object.
This is what I will assume you mean.

Correct me if I am wrong, but the support for windows programming in
Mathematica seems weak.
Sure, you have the tree based display subsystem in the notebook, but it
works essentially to format a document. In particular, it is processed
sequentially. (like HTML)

This brings me to one of the weaknesses of relational logic. It is not
just in pattern matching and logic programming but also in relational
databases, by the way. It is awkward to represent one-to-many and
many-to-many relationships.

The data flow in a windows app means you can wander from one state to
another at random.
This is very difficult to implement with a relational data model.
Thus even languages like Prolog have implemented objects. If for no other  
reason than to facilitate writing windows interfaces.

There are problems for which object orientation is the simplest mode.
Lisp is a multi paradigm language. So it supports procedural top-down
programming, bottom up programming, and object oriented programming.
There are no style guides in Lisp, and probably as many opinions about the
right way of writing Lisp programs as there are people.

For instance Rainer Joswig does most of his work using the CLOS system,
while I am much more reluctant to use it.

For the record, describing the Mathematica language as a form of Lisp seems
wrong to me.
Sure, it inherits its idea of storing data in trees and using prefix syntax
from Lisp. The essential difference to me is how it handles functions.
Mathematica has relations, not functions. So in order to see what code to
call, you have to search the database and find which of the definitions
makes the closest match. Like in Prolog, order left to right and top to
bottom is important.

--------------
John Thingstad
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t5dvl5b7ut4oq5@pandora.alfanett.no>
På Wed, 23 Jan 2008 13:54:32 +0100, skrev John Thingstad
<·······@online.no>:

> Thus even languages like Prolog have implemented objects.

That should be classes.

--------------
John Thingstad
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pembob531qa16@corp.supernews.com>
John Thingstad wrote:
> Correct me if I am wrong, but the support for windows programming in
> Mathematica seems weak.

Windows is Mathematica's native platform. Other platforms (like Linux) have
had missing features.

> Sure, you have the tree based display subsystem in the notebook, but it
> works essentially to format a document.

Notebooks also typeset code and maths and provide interactive graphics.

> In particular it is processed sequentially. (like HTML)

Notebook cells can be evaluated out of order and, indeed, they are typically
evaluated and edited manually in any order with a few "initialization"
cells that get called once before everything else.

> The data flow in a windows app means you can wander from one state to
> another at random.
> This is very difficult to implement with a relational data model.
> Thus even languages like Prolog have implemented objects. If for no other
> reason than to facilitate writing windows interfaces.

I don't believe Mathematica's language would be inherently bad for
application programming, although it lacks an expressive static type system
like OCaml's. However, in the absence of static typing you need strong support
for debugging and unit-testing frameworks, which Mathematica has also lacked
(although I believe it is now being addressed).

> For the record, describing the Mathematica language as a form of Lisp seems
> wrong to me.
> Sure, it inherits its idea of storing data in trees and using prefix syntax
> from Lisp.

and term rewriting (macros).

> The essential difference to me is how it handles functions. 
> Mathematica has relations not functions.

Well, rewrite rules.

> So in order to see what code to 
> call you have to search the database and find which of the definitions
> makes the closest match. Like in Prolog order left to right and top to
> down is important.

Equivalently in Lisp, people can and do redefine random stuff and things at
run-time. As they say "where will the program counter go next? Who
knows...". That is the same problem of "rats nest" programs, IMHO.

Incidentally, the Mathematica expression:

  {1, 2, 3}

is actually an array/vector rather than a linked list. A true list would be:

  {1, {2, {3, {}}}}

Mathematica has very little support for lists and is rather ill-suited to
handling them, e.g. it segfaults from stack overflow after an order of
magnitude fewer recursions than any other language I know.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <4796e4d8$0$11533$607ed4bc@cv.net>
Xah Lee wrote:

...maciej made me snip this....

> What prompted me to write this post, is that yesterday i thought a bit
> about lisp's objects, why Mathematica doesn't have it, and realized:
> 
> Mathematica, being high-level as it is, essentially merged the concept
> of read-syntax, print-syntax, and objects, into just one concept of
> “What you see is what you get” _Expressions_ (i.e. the code programer
> types). That is, the code is the object, the object is the code. In
> other words, there is no such “internal” concept as objects. What you
> type is the code, as well as the meaning.
> 
> For example, let's say a list:
> 
> In lisp:
>   (list 1 2 3)
> 
> Mathematica:
>   List[1,2,3]

Well, I do not know Mathematica, so you'll have to help me.

If I say (eq (list 1)(list 1)) in Lisp I get nil, aka, nope, they are
not EQ, EQ being the test for object identity. (They happen also not to
be EQL, just EQUAL (the one that digs into structure)).
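[Editorial note: the identity-versus-structural-equality distinction described here is not Lisp-specific. A rough Python analogue, with `is` playing the role of EQ and `==` the role of EQUAL, looks like this:]

```python
a = [1]
b = [1]

# Two separately constructed lists: structurally equal, not identical.
print(a == b)  # True  -- like EQUAL: digs into structure
print(a is b)  # False -- like EQ: tests object identity

c = a          # one object, two names
print(a is c)  # True
```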

What happens in Mathematica?

The advantage to me in Lisp is that I can have two lists starting with
the same textual expression and they will, well, be two different lists,
not one. I can mutate one without mutating the other, holding references
to each in different places, and then see different lists when those two
places are compared, or (the flip side) see "the same" list even when
the contents of the list have changed. ie, I might have two Lisp
"places" bound to the same list and want changes to be visible
regardless of the binding.

Mind you, this all comes down to the question of state. Does Mathematica
have that concept, or is it purely functional? (not that I am conversant
with FPL either). If not, sure all list[1,2,3]s can be collapsed into
one, no need for object identity, there it is, list[1,2,3]. But in Lisp
we can have two variables bound to the same list:

  (setf x (setf y (list 1 2 3)))

...and then (delete 2 y) and see the result in x because x is "/the
same/ list as y", it is /not/ "the list of 1, 2, and 3".

Is that necessary? It is what it is, which is different than Mathematica
it seems, but I do not see an interesting issue here.
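[Editorial note: the shared-structure behavior in the setf example can be sketched in Python, where assignment likewise copies references rather than values:]

```python
y = [1, 2, 3]
x = y            # x and y now name the same list object

y.remove(2)      # mutate through one name...
print(x)         # [1, 3] -- ...and the change is visible through the other

z = list(y)      # an explicit copy breaks the link
z.append(99)
print(y)         # [1, 3] -- the original is unaffected
```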

> 
> In lisp, we say that the text “(list 1 2 3)” is the read syntax for
> the lisp's list object. So, the lisp interpreter reads this text, and
> turns it into a list object. When the lisp system prints a list
> object, it prints “(list 1 2 3)” too, since list object's read syntax
> and print syntax is the same.
> 
> In Mathematica, there's no such steps of concept. The list
> “List[1,2,3]” is just a list as is. There is no such intermediate
> concept of objects. Everything in Mathematica is just a “expression”,
> meaning the texual code you type.
> 
> This is a high-level beauty of Mathematica.

No beauty either way, just different. I think you have a non-issue here,
either you like state or do not, Lisp does, Mathematica might not or
might only with globals bound to vectors (making that up as I go).

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
 in the evening, die content!"
                    -- Confucius
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <867553c8-e4c6-4eb8-b5d1-80ce3dd8719e@l32g2000hse.googlegroups.com>
Hi Kenny!

Ken Tilton <···········@optonline.net> wrote:

「If I say (eq (list 1)(list 1)) in Lisp I get nil, aka, nope, they are
not EQ, EQ being the test for object identity. (They happen also not
to be EQL, just EQUAL (the one that digs into structure)).

What happens in Mathematica?
」

In[5]:=
Equal[List[1],List[1]]

Out[5]=
True

This is the beauty of mathematica. Simple, easy to understand. What
you see is what you get. No behind the scenes stuff.

「The advantage to me in Lisp is that I can have two lists starting
with the same textual expression and they will, well, be two different
lists, not one.」

This is exactly the kind of confusing, behind-the-scenes lang model
i mean. Depending on the language, they can get quite convoluted. (in
particular, Java, with its access specifiers, constructors, objects,
arrays, references etc.)

「I can mutate one without mutating the other, holding references to
each in different places, and then see different lists when those two
places are compared, or (the flip side) see "the same" list even when
the contents of the list have changed. ie, I might have two Lisp
"places" bound to the same list and want changes to be visible
regardless of the binding.」

Note here that Mathematica language programing doesn't have any
concepts of memories, pointers, or references in any way. So, this
means, you can't like change one entity and expect some other entity
will also be changed mysteriously or magically.

Remember, everything is textual, “what u see is what u get”; think of it
as a formal mathematics system. (formal means FORMal. The word
“formal” is an example of bad/mutated terminology that creates endless
confusion)

Kenny wrote:
「Mind you, this all comes down to the question of state. Does
Mathematica
have that concept, or is it purely functional? (not that I am
conversant
with FPL either). If not, sure all list[1,2,3]s can be collapsed into
one, no need for object identity, there it is, list[1,2,3]. But in
Lisp
we can have two variables bound to the same list:

  (setf x (setf y (list 1 2 3)))

...and then (delete 2 y) and see the result in x because x is "/the
same/ list as y", it is /not/ "the list of 1, 2, and 3".

Is that necessary? It is what it is, which is different than
Mathematica
it seems, but I do not see an interesting issue here.
」

Excellent example to illustrate the issue. I'd pick on your
characterization about this being stateful vs not-stateful. What the
behind-the-scenes hard-linked kinda thing is basically modeled on a
state machine kinda thinking. (actually, more just because it's easy
to implement)

Mathematica does not use a “behind the scenes” lang model, but you can
assign vars in Mathematica and change them later in your program. The
Mathematica programing culture is more like “whatever works”, in this
sense a lot like Common Lisp as contrasted to Scheme lisp.

In Mathematica, if you want the behavior of hardlinking 2 behind-
the-scene things, just assign them to the same var... like this:

In[25]:=
data={1,2,3};
x := data;
y := data;

In[28]:=
data=Delete[data,2]

Out[28]=
{1,3}

In[29]:=
x

Out[29]=
{1,3}

In[30]:=
y

The “:=” is a syntax shortcut for “SetDelayed[]”, while “=” is “Set[]”.
The difference is that Set evaluates the rhs before assignment,
while SetDelayed evaluates the rhs only when it is called.
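[Editorial note: the Set/SetDelayed difference, eager versus delayed evaluation of the right-hand side, can be mimicked in Python by binding a value once versus binding a thunk that re-reads the variable at each use. This is an analogy only, not Mathematica semantics:]

```python
data = [1, 2, 3]

eager = list(data)      # like Set: the rhs is evaluated once, now
delayed = lambda: data  # like SetDelayed: the rhs is evaluated at each use

data = [1, 3]           # later, rebind the variable

print(eager)      # [1, 2, 3] -- captured the old value
print(delayed())  # [1, 3]    -- sees the current binding
```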

In my 2 decades of programing experience, with 5+ years of geometry
programing and 5+ years in web application development and unix sys
admin, expert at several languages, i don't think there is an instance
where i needed 2 things to be linked in some hard way. And, whenever a
lang supports such an idea, it usually just hampers the use of the
language. (one example is unix file system's hardlinks)

Basically, i think the concept of linking things is one of those easy-
to-implement paradigms that got rooted in the early time of
computing, when hardware (as well as software techs) was rather
primitive. Over time, these types of paradigms get shifted and
eliminated. (e.g. dynamic/lexical scope, pointers vs so-called garbage-
collection, primitive static typing (a la C, Java) vs dynamic typing
(lisp, perl, python, php, ruby, javascript ...on and on) or typing with
inference (haskell) or no types and mathematical types (Mathematica))

People unfamiliar with ultra high level languages such as Mathematica
(or perhaps haskell too), can understand this “what u see is what u
get” by comparing dynamic and lexical scope. Lexical scope, in a
sense, is of the model “wysiwyg”, while dynamic scope can be likened to
“behind the scenes happenings”. The behavior of Mathematica expressions
is wysiwyg, while lisp's expressions with their “objects” are “behind the
scenes happenings”. Also, the “behind the scenes” style is often the
more brainless of the two to implement; that's why it is everywhere. But
as i have repeated quite a few times, in computer langs throughout the
past 2 decades, we see a progress towards higher level, with less and
less of these behind-the-scenes lang-model kinda langs.

Kenny wrote:
「No beauty either way, just different.」

There is much beauty far beyond lisp, Kenny.

I get frustrated seeing how people truly are moronic in comparison to
me, having no idea what i'm talking about while i fully understand
what they are talking about. In the past few years i began to reckon
that i'm just a genius, whose views are maybe 30 years into the future
beyond the average computer scientists or industrial programers. (a
bit scary to say this myself, i know.)

In this particular thread, i claim that high level langs of the
future are getting rid of the “behind the scenes” lang model, as i have
repeatedly voiced in this thread. When i posted my first post, i
had some reservations, thinking that lisp's objects concept is
perhaps necessary in some way for the lang's area of applications. Now,
having seen the many replies and having thought more about it, i'm more
inclined to think it's simply another baggage of the 1970s computing era,
and probably can be eliminated without affecting the power of the lang
in any way.

--------------

I don't like to write short replies... but there are messages going on
in this thread that are going wild... i'll address a few shortly here.

John Thingstad is going wild stating that i confused “object” in OOP
with “lisp's objects”. I have absolutely no idea how he construed
that, honestly, absolutely no idea. In fact in one of my posts i
specifically, explicitly, stated that the 2 shouldn't be confused.
Possibly he replied to Rainer's writings thinking it was mine.

The Jon Harrop character, although very informative, tends to spew
taunting remarks selling his stuff. He's throwing in a few bad-mouthings
of Mathematica; i may or may not retort. (one about how Mathematica
doesn't have some beautiful static typing like his love OCaml, the other
has some technical merit about how mathematica's “list” isn't a “true
list” but a “vector”. (very computer scientist-moronic use of
jargons))

The Rainer Joswig character, is a silly one. He replied to my
messages, telling me to do some simple computations in Mathematica and
pointing out some Mathematica documentation to me, as to demonstrate
some of his points. Did he miss the fact that i'm the world's top
expert of Mathematica? Or is that not believed? If he believed, at
least, that i have like 10 years of experience of actually programing
in Mathematica, how could he think in such a way as if i didn't
understand some detail in the manual? So, to infer further, perhaps
any disagreement in our posts must be attributed to other causes than my
not understanding some aspect of Mathematica?

Rainer's posts, are often like this. Perhaps it's his personality or
way of communication or thinking pattern.

----

... sometimes i think perhaps i should be nicer to people or
something. Oh well, the depths and mysteries and questions of life and
living and cosmos...

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Don Geddis
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87hch4bfpn.fsf@geddis.org>
Xah Lee <···@xahlee.org> wrote on Wed, 23 Jan 2008:
> I get frustrated seeing how people truly are moronic in comparison to me,
> having no idea what i'm talking about while i fully understand what they
> are talking about.  In the past few years i began to reckon that i'm just a
> genius

I have another theory: perhaps other people are able to communicate complex
ideas clearly, whereas you think muddled thoughts and have very low
communication skills.

I wonder which theory is more likely to be true?

> Did he miss the fact that i'm the world's top expert of Mathematica?

Are you?  Can you demonstrate for me your superior knowledge of Mathematica
over, say, Stephen Wolfram?

Because I've also heard Wolfram claim that he understands Mathematica pretty
well.  I wonder if there's any way to determine whether it is you, or he, (or
maybe even someone else!) that is the "world's top expert of Mathematica".

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Cole's Law:  Thinly sliced cabbage.
From: George Sakkis
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <881ec8b2-889a-4429-86c2-be4b63c35935@s8g2000prg.googlegroups.com>
On Jan 23, 12:43 pm, Don Geddis <····@geddis.org> wrote:
> Xah Lee <····@xahlee.org> wrote on Wed, 23 Jan 2008:
>
> > I get frustrated seeing how people truly are moronic in comparison to me,
> > having no idea what i'm talking about while i fully understand what they
> > are talking about.  In the past few years i began to reckon that i'm just a
> > genius
>
> I have another theory: perhaps other people are able to communicate complex
> ideas clearly, whereas you think muddled thoughts and have very low
> communication skills.
>
> I wonder which theory is more likely to be true?

Are you new on Usenet? I'm always amazed when people try to
engage in a logical conversation with Xah Lee.

George
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <47978b01$0$11593$607ed4bc@cv.net>
Xah Lee wrote:
> Hi Kenny!

Xah!

> 
> Ken Tilton <···········@optonline.net> wrote:
> 
> 「If I say (eq (list 1)(list 1)) in Lisp I get nil, aka, nope, they are
> not EQ, EQ being the test for object identity. (They happen also not
> to be EQL, just EQUAL (the one that digs into structure)).
> 
> What happens in Mathematica?
> 」
> 
> In[5]:=
> Equal[List[1],List[1]]
> 
> Out[5]=
> True
> 
> This is the beauty of mathematica. Simple, easy to understand. What
> you see is what you get. No behind the scenes stuff.
> 
> 「The advantage to me in Lisp is that I can have two lists starting
> with the same textual expression and they will, well, be two different
> lists, not one.」
> 
> This is exactly the point of confusing, behind-the-scenes lang model,
> i mean. Depending on the language, they can get quite convoluted. (in
> particular, Java, with its access specifiers, constructors, objects,
> arrays, references etc.)

I do find things like C++ pretty scary in this respect, mostly because 
of that whacky syntax. :) I am just not sure mutable state is 
necessarily confusing. I suppose (but am prepared for rectification) you 
are OK with slots of structs and classes (does Mathematica have such 
things?) being mutable, such that a change to a slot of an instance is 
visible from any reference that might be held to that instance. And I 
surmise also you do not consider this magic nor even behind-the-scenes? 
If so (and again I understand you might not in fact agree so far), then 
all we have in Lisp is that our lists (and other things other than 
numbers and characters) also have object identity, so if there are 
multiple references to the same thing they all change together, like an 
in-memory database if you will.

Now if I am wrong and you do find mutable state in general confusing, 
well, I guess many agree with you, namely the entire FP community. And 
even I agree: the functional paradigm is a huge win for transparency and 
program correctness. I just never enslave my self to good principles 
like FP, I always reserve the right to do what I want, shucks, even use 
tagbody-go in moments of weakness. :)

> 
> 「I can mutate one without mutating the other, holding references to
> each in different places, and then see different lists when those two
> places are compared, or (the flip side) see "the same" list even when
> the contents of the list have changed. ie, I might have two Lisp
> "places" bound to the same list and want changes to be visible
> regardless of the binding.」
> 
> Note here that Mathematica language programing doesn't have any
> concepts of momeries, pointers, or references in any way. So, this
> means, you can't like change one entity and expect some other entity
> will also be changed mysteriously or magically.

Well, the c.l.l savages always ridicule me on this, but one /does/ have 
to master the cons to do well with Lisp. And I guess you are right, many 
a noob (all?) trips over Lisp lists early on. Hell, I have always 
maintained Lisp lists should be explained this way: there is no such 
thing as a list, there are just conses with cars and cdrs.
> 
> Remember, everything is textual, “what u see is what get”, thinking it
> as a formal mathematics system. (formal means FORMal. The word
> “formal” is a example of bad/mutated terminology that creates endless
> confusion)
> 
> Kenny wrote:
> 「Mind you, this all comes down to the question of state. Does
> Mathematica
> have that concept, or is it purely functional? (not that I am
> conversant
> with FPL either). If not, sure all list[1,2,3]s can be collapsed into
> one, no need for object identity, there it is, list[1,2,3]. But in
> Lisp
> we can have two variables bound to the same list:
> 
>   (setf x (setf y (list 1 2 3)))
> 
> ...and then (delete 2 y) and see the result in x because x is "/the
> same/ list as y", it is /not/ "the list of 1, 2, and 3".
> 
> Is that necessary? It is what it is, which is different than
> Mathematica
> it seems, but I do not see an interesting issue here.
> 」
> 
> Excellent example to illustrate the issue. I'd pick on your
> characterization about this being stateful vs not-stateful. What the
> behind-the-scenes hard-linked kinda thing is basically modeled on a
> state machine kinda thinking. (actually, more just because it's easy
> to implement)
> 
> Mathematica does not use “behing the scenes” lang model but you can
> assign vars in Mathematica and change it later in your program. The
> Mathemtica programing culture is more like “whatever works”, in this
> sense a lot like Common Lisp as contrasted to Scheme lisp.
> 
> In Mathematica, if you want the behavior as of hardlinking 2 behind-
> the-scene things, just assign them to the same var... like this:
> 
> In[25]:=
> data={1,2,3};
> x := data;
> y := data;
> 
> In[28]:=
> data=Delete[data,2]
> 
> Out[28]=
> {1,3}
> 
> In[29]:=
> x
> 
> Out[29]=
> {1,3}
> 
> In[30]:=
> y
> 

And?:

Out[30]=
{1,3}

I guess so, since it seems no diff than x. Can I go the other way, 
Delete[x,2] and see it back in data? Or has x become like a symbol-macro 
for data? Anyway...

> The “:=” is syntax shortcut for “SetDelayed[]”, while “=” is “Set[]”.
> The difference, is that the Set evaluate the rhs before assignment,
> while SetDelayed evaluate the rhs only when it is called.

And once it is called? What if you had evaluated x /first/ and seen 
1,2,3 and /then/ deleted 2. Would a second evaluation of x still return 
1,2,3? If it returns 1,3, then := in a sense gives one the same thing as 
object identity, except (and this may be an advantage) if I want to 
achieve this behavior I must remember to use :=. ie, if I now want to 
have z refer to the same list, I must say z := x.

> 
>  In my 2 decades of progaming experiences, with 5+ years of geometry
> programing and 5+ years in web application development and unix sys
> admin, expert at several languages, i don't think there is a instance
> where i needed 2 things to be linked in some hard way.

I do it all the time. Lists are a very powerful data structure and with 
a full suite of list-processing functions such as CL has one can go 
quite far with them without getting into structs and classes. This works 
especially well inside functions where it is not all that hard to keep 
track of what is in each position in a list one dashes off on the fly.

> And, whenever a
> lang support such a idea, it usually just hamper the use of the
> language. (one example is unix file system's hardlinks)

But if we cannot mutate lists we lose expressive power. Lisp programs 
that always copy structure cannot scale beyond toy size. So mutable 
state is not there just to confuse people, it extends the language (so, 
yes, there is more to be learned).
> 
> Basically, i think the concept of linking things is one of those easy-
> to-implement paradigm that just gets rooted in the early time of
> computing, when hardware (as well as software techs) are rather
> primitive. Over time, these types of paradigms gets shifted and
> eliminated. (e.g. dynamic/lexical scope, pointers vs so-called garbage-
> collection, primitive statics typing (a la C, Java) vs dynamic typing
> (lisp,perl,python,php, ruby, javascript ...on and on) or typing with
> inference (haskell) or no types and mathematical types (Mathematica))
> 
> People unfamiliar with ultra high level languages such as Mathematica
> (or perhaps haskell too), can understand this “what u see is what u
> get” by comparing to dynamic and lexical scope. Lexical scope, in a
> sense, is of the model “wysiwyg”, while dynamic can be likened to
> “behind the scenes happenings”.

OK, but if one thinks of dynamic scope as up the stack instead of behind 
the scenes, it is not so scary. Mind you, this means using special 
variables with discipline, only ever binding them in let clauses.
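[Editorial note: that "bind specials only in let clauses" discipline can be imitated in Python with a context manager that saves and restores a global, which is roughly what dynamic binding of a special variable does up the stack. The names here are illustrative, not from any library:]

```python
import contextlib

VERBOSE = False  # plays the role of a special variable

@contextlib.contextmanager
def binding(name, value):
    """Temporarily rebind a module-level 'special', restoring on exit."""
    saved = globals()[name]
    globals()[name] = value
    try:
        yield
    finally:
        globals()[name] = saved

def report():
    # Sees whatever binding is in effect in its callers, dynamically.
    return "verbose" if VERBOSE else "quiet"

print(report())         # quiet
with binding("VERBOSE", True):
    print(report())     # verbose -- binding visible down the call chain
print(report())         # quiet   -- restored on exit, like exiting a LET
```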

> The behavior of Mathematica expression
> is wysiwyg, while lisp's expressions with its “objects” is “behind the
> scenes happenings”. Also, the “behind the scenes” style are often the
> brainless of the two to implement, that's why they are everywhere. But
> as i repeated quite a few times, as computer langs throughout the past
> 2 decades, we see a progress towards higher level with less and less
> these behind-the-scenese lang model kinda langs.

I wish I knew more about FP, it sounds like you would prefer that to the 
extra mental housekeeping we Lispers take for granted (and sometimes 
screw up).

> 
> Kenny wrote:
> 「No beauty either way, just different.」
> 
> There is much beauty far beyond lisp, Kenny.

You are right, it has been too long since I dropped this URL:

    http://www.tilton-technology.com/cello-shot-06.jpg

> 
> I get frustrated seeing how people truly are moronic in comparison to
> me, having no idea what i'm talking about while i fully understand
> what they are talking about. In the past few years i began to reckon
> that i'm just a genius, whose views are maybe 30 years into the future
> beyond the average computer scientists or industrial programers. (a
> bit scary to say this myself, i know.)

I have not read much of the thread, but you are not going to change the 
audience, all you can change is Xah. (No, it is not easy changing 
oneself, either, but unlike changing others it is remotely possible.) 
The change you can make is simply to continue only those subthreads you 
find worthwhile, which may well be the null set. I think you already do 
this with the hysterical Greek chorus that always screeches "Troll!" at 
you, just do the same with exchanges you do not find intellectually 
stimulating.

> 
> In this particular thread, i claim, that high level langs of the
> future is getting rid of the “behind the scenes” lang model, as i have
> repeatedly voiced it in this thread. When i posted my first post, i
> had some reservations, thinking that lisp's objects concept are
> perhaps necessary in some way for the lang's area of applications. Now
> having seen the many replies and have thought more about it, i more
> inclined to think it's simply another baggage of 1970s computing era,
> and probably can be eliminated without effecting the power of the lang
> in any way.

No, mutable lists rock and are one of the great bits of genius behind 
Lisp, the idea that the singly-linked list is an incredibly versatile 
data structure. To have that and all the list-processing functions just 
sitting there at one's fingertips while dashing off little algorithms is 
simply huge, explaining Greenspun's Tenth (the replication of Lisp in 
other HLLs).

> 
> --------------
> 
> I don't like to write short replies... but there are messages going on
> in this thread that's going wild... i'll address a few shortly here.
> 
> John Thingstad is going wild stating that i confused “object” in OOP
> with “lisp's objects”. I have absolutely no idea how he construed
> that, honestly, absolutely no idea. In fact in one of my post i
> specifically, explicitly, stated that the 2 shouldn't be confused.
> Possibly he replied to Rainer's writings thinking it was mine.
> 
> The Jon Harrop character, although very informative, tends to spew
> taunting remarks selling his stuff. He's throwing a few bad mouthing
> of Mathematica, i may or may not retort. (one about how Mathematica
> doesn't have some beautiful static typing as his love OCaml, the other
> has some technical merits about how mathematica's “list” isnt “true
> lists” but “vectors”. (very computer scientist-moronic, use of
> jargons))
> 
> The Rainer Joswig character, is a silly one. He replied to my
> messages, telling me to do some simple computations in Mathematica and
> pointing out some Mathematica documentation to me, as to demonstrate
> some of his points. Did he miss the fact that i'm the world's top
> expert of Mathematica? Or is that not believed? If he belived, at
> least, that i have like 10 years of experience of actually programing
> in Mathematica, how could he think in such a way as if i didn't
> understand some detail in the manual? So, to infer further, perhaps
> any disagreement in our post must be attribute to other causes than my
> not understanding some aspect of Mathematica?
> 
> Rainer's posts, are often like this. Perhaps it's his personality or
> way of communication or thinking pattern.

If it helps, what I notice is the character-orientation of your summary. 
e.g., What happens if my last sentence instead was, "That summary is 
character-oriented?" You and I just disappeared, and the chance of 
interpersonal conflict diminishes. On those rare occasions when I am not 
actively making trouble here, I sometimes go back and rewrite so all the 
pronouns disappear. Fascinating exercise.

> ... sometimes i think perhaps i should be nicer to people or
> something. 

Ironically, that is hard for you because you are /too/ social an animal. 
The sociopath feels no connection to others and so can charm them into 
doing anything. Your thin skin (a great metaphor, btw) leaves you no way 
to keep people out, no way to shrug off those whose e-company you do not 
enjoy. In a sense, by reacting strongly to others you make their 
problems your problem, in that you cannot be comfortable as long as they 
are jerks. Not a recipe for contentment.

There are billions of people on this planet, put your energy into those 
you enjoy, not so much into those you do not. When your will weakens, 
well, hey, what's a killfile for? :)

> Oh well, the depths and mysteries and questions of life and
> living and cosmos...

You worried above about being a genius, here's something you might like. 
I got the idea from a bumper sticker, so you know it's good: "If you 
think you can do anything, try sailing." Made relevant, nothing is 
harder or more rewarding for folks like us than getting along with 
others. So if you think you are a genius, Xah, figure out how to get 
along with folks on Usenet. I'll give you a headstart from your roots: 
"Win without fighting."

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: Frank Buss
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <cp6whvbslj50$.sv5m0bw80ink.dlg@40tude.net>
Ken Tilton wrote:

> Mind you, this all comes down to the question of state. Does Mathematica
> have that concept, or is it purely functional? (not that I am conversant
> with FPL either). 

I'm not a Mathematica expert, but sometimes it is useful for solving
equations etc. If you hit F1 on "List" in Mathematica, the help screen says:

| Lists are very general objects that represent collections of expressions. 

This means Mathematica has objects and a list is an object. And you can
modify lists in Mathematica without modifying other lists with the same
elements, which would be really strange. Functional languages like Haskell
don't allow modifying lists (with some exceptions, and internally some
operations may be implemented in an efficient modifying way), which makes
it easier to write more bug-free code, if you think long enough to find a
good functional solution, which is sometimes more difficult for me than
writing a non-functional solution.


A Mathematica notebook:


In[1]:=x=List[4,5,6]
Out[1]={4,5,6}

In[2]:=y=List[4,5,6]
Out[2]={4,5,6}

In[3]:=x[[2]]
Out[3]=5

In[4]:=x[[2]]=9
Out[4]=9

In[5]:=x
Out[5]={4,9,6}

In[6]:=y
Out[6]={4,5,6}


The same in Lisp:


CL-USER 1 > (defparameter *x* (list 4 5 6))
*X*

CL-USER 2 > (defparameter *y* (list 4 5 6))
*Y*

CL-USER 3 > (elt *x* 1)
5

CL-USER 4 > (setf (elt *x* 1) 9)
9

CL-USER 5 > *x*
(4 9 6)

CL-USER 6 > *y*
(4 5 6)

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <479766e5$0$11555$607ed4bc@cv.net>
Frank Buss wrote:
> Ken Tilton wrote:
> 
> 
>>Mind you, this all comes down to the question of state. Does Mathematica
>>have that concept, or is it purely functional? (not that I am conversant
>>with FPL either). 
> 
> 
> I'm not a Mathematica expert, but sometimes it is useful for solving
> equations etc. If you hit F1 on "List" in Mathematic, the help screen says:
> 
> | Lists are very general objects that represent collections of expressions. 
> 
> This means, Mathematica has objects and a list is an object. And you can
> modify lists in Mathematica, without modifying other lists with the same
> elements, which would be really strange. Functional languages like Haskell
> doesn't allow to modify lists (with some exceptions and internally some
> operations may be implemented in an efficient modifying way), which makes
> it easier to write more bug-free code, if you think long enough to find a
> good functional solution, which is more difficult for me sometimes than
> writing a non-functional solution.
> 
> 
> A Mathematica notebook:
> 
> 
> In[1]:=x=List[4,5,6]
> Out[1]={4,5,6}
> 
> In[2]:=y=List[4,5,6]
> Out[2]={4,5,6}
> 
> In[3]:=x[[2]]
> Out[3]=5
> 
> In[4]:=x[[2]]=9
> Out[4]=9
> 
> In[5]:=x
> Out[5]={4,9,6}
> 
> In[6]:=y
> Out[6]={4,5,6}
> 
> 
> The same in Lisp:
> 
> 
> CL-USER 1 > (defparameter *x* (list 4 5 6))
> *X*
> 
> CL-USER 2 > (defparameter *y* (list 4 5 6))
> *Y*
> 
> CL-USER 3 > (elt *x* 1)
> 5
> 
> CL-USER 4 > (setf (elt *x* 1) 9)
> 9
> 
> CL-USER 5 > *x*
> (4 9 6)
> 
> CL-USER 6 > *y*
> (4 5 6)
> 

Haha, sure, it is the same, but is it the same same? Lisp's is nothing 
to take lightly:

    file:///C:/Program%20Files/acl81/ansicl/glossary/s.htm#symbol

But I digress. Your example does not demonstrate sameness until we also see:

 > In[1]:=x=List[4,5,6]
 > Out[1]={4,5,6}
 >
 > In[2]:=y=x  <- bingo
 > Out[2]={4,5,6}
 >
 > In[3]:=x[[2]]
 > Out[3]=5
 >
 > In[4]:=x[[2]]=9
 > Out[4]=9
 >
 > In[5]:=x
 > Out[5]={4,9,6}
 >
 > In[6]:=y
 > Out[6]={4,9,6} <- bongo
 >

Can do? Bingo bongo, I mean. :)

kt


-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: Frank Buss
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <9ygm0ba7g8q6.14jkk3zw5y3hx.dlg@40tude.net>
Ken Tilton wrote:

>  > In[2]:=y=x  <- bingo
>  > Out[2]={4,5,6}

Looks like this does a deep copy in Mathematica.

>  > In[6]:=y
>  > Out[6]={4,9,6} <- bongo

Mathematica prints {4,5,6} at this point. Maybe this is what Xah Lee means.

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Kaz Kylheku
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <98549142-729e-4787-94e0-77a2fc4f65dd@m34g2000hsb.googlegroups.com>
On Jan 22, 1:50 am, Xah Lee <····@xahlee.org> wrote:
> In Mathematica, there's no such steps of concept. The list
> "List[1,2,3]" is just a list as is. There is no such intermediate
> concept of objects. Everything in Mathematica is just a "expression",
> meaning the texual code you type.

If this concept is only described in the Mathematica documentation
then an expression is not really /in/ Mathematica.

It's just something that you type which becomes a behavior in
Mathematica, but otherwise has no (programmer-accessible)
representation.

I.e. List[1, 2, 3] causes something to happen, but is itself not an
entity.

According to you, anyway. If you understand Mathematica as well as you
do Lisp, then I'd take all that with a grain of salt. Maybe there is a
way to quote List[1, 2, 3] to get it as data, only you haven't figured
it out.

> This is a high-level beauty of Mathematica.

Not giving the programmers access to the data structure which
represents the program source is simply a myopic limitation. A myopic
limitation doesn't give rise to a higher level language.

> Java, Haskell and lisp too), then is lisp's internal concept "object"
> necessary in a language such as lisp?
>
> Does anyone know?

You need the source code as an object if you want to extend the
compiler with user-defined code to handle new syntax by transforming
it to existing syntax (e.g. Lisp macros).

Note that in compiled Lisp systems, the object representation of a
program can ``go away''. It's needed during the source code processing
steps, including macro expansion. When all of that is done, the
representation of a program becomes machine language. Of course, any
literal constants in the program, including quoted pieces of its list
structure, become run-time data.

So if we have:

(defun foo ()
  '(1 2 3))

when we compile it, the list (defun foo () '(1 2 3)) goes away; it is
not necessary to retain it in the compiled image. The list (1 2 3)
however will be represented somewhere in that image as static data,
just like a numeric or string literal. The function FOO will be
translated to some little piece of machine code which calculates the
effective address of that data (i.e. the reference to the first cons
cell of (1 2 3)) and returns it.

This is analogous to the C language:

  /* this ``while'' appears in the compiled image */
  char *str = "while";

  /* this ``while'' doesn't */
  while (1) { /* ... */ }

One is quoted to become literal data, one isn't.
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <73afc3f9-898c-4f20-a8ff-c85ccd4952c0@s27g2000prg.googlegroups.com>
This reply is long, at over 5 thousand words. It is separated into 3
posts.

-------------

Kenny wrote:
«I do find things like C++ pretty scary in this respect, mostly
because of that whacky syntax. :) I am just not sure mutable state is
necessarily confusing. I suppose (but am prepared for rectification)
you are OK with slots of structs and classes (does Mathematica have
such things?)»

struct as in C's “record” data type? Classes as in Java? Mathematica
doesn't have these things. For Class, it depends on what you mean. As a
language's data type, no. As a function that contains data and inner
functions, which one can implement in just about any well designed lang, yes.

Kenny wrote:
«being mutable, such that a change to a slot of an instance is visible
from any reference that might be held to that instance. And I surmise
also you do not consider this magic nor even behind-the-scenes?  If so
(and again I understand you might not in fact agree so far), then all
we have in Lisp is that our lists (and other things other than numbers
and characters) also have object identity, so if there are multiple
references to the same thing they all change together, like an in-
memory database if you will. »

As i repeated many times now, programing in Mathematica does not have
any concept of memory, reference, or pointer, WHATSOEVER. There is NO
behind-the-scenes memory, object, or abstract storage location that
some other entity “references”.

As i repeated many times, one can think of Mathematica as graph-
rewriting, term-rewriting, symbol-sequence transformation, or a formal
system. Where, the formal here means mathematical _formalism_ (lookup
in wikipedia, folks!). Or, think of it as lisp's concept of “forms”
that are used everywhere in emacs lisp documentation, it is
essentially the principal concept one needs to understand for lisp's
macros. So, practically speaking, one could think of the Mathematica
language as if it were programing in lisp using macros and nothing but
macros. (So, if you think of it this way, you'll see what i mean by no
“behind the scenes”, a wysiwyg model of language. I repeat, think of it
as pure lisp macro and form programing. _Textual_. No “behind the
scenes” stuff! WHAT YOU SEE IS WHAT YOU GET!)

Of course, despite my profuse talking, i guess it is nigh impossible
to convey the idea unless one actually starts to program or spends a
few hours in Mathematica. In analogy, imagine yourself trying to preach
the gospel of lisp to Java programers. You can spend hours and a few
thousand words on some concept that's crystal clear and basic to
you, but the Java guy wouldn't be able to comprehend it. (and if the
Java person happens to be a hotshot or has years of industrial
experience, or if you happen to not kiss their asses (why should
you?), he'll deem you a fucking troll)

The thought has come to my mind to write a few pages of tutorial on
mathematica for lisp programers. Written in a way such that every
mathematica construct has an exact analogous counterpart in lisp, in
both syntax and semantics, so that the tutorial actually also
functions as a fun-oriented lisp programing excursion. (in fact, if
you are a lisp expert, you can consider yourself already half of a
Mathematica expert. (lisp's major concepts of fully functional nested
syntax, forms, macros, lambda, lists, and function application are just
about the same set of fundamental ideas in Mathematica, almost token
for token identical except for a few syntax quirks, or when we get into
the hairy stuff of the evaluation model, exhibited in lisp as “quote”,
“eval” ... or in Mathematica as Hold[], HoldForm[], Evaluate[],
ReleaseHold[]...) For any seasoned lisper, if you actually just spend 1
month studying Mathematica, you can boast 1 year of Mathematica
experience in your resume. And if you code Mathematica for 1 year, you
can boast being a Mathematica expert. (in this regard of Lisp/
Mathematica, it's the same with Perl/PHP, Scheme/Common Lisp, and
probably similar with Java/C#, due to their intimacies))

In many of my posts about lisp here in 2007, i have often hoped that my
discussion may lead to a discourse on the hows and technicalities of
writing a lisp extension that emulates Mathematica. (am ambivalent
about leading in this direction since, if successful, it will piss off
Stephen Wolfram (robbing his business), possibly entail legal issues,
and i myself am not ready to gratuitously offer my services to the lisp
community, given the way lisper MORONS reacted to my criticisms and
amidst the fucking militant OpenSource collectivism ambiance)

Lisp could actually do it relatively easily: emulating the full system
of Mathematica's list/tree extraction/manipulation functions (getting
rid of the cons problem for good), as well as getting rid of the lisp
“behind the scenes” object concept with some practical benefits; a
fully automatic code formatter (beyond emacs's current offerings) that
gets rid of the endless manual “how you should indent parens” once and
for all (as in the benefit transparently offered by Python); and also
layering on top of sexp an infix notation while maintaining absolutely
the full sexp benefits; and even more, if lisp experts are interested
in spending quite a lot of energy, going as far as making lisp a full
2D mathematical notation display system, and a notebook system (think
of it as a language feature so powerful that the SOURCE CODE actually
transparently functions like a Word Processor file format that can
contain structures and images).

But before any of the above discussions can sprout, they are already
shot down by the army of sensitive and clueless lisper morons. Guarding
their ways in high anxiety, dismissing the FREQUENT and PERSISTENT
criticisms of lisp (many of these criticisms are from renowned lisp
luminaries), and kept thinking somehow that a language designed 40
years ago is destined to be perpetually perfect. (meanwhile grudgingly
witnessing the fact that relatively fucking stupid langs like Perl,
Python, PHP, Java, Ruby were born and rose beyond lisp's combined
applications in its entire history) Many old lispers still fancy that
lisp is the one unique, all-powerful, all-living creature that rules
all languages.

This is a social and psychological problem of lisp, mainly caused by
the old timers. Besides fixing problems of the lisp language proper,
other problems and possible solutions, such as the frequent complaints
about its libraries, are similarly thwarted and smitten.

Whenever there's a new lisp, such as Dylan or the vapourware Arc,
there's a high sentiment of hate-love brouhaha. All Common lispers are
anxious and giddy, throwing mixed remarks of excitement and
belittlement. The Scheme lispers live in a fantasy world of their own,
simultaneously wishing for popularity and aloofness, culminating in
the R6RS fuckup in late 2007. (if my chances of becoming a Common
Lisper anytime soon are 1%, my chances of becoming a Scheme lisper are
0.01%, by sheer will.)

(this post continues in the next post)

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <bea5b67b-9a45-441b-a5a8-8d4d2dd63a0a@u10g2000prn.googlegroups.com>
(this post is the 2nd part, a continuation of the previous one)

Kenny wrote:
«Now if I am wrong and you do find mutable state in general confusing,
well, I guess many agree with you, namely the entire FP community. And
even I agree: the functional paradigm is a huge win for transparency
and program correctness. I just never enslave my self to good
principles like FP, I always reserve the right to do what I want,
shucks, even use tagbody-go in moments of weakness. :)»

KENNY!! IT'S NOT A QUESTION OF MUTABILITY! Mutability — another
computer “scientist-moron” jargon. What do you mean by mutability? As i
have repeatedly explained in different contexts in posts here, opaque
jargons beget confusion. By mutability, do you mean ultimately the
ability to reference the same memory address, exhibited in lisp as
lisp objects? By mutability do you mean Haskell's prevention of a
variable being assigned twice?

As i tried to explain in a previous message... regarding Mathematica,
if by mutability you mean pointers/references/behind-the-scenes
objects, then no, Mathematica doesn't have any such notions. If by
mutability you mean whether a variable can have its value changed,
then yes, Mathematica does.

The “mutable/mutability/immutable” is another fucking jargon that
refuses to die, precisely because it's a fucking moronic, imprecise and
gaudy jargon. It has different meanings as applied to data types,
languages, language models, and computational models. Every tech geeker
latches his own conception onto it and brandishes it to no end.

Note here, as i have also repeated in different posts in the past,
that among Mathematica's over one thousand pages of manual, containing
fucking over one thousand built-in functions, with maybe a few hundred
advanced mathematical functions that only a handful of mathematicians
in the world understand the meaning of, and containing all the fucking
fancy constructs exhibited by jargons such as lambda, HOF (higher
order function), fucking first class citizens, fucking fuck you
“closures”, mutabilatilabilalilaty, M'm'mmmmacros, the over one
thousand pages of the Mathematica manual mention none of these fucking
shit terms and are extremely easy to read and understand. (i happened
to have read the over one thousand pages of the mathematica manual
word for word from cover to cover 3 times during the 1990s, in
different years)

Kenny wrote:
«Well, the c.l.l savages always ridicule me on this, but one /does/
have to master the cons to do well with Lisp. And I guess you are
right, many a noob (all?) trips over Lisp lists early on. Hell, I have
always maintained Lisp lists should be explained this way: there is no
such thing as a list, there are just conses with cars and cdrs.»

Exactly. When will these lispers die?

When?

------------------------------------

In[25]:=
   data={1,2,3};
   x := data;
   y := data;

In[28]:=
   data=Delete[data,2]
Out[28]=
   {1,3}

In[29]:=
   x
Out[29]=
   {1,3}

In[30]:=
   y
Out[30]=
   {1,3}

Kenny wrote:
«Can I go the other way, Delete[x,2] and see it back in data?»

No, not in the above. Remember, in Mathematica, everything is textual,
or copies. All functions return copies. There's no such thing as some
“object” that lives in some ether space which, if changed, affects
other entities “pointing/referencing” to it. There is no concept of
reference/pointer/memory whatsoever. Again, think of programing only
using lisp's macros. It's all about form changing. What you see is
what you get. WYSIWYG.

Kenny wrote:
«Or has x become like a symbol-macro for data?»

Yes. If i understand you correctly, yes.

Xah wrote:
「The “:=” is a syntax shortcut for “SetDelayed[]”, while “=” is
“Set[]”.  The difference is that Set evaluates the rhs before
assignment, while SetDelayed evaluates the rhs only when it is called.」

Kenny wrote:
«And once it is called? What if you had evaluated x /first/ and seen
1,2,3 and /then/ deleted 2. Would a second evaluation of x still
return 1,2,3? If it returns 1,2, then := in a sense gives one the same
thing as object identity, except (and this may be an advantage, if I
want to achieve this behavior I must remember to use :=. ie, if I now
want to have z refer to the same list, I must say z := x.»

Yes.
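A rough Python model (my own analogy; the variable names are made up for illustration) of the Set vs SetDelayed distinction: “=” captures the value once, while “:=” behaves like a thunk that is re-evaluated on each use.

```python
# Hypothetical model: Set ("=") evaluates the rhs once; SetDelayed (":=")
# re-evaluates the rhs each time the symbol is used.
data = [1, 2, 3]

x_set = list(data)         # like x = data: value captured now
y_delayed = lambda: data   # like y := data: evaluated on demand

data = [1, 3]              # like data = Delete[data, 2]

assert x_set == [1, 2, 3]      # Set saw the old value
assert y_delayed() == [1, 3]   # SetDelayed sees the current binding
```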

Again, if a programer actually wants the behavior of certain things
linked together, you can do it just as one can in any well-designed
lang. (For example, in lisp, you can program the hard-link behavior
without resorting to lisp's default object concepts.) The concept of
linked behavior, as exhibited by langs that come with the “behind-the-
scenes” “language model”, is essentially having the variables defined
in terms of another variable, since the hard-link concept is basically
just references pointing to the same memory address.

I'd be happy to show how it can be done in Mathematica if you provide
lisp code of certain linked behavior you want.

-------------------------------------

The thing to understand here is high-level-ness. It seems to me,
programers accustomed to compiled languages, as well as lispers, have
an extremely difficult time understanding the concept of a computer
language as a textual interface to computation. In particular, compiler
writers, or those acquainted with compiler knowledge, are spectacularly
incapable of understanding this because their brains have been
permanently damaged by low-level exposure. (in this thread, that Kaz
Kylheku guy is the best example, voicing his dips-and-jumpers mumbo
jumbo into this thread. (if we switch the underlying hardware to DNA
or quantum qubits, his dips and jumpers go haywire))

Kenny wrote:
«But if we cannot mutate lists we lose expressive power. Lisp programs
that always copy structure cannot scale beyond toy size. So mutable
state is not there just to confuse people, it extends the language
(so, yes, there is more to be learned).»

I object to your use of the term mutable here, as discussed above.
However, if i understand you correctly... here's the issue.

My issue here is the complete separation of the language as an
interface to computation, from elements in the language that are
hints/declarations/helpers for the current technology of compilers to
achieve speed, or that address certain low level issues (such as
reference to memory, number size in bits), which are necessary when
the language is used in certain applications (such as writing an OS,
compiler, or device driver) that by definition need direct access to
hardware elements.

As i've tried to explain above, the behavior of linked things (which
you are calling “mutability” here) is essentially having 2 vars
pointing to another var that holds the value. As i tried to explain,
this can be easily done in any well-designed lang, without the
language having a behind-the-scenes reference/pointer/object model.
(how exactly it is done depends on the language, of course) As stated
before, if you give a code example in lisp that shows some linked
behavior, i can provide the same behavior coded in Mathematica, or i
can even try to code the same behavior in emacs lisp but implemented
without utilizing what you'd call the “mutability” concept (what i'd
call the behind-the-scenes language model, or pointers/references/
“memory addr”/“lisp's objects”).

Yet another way to see this is to focus on the _behavior_ of
computation as a function of the language's textual input (the source
code). Put yet another way, the idiosyncratic behind-the-scenes
language models of Lisp/Java/C all have no mathematically meaningful
utility. They are by-products of computer engineering constrained by
today's hardware, often necessary due to practical, speed issues. (and
in the case of C and Java or most imperative langs, it's mostly
because it's brainless to implement)

Kenny wrote:
«OK, but if one thinks of dynamic scope as up the stack instead of
behind the scenes, it is not so scary. Mind you, this means using
special variables with discipline, only ever binding them in let
clauses.»

Nah.

First of all, please don't use the word stack. I don't know what the
fuck a stack is, and i know it is mathematically meaningless. What i
do know is that one can insert ping pong balls into a pussy, and the
last one inserted will come out first. Now, that is mathematics.

Dynamic scope is another stupidity, just like the behind-the-scenes
things i've spoken of above, in that it has no meaningful mathematical
value. Its existence is just a by-product of computer engineering
constrained by computer hardware (in this case, the hardware of a few
decades ago). As a “proof” of this, you do realize that dynamic scope
has basically disappeared in the langs of the past 2 decades.

Of course, as a review of computing history, we note that the behavior
of the fortuitous “dynamic scope” of the past is actually sometimes
desirable. And it can be easily implemented in any well designed lang,
without the lang itself having behind-the-scenes stuff. Exactly how
this can be implemented on top depends on the lang, of course.

Again, let me emphasize, the issue here is a clear separation of what
is mathematically meaningful in a language from what are practical,
speed, or hardware issues. If one wants to design a language that's
not just high-level like Mathematica, but also as speedy as C, then as
i have written in long essays before, this should be done by having
special declarative/hint constructs that clearly indicate to the user
of the language that such is for the compiler, having nothing to do
with his computation.

Kenny wrote:
«I wish I knew more about FP, it sounds like you would prefer that to
the extra mental housekeeping we Lispers take for granted (and
sometimes screw up).»

I think i'm even averse to the jargon Functional Programming. It
smacks of a certain elitism and generates confusion. So-called
functional programing is practically just defining your functions so
they don't call global vars. Of course, in a stricter sense, it
entails all that lambda, currying, recursion, higher-order-functions,
function sequencing aka filters or streaming, function composition,
“referential transparency”, monads (whatever that is) ... and all
that good shit.

... now a bit rambling about jargons:

Jargons, first of all, we should understand, are terms that arise in a
particular field, coined by its professionals for the practical
purpose of communication. More specifically, the need is usually for a
shorthand for complex ideas, and a way to make an idea precise. As i
have said profusely in my various essays, jargon, being a part of
language, which is a social element of human animals, also serves a
social function of class differentiation (think of how the way you
dress represents who you are; and your diction and accent give away
who you are; and human animals in general go out of their way to
dress/speak in hopes to impress, hide, or be themselves). Jargons, in
this social role, tend to become abused, in the sense that often
jargons are thrown around by people without actually intending to use
them for their original, professional purpose of ease in communicating
specialized ideas. More specifically, jargons are thrown around by
people who neither understand what they mean nor intend to discuss any
issue. (see the many literary, highbrow words i used in this post?
They are not really there for the purpose of clear communication. They
are there to tell my detractors that they are fucking morons in
comparison to me in the department of english writing. Too.)

Now, a jargon, or the naming of things, has a quality aspect. That is,
as professionals in some field, we can judge the quality of a jargon
according to the degree the term communicates its meaning, or by other
criteria such as terseness, familiarity, religious/cultural issues,
and so on. For example, there are standardized jargons in almost all
professional fields with published tomes, and many times how a new
concept or thing is named is much debated for its merits before it is
actually standardized or adopted by some standardizing organization in
the field. (think of jargons in law, physics, mathematics, chemistry,
astronomy, biology, psychology, literature, linguistics ... and in
general any science. (try to come up with examples of jargons in each
of these fields yourself))

Now, let's consider the jargon “functional programing” a bit. The
jargon “functional programing”, in my opinion, is a jargon of good
quality. It communicates its meaning quite well, based on the
similarity of math concept of function. Not too long, culturally ok,
not offensive ..., using words that are commonly understood (sometimes
this is a good thing, sometimes not) ...

Now, let's consider the jargon “functional programing” in the social,
class-differentiation context. When the jargon FP is abused, in what
way is it abused? What are some of the particular effects or
ramifications of its abuse? I don't have some elaborate analysis of
this... but one observation i can think of is that it confounds the
very essence of the idea of functional programing, namely that of
coding subroutines such that they don't use global vars. So,
especially in tech geeker groups as in comp.lang.*, it often gives the
impression to laymen (imperative programers) that it is something very
complex, involving a bunch of “advanced” and esoteric concepts of
math. Essentially, distorting its essential meaning and generating
non-understanding of something that is otherwise extremely easy to
understand.

(again, i note here that the Mathematica language's one-thousand pages
of manual do not emphasize or give the reader any impression of the
jargon “functional programing” at all. (am not sure if the term is
actually found in the manual other than in an intro section showing
briefly how Mathematica can do various styles of programing) This is
an extreme beauty and savviness of Stephen Wolfram. The effect of this
proper technical writing avoids the problem of programers
sensationally discussing things that are divisive of programing (i.e.
writing specifications for a desired computation). In contrast, the
worst shit of jargon abuse and its damaging effects happens in the
Haskell, Scheme, Perl, and Python documentations. The Perl and Python
docs are in a different class of stupidity than the Haskell and Scheme
ones. The Haskell and Scheme people actually understand the meaning of
the jargons they use, and they use them properly, only not
understanding their social aspects. Whereas the Perl and Python morons
do not understand the technical aspects of the jargons they use, but
are also so fucking stupid as to litter them profusely in their docs)

Kenny wrote:
«You are right, it has been too long since I dropped this URL:
    http://www.tilton-technology.com/cello-shot-06.jpg
»

Great. :) Can the chick be naked?

I have:
http://xahlee.org/PageTwo_dir/Personal_dir/porn_movies.html

Kenny wrote:
«No, mutable lists rock and are one of the great bits of genius behind
Lisp, the idea that the singly-linked list is an incredibly versatile
data structure. To have that and all the list-processing functions
just sitting there at ones fingertips while dashing off little
algorithms is simply huge, explaining Greenspun's Tenth (the
replication of Lisp in other HLLs).»

O my god Kenny. By now i hope you understand my idea of the separation
of computer language issues into (1) language elements that have to do
with constructing computation and algorithms, and (2) language
elements that are by-products of computer engineering constrained by
hardware technologies; mathematically valueless but necessary anyway
for practical, speed, or hardware-device-controlling aspects.

If you understand the above, you'll see that the “linked list” belongs
to the second category. In other words, it is a mathematically
meaningless concept as far as computation is concerned.

And, if you understand the above, then the linked list as exhibited in
lisp's cons cells is detrimental and hampers the programer's job of
using the language to specify computations. To you, a lisper, you
might not see this, and think the cons great, because the lang's
technical details are ingrained in your mind. But let me hold up a
mirror: suppose i present to you the idiosyncratic, behind-the-scenes
language model thingamajig used in the Java or C or C++ language; you
would probably be very confounded by it, and find it quite a fetter
and distraction in specifying the computations you have in your mind.

--------------------------

In the above, i have expressed the idea of separation of the
mathematically meaningful elements of a computer language from
elements in the language that are necessitated by hardware and speed
constraints.

Stephen Wolfram — a certified genius — resorted to not words but deeds
and created Mathematica some 20 years ago, based on this principle. (i
do not know whether he actually, explicitly thought in this way, nor
whether this was the dominant principle he had in mind when creating
Mathematica. However, it is my ascription that Mathematica the lang is
a lang based on the above principle. Further, Stephen is fully aware
of this principle, and this is exhibited in a lot of places in his
writings. (side note: there are other langs based on this principle;
also, i'm not the only person who has this idea. This idea of a lang
being a pure tool for specification of computation, without computer-
engineering-hardware issues, is certainly not new. However, it is
probably the case that 99.99% of computer scientists who work on
languages have no idea of it.))

I hope i have made this idea clear. A recent essay on this is at:

 “Jargons And High Level Languages”
 http://xahlee.org/emacs/jargons_high_level_lang.html

(and read the related essays linked at the bottom if you want to know
more)

Is there any lispers who are still very confused and don't know what
i'm talking about? Raise your hands please, so i can kick you in the
ass.

(this message continues in the next post)

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <1b361847-a752-4c8e-8ab5-6db44932eb48@i72g2000hsd.googlegroups.com>
(this post is the 3rd part, a continuation of the previous one)

Kenny wrote:
«If it helps, what I notice is the character-orientation of your
summary.  eg, What happens if my last sentence instead was, "That
summary is character-oriented?" You and I just disappeared, and the
chance of interpersonal conflict diminishes. On those rare occasions
when I am not actively making trouble here, I sometimes go back and
rewrite so all the pronouns disappear. Fascinating exercise.»

«...»

«You worried above about being a genius, here's something you might
like.  I got the idea from a bumper sticker, so you know it's good:
"If you think you can do anything, try sailing." Made relevant,
nothing is harder or more rewarding for folks like us than getting
along with others. So if you think you are a genius, Xah, figure out
how to get along with folks on Usenet. I'll give you a headstart from
your roots: "Win without fighting."»

Thank you for the kind gesture.

Though, the question here, is whether i'm willing.

Further, the underlying assumption is about hostility and benevolence.
In general, i'm probably one of the most loving persons here. I quote:

    «The best index to a person's character is (a) how he treats
people who can't do him any good, and (b) how he treats people who
can't fight back.» —Abigail Van Buren

 (Compare the loving, caring moralists and mother-fuck-faces such as
George W Bush, who caused the death of 40 thousand to 260 thousand
people)

Also, consider on a smaller scale those “benign” and “OpenSource-
morally-good” tech geekers who hold power. Namely, those ops/
admins/“first-click”-creators in IRCs, blogs, mailing lists, online
forums, newsgroups... many have banned/kicked and harassed me (one of
them in a legal way), and some, i'd guess, wish to do me harm because
they think i'm a “troll”.

(perhaps i should note here, publicly for the first time, that i was
banished from freenode's irc emacs channel in 2006 (and am still
banned to this day), by a FSF-abiding (and employed),
morality-sensitive, vegetarian fuckface, one John Sullivan (aka
johnsu01), despite the fact that i was perhaps one of the most helpful
members in that channel at the time (and i offered verifiable proof of
this assertion).
For detail, see the link at:
 http://xahlee.org/emacs/emacs_essays_index.html
)

Kenny wrote:
«Ironically, that is hard for you because you are /too/ social an
animal.  The sociopath feels no connection to others and so can charm
them into doing anything. Your thin skin (a great metaphor, btw)
leaves you no way to keep people out, no way to shrug off those whose
e-company you do not enjoy. In a sense, by reacting strongly to others
you make their problems your problem, in that you cannot be
comfortable as long as they are jerks. Not a recipe for contentment.
There are billions of people on this planet, put your energy into
those you enjoy, not so much into those you do not. When your will
weakens, well, hey, what's a killfile for? :)»

Yeah... i appreciate your comments. I like writing. I used to despise
writing when i was 20ish (~1990), thinking it a second-rate activity
that should be allotted to writers as opposed to mathematicians.
(apparently, this is not an uncommon thought among scientists, among
whom is Einstein) But since about maybe the mid 1990s, i find that i
enjoy writing. I guess there are several reasons i can trace back. One
is my habit of reading dictionaries. I, perhaps, have checked English
dictionary entries more times than all persons who ever visited
comp.lang.lisp in their lifetimes since the existence of
comp.lang.lisp, combined. (unless one of them happens to be a
lexicographer) (or, as an alternative assessment: more than 99% of
persons who have a PhD in literature or English) Secondly, my study of
logic made me practice extensive writing in the most austere, logical
style possible, during the mid 1990s. This happenstance also trained
me greatly in critical thinking and made me aware of many deep
problems and levels of logic and philosophy, esp in communication and
English expression. (and i have since come to fucking despise
grammarians and English writing pedants (because i find their ideas
and teachings moronic))

Another comment on your above paragraph concerns the concept of living
by hatred. You have seen Star Wars, right? In Star Wars, there are the
Sith, who live and thrive by hatred. Y'know, there's a saying among
them, that hatred leads to power. I just found the following verse on
the web:

   The Sith Philosophy

   Fear leads to anger.
   Anger leads to hate.
   Hatred leads to power.
   Power leads to victory.
   Let your anger flow through you.
   Your hate will make you strong.
   True power is only achieved through testing the limits of one's
anger, passing through unscathed.
   Rage channeled through anger is unstoppable.
   The dark side of the Force offers unimaginable power.
   The dark side is stronger than the light.
   The weak deserve their fate.

Speaking of hatred, one has to wonder: whence does hatred originate?
According to the above, fear leads to anger, anger leads to hate, but
that's just a theatrical composition. Fear is inherent in every human
animal, in particular the fearing of other human animals, but fear
doesn't necessarily lead to anger, and fear isn't the dominant
emotion among a human animal's emotions. So, at this point, the
verse's logic of the origin of hatred breaks down. After all, it is
theatrical. Now, back in reality, hatred nevertheless usually has a
cause. You don't hate unless someone, something, made you. The US
American culture in general believes in some pure form of evil (such
as Saddam Hussein).

This can be seen in the comic book stories of superheroes they are
brought up with, where the story line can almost always be
characterized as clearly divided forces of good vs evil, and the bad
guys usually take the form of some pure dark evil lord. It is also
cultivated by their God-Believing-Sect's scriptures' concept of the
devil. (contrast this to the Greek mythologies, where gods and
mortals, good deeds and bad deeds, are all complex and human, and it
is nearly impossible to say who's the “bad guy”.)

Although i live by love and knowledge, I thrive by hatred. Hatred
gives me a reason to live. Hatred gives me hope. Hatred empowers me.
It is hatred, indigence, defiance, that drove me to a quest for
knowledge. It is hatred, of motherfucking lisping _idiots_ with PhDs
tattooed to their faces, that drove the production of this essay that
explicates a view of high-level languages i have long held. (will be
archived on my website soon)

A couple more quotations for the road:

«Few people are capable of expressing with equanimity opinions which
differ from the prejudices of their social environment. Most people
are not even capable of forming such opinions.» — Albert Einstein,
1954

«It takes considerable knowledge just to realize the extent of your
own ignorance.» — Thomas Sowell

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <0f991b97-ff5d-46bf-b482-9f2269d42d7f@k39g2000hsf.googlegroups.com>
On Jan 24, 3:22 pm, Xah Lee <····@xahlee.org> wrote:
> (this post the 3rd part continuation from previous one)
>

> Another comment to your above paragraph, is the concept of living by
> hatred. You have seen Star Wars right? In Star War, there's the Siths,
> who live and thrive by hatred. Y'know, there's saying among them, that
> the hatred leads to power. I just found the following verse on the
> web:
>
>    The Sith Philosophy
>
>    Fear leads to anger.
>    Anger leads to hate.
>    Hatred leads to power.
I prefer George Bernard Shaw's:
“Hatred is the coward's revenge for being intimidated.”

The Sith Code

Peace is a lie
There is only passion
Through passion I gain strength
Through strength I gain power
Through power I gain victory
Through victory my chains are broken
The Force shall set me free

Yuthura Ban (Star Wars: Knights of the Old Republic)
http://www.imdb.com/title/tt0356070/quotes

Where does your quote come from? I can't find any good reference to
any dogma mentioning it, only personal blogs and pages unrelated to
the Star Wars universe.

cheers
Slobodan

>    Power leads to victory.
>    Let your anger flow through you.
>    Your hate will make you strong.
>    True power is only achieved through testing the limits of one's
> anger, passing through unscathed.
>    Rage channeled through anger is unstoppable.
>    The dark side of the Force offers unimaginable power.
>    The dark side is stronger than the light.
>    The weak deserve their fate.
>
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13b2e35f-596e-4741-84b1-121b84743157@k2g2000hse.googlegroups.com>
Dear padawan Slobodan,

As someone said to me on irc recently: your google fu is weak.

you wrote:
«[some star wars sith quotes] Where does your quote come from? I
can't find any good reference to any dogma mentioning it, only
personal blogs and pages unrelated to the Star Wars universe.»

The site i found was just some random site i turned up through a
google search for the phrase “hate leads to power” or something
similar; i don't remember exactly.

The site is:

http://www.scotty795.zoomshare.com/6.html

(nice music)

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <b6b767f7-c882-4f35-9180-cdfccdbee37f@h11g2000prf.googlegroups.com>
On Jan 24, 9:01 pm, Xah Lee <····@xahlee.org> wrote:
> Dear padawan Slobodan,
>
> As someone said to me on irc recently: your google fu is weak.
>
> you wrote:
>
> «[some star wars sith quotes] Where does you quote coming from ? I
> can't found any good reference to any dogma mentioning it, only
> personal blogs and pages unrelated with star wars universe.»
>
> The site i found was just some random site i searched thru google by
> the phrase “hate leads to power” or something similar i don't
> remember.
Dear Xah

I said
>Where does your quote come from? I can't find any good reference to
>any DOGMA(*) mentioning it, only personal blogs and pages unrelated to
>the star wars universe.

That roughly means that I asked for an official reference for those
verses from the Star Wars universe; it could be a movie, book, video
game, an interview with George Lucas, etc. The site mentioned below is
just a personal site from somebody I don't know at all. What's his
credibility?
a. Author of a Star Wars book, like Matt Stover (Shatterpoint)
b. Director of a Star Wars movie, like Irvin Kershner (The Empire
Strikes Back)
c. Designer of a Star Wars video game, like David Falkner (Star Wars:
Knights of the Old Republic)
...
A Lisp(**) parallel would be something like saying that Lisp has a
built-in Prolog engine, and when I ask where you heard that, you could
point me to:

a. ANSI standard X3.226-1994  page _
b. Common Lisp HyperSpec    link _
Both represent official dogma
c. John McCarthy wrote about it at _
d. Kent Pitman said it at _
People with enormous authority on the subject
e. Franz Allegro 8.1 link _
f. LispWorks Enterprise  link _
Leading vendors
g. Paul Graham  On Lisp page _
h. Peter Norvig PAIP page _
Highly respected authors

Your link is more like:
I read it on Slobodan Blazeski's blog  link  _
Slobodan who?

cheers
Slobodan
>
> The site is:
>
> http://www.scotty795.zoomshare.com/6.html
>
> (nice music)
>
>   Xah
>   ····@xahlee.org
> ∑http://xahlee.org/
>
> ☄

(*) Dogma: an established belief or doctrine held by a religion,
ideology, or any kind of organization, thought to be authoritative and
not to be disputed, doubted, or diverged from.
(**) When I say Lisp I mean Common Lisp; for anything else I use “Lisp
family of languages”.
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <4798b865$0$11607$607ed4bc@cv.net>
Xah Lee wrote:
>    The Sith Philosophy
> 
>    Fear leads to anger.

I have seen that. I also read once that an attacking German shepherd is 
in a state of fear. Not sure how much consolation that is to the 
attackee.  Frustration is another good source of anger.

>    Anger leads to hate.
>    Hatred leads to power.

Nixon showed how hatred can lead to loss of power.

>    Power leads to victory.

And then we die, after having had a not very good time in life. Happy 
people do not hate. Hatred is a lousy feeling, tho I see the Sith have 
an answer for that.

>    Let your anger flow through you.

The Sith solution is to feel better by destroying the other, the 
Buddhist solution is to sit until it becomes clear that this is all an 
illusion so don't get too caught up in it. The hater has a weak mind and 
allows the hated object to become their master in that they are now 
spending all their time and energy on destroying the hated when they 
could be down at the corner pub chatting up the lovely beer wench.

The Amish taught the world a lesson when the deranged guy killed all 
those schoolgirls (before offing himself). They reached out to the 
family of the killer and drew them in to their circle of support, and 
they taught the surviving Amish children not to hate the killer. Or in 
modspeak, "Let it go." That was for their own sake, not the killer's.

>    Your hate will make you strong.
>    True power is only achieved through testing the limits of one's
> anger, passing through unscathed.
>    Rage channeled through anger is unstoppable.
>    The dark side of the Force offers unimaginable power.

Yeah, but how are we keeping score? The power only matters if one has to 
destroy something, and one has to destroy something only if one is so 
filled with hatred that they cannot abide the existence of the hated. 
The Sith Philosophy turns out to be nothing more than a circular 
justification and glorification of a bad temper. Cue Dr. Phil.

>    The dark side is stronger than the light.

Only locally. The weaker have ganged up in the form of society and laws 
and police (their in-house bad guys) to manage the real bad guys.

>    The weak deserve their fate.

Inheriting the Earth? Being happy and content with their loved ones, 
with no interest in fighting fights? Sounds good.

> Although i live by love and knowledge, I thrive by hatred. Hatred
> gives me a reason to live. Hatred gives me hope. Hatred empowers me.
> It is hatred, indigence, defiance, that drove me to a quest for
> knowledge. It is hatred, of motherfucking lisping _idiots_ with PhDs
> tattooed to their faces, ...

That passage sounds more impotent than empowered. Anger rules us, it 
does not give us power. Dressing it up in Sith Philosophy won't help. 
Anger never helps a discussion or negotiation or collaboration. Anger 
turns other people off, so it leads more to failure than victory because 
many times other people hold the key to what we want. We often see 
people acting out in public in a way that makes it clear they saw a 
movie where someone threw a hissy fit and everyone snapped to attention 
and did their bidding. The reality is that it goes the other way. Blow a 
gasket and anyone around you will do as much as they can to make things 
go harder for you. Handle misturns conspicuously well and you will be 
amazed at the power a DMV clerk has to make your day go well.

If one is a fearful German shepherd who gets thru the day by terrifying 
others, fine, but it is the tail-wagging retrievers who get their ears 
scratched.

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <9082562a-cba4-4ece-b20b-1eac747d0ed9@z17g2000hsg.googlegroups.com>
On Jan 24, 5:10 pm, Ken Tilton <···········@optonline.net> wrote:
> Xah Lee wrote:
> >    The Sith Philosophy
>
> >    Fear leads to anger.
>
> I have seen that. I also read once that an attacking German shepherd is
> in a state of fear. Not sure how much consolation that is to the
> attackee.  Frustration is another good source of anger.
Anger could sometimes be used as a stress valve, but kick a boxing
bag instead of breaking someone's head.
>
> >    Anger leads to hate.
> >    Hatred leads to power.
>
> Nixon showed how hatred can lead to loss of power.
Hatred will give power only to the people you hate, making them even
stronger, while you waste your valuable energy and become weaker. I
prefer George Bernard Shaw's definition:
hatred is the coward's revenge for being intimidated.
>
> >    Power leads to victory.
>
> And then we die, after having had a not very good time in life. Happy
> people do not hate. Hatred is a lousy feeling,
Agreed
> tho I see the Sith have
> an answer for that.
I think Xah was trapped by a non-dogmatic quote; the real one goes
like this:
...
Through passion I gain strength
Through strength I gain power
...
something I agree with. I think there's a secret lisp macro called
with-passion that wraps lispers so they spend so much energy on such a
*useless* language. Every time you open your eyes you see the
difference between things made with passion and those made without it.
Passion will give you strength.
>
> >    Your hate will make you strong.
> >    True power is only achieved through testing the limits of one's
> > anger, passing through unscathed.
> >    Rage channeled through anger is unstoppable.
> >    The dark side of the Force offers unimaginable power.
>
> Yeah, but how are we keeping score? The power only matters if one has to
> destroy something, and one has to destroy something only if one is so
> filled with hatred thay they cannot abide the existence of the hated.
> The Sith Philosophy turns out to be nothng more than a circular
> justification and glorification of a bad temper. Cue Dr. Phil.
I don't accept this good vs bad separation.
1st: It depends on your perspective (look at the theory of relativity).
2nd: Yin and Yang http://dl.nlb.gov.sg/digitalk/random2/images/Stellaris_Yin_Yang.jpg
shows it.
There was a saying going something like:
Good people have so much bad in them, and bad people have so much
good, that it is hard to say who should teach whom.
Even if someone buys into that good people / bad people analogy, don't
forget that there are good people, there are bad people, and there are
good people having a bad day.

cheers
Slobodan

The oldest man was once nineteen years old, and full of wisdom. Then
he studied lisp...
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fnbacd$qg2$2@news.net.uni-c.dk>
Den Thu, 24 Jan 2008 11:10:36 -0500 skrev Ken Tilton:

> If one is a fearful German shepherd who gets thru the day by terrifying
> others, fine, but it is the tail-wagging retrievers who get their ears
> scratched.

The only fearfulness in German shepherds is how fearsomely intelligent 
and well-tempered they are. It's really not a good personification of 
hatred. And I speak from experience as a long time owner of a German 
shepherd; she was without doubt the most balanced and forgiving member 
of my family.

Cheers,
Maciej
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <47998fd6$0$11623$607ed4bc@cv.net>
Maciej Katafiasz wrote:
> And I speak from experience as a long time owner of a German 
> shepherd, she was without doubt the most balanced and forgiving in my 
> family.

What was her name?

kt

-- 
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fncacf$uk5$1@news.net.uni-c.dk>
Den Fri, 25 Jan 2008 02:29:47 -0500 skrev Ken Tilton:

> Maciej Katafiasz wrote:
>> And I speak from experience as a long time owner of a German shepherd,
>> she was without doubt the most balanced and forgiving in my family.
> 
> What was her name?

Pufa.

Cheers,
Maciej
From: Rainer Joswig
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <joswig-F9F05C.12385625012008@news-europe.giganews.com>
In article <············@news.net.uni-c.dk>,
 Maciej Katafiasz <········@gmail.com> wrote:

> Den Fri, 25 Jan 2008 02:29:47 -0500 skrev Ken Tilton:
> 
> > Maciej Katafiasz wrote:
> >> And I speak from experience as a long time owner of a German shepherd,
> >> she was without doubt the most balanced and forgiving in my family.
> > 
> > What was her name?
> 
> Pufa.
> 
> Cheers,
> Maciej

Pufa? I cannot imagine that a German shepherd dog would
listen to that name. ;-)

I recently read a story where British policemen had to
be trained in German, because their German shepherd dogs
did not listen to English commands. ;-)
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fnd0vt$1mm$1@news.net.uni-c.dk>
Den Fri, 25 Jan 2008 12:38:56 +0100 skrev Rainer Joswig:

> Pufa? I cannot imagine that a German shepherd dog would listen to that
> name. ;-)
> 
> I recently read a story where British police men had to be trained in
> German, because their German shepherd dogs did not listen to English
> commands. ;-)

She seemed to do just fine; in fact her grasp of the language was 
somewhat scary. Unless of course it'd inconvenience her to understand, 
then suddenly comprehension would fall sharply. That's actually the worst 
part of arguing with a dog: you can't really prove it understands you, 
even if you know damn well that that's the case.

Cheers,
Maciej
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <a18e8f41-ba68-416d-9532-6550568e326a@e4g2000hsg.googlegroups.com>
On Jan 25, 4:58 pm, Maciej Katafiasz <········@gmail.com> wrote:
> Den Fri, 25 Jan 2008 12:38:56 +0100 skrev Rainer Joswig:
>
> > Pufa? I cannot imagine that a German shepherd dog would listen to that
> > name. ;-)
>
> > I recently read a story where British police men had to be trained in
> > German, because their German shepherd dogs did not listen to English
> > commands. ;-)
>
> She seemed to do just fine, in fact her grasp of the language was
> somewhat scary. Unless of course it'd inconvenience her to understand,
> then suddenly comprehension would fall sharply. That's actually the worst
> part of arguing with a dog, you can't really prove it understands you,
> even if you know damn well that that's the case.
>
> Cheers,
> Maciej

That's why I prefer an aquarium; I'm 99,999999999999999999999999% sure
that the fish don't understand what I'm trying to say. To be fair to
them, sometimes I didn't understand myself.

cheers
Slobodan
From: Tim X
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <878x2dnsi9.fsf@lion.rapttech.com.au>
Rainer Joswig <······@lisp.de> writes:

> In article <············@news.net.uni-c.dk>,
>  Maciej Katafiasz <········@gmail.com> wrote:
>
>> Den Fri, 25 Jan 2008 02:29:47 -0500 skrev Ken Tilton:
>> 
>> > Maciej Katafiasz wrote:
>> >> And I speak from experience as a long time owner of a German shepherd,
>> >> she was without doubt the most balanced and forgiving in my family.
>> > 
>> > What was her name?
>> 
>> Pufa.
>> 
>> Cheers,
>> Maciej
>
> Pufa? I cannot imagine that a German shepherd dog would
> listen to that name. ;-)
>
> I recently read a story where British police men had to
> be trained in German, because their German shepherd dogs
> did not listen to English commands. ;-)

Yeah, I know what you mean. I've had a terrible time learning Gaelic
in order to command my mum's Scottish Terrier to stop trying to
hump my leg!

Now she is talking about getting a Chihuahua or a Chinese Foo Dog. I
guess the Chihuahua will understand Spanish, but what about the Foo
Dog? Do you think I'm better off with Mandarin or Cantonese?

Bloody irritating, all these languages. Why can't everyone just settle
on one - after all, they all seem pretty much functionally
equivalent. Maybe we should go with Latin; after all, it's one of the
oldest, many languages have adopted many of its terms, and some really
influential knowledge and information has been communicated in it - we
just have to get past all that FUD about its age, lack of standard
terms for modern concepts, excessive use of '*us' suffixes, etc. At
least it has a fairly simple and consistent syntax.

Tim

-- 
tcross (at) rapttech dot com dot au
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <9vknp317fpfaeun3o1l3326u2ft23rui6f@4ax.com>
On Sat, 26 Jan 2008 15:02:54 +1100, Tim X <····@nospam.dev.null>
wrote:

>Rainer Joswig <······@lisp.de> writes:
>
>> In article <············@news.net.uni-c.dk>,
>>  Maciej Katafiasz <········@gmail.com> wrote:
>>
>>> Den Fri, 25 Jan 2008 02:29:47 -0500 skrev Ken Tilton:
>>> 
>>> > Maciej Katafiasz wrote:
>>> >> And I speak from experience as a long time owner of a German shepherd,
>>> >> she was without doubt the most balanced and forgiving in my family.
>>> > 
>>> > What was her name?
>>> 
>>> Pufa.
>>> 
>>> Cheers,
>>> Maciej
>>
>> Pufa? I cannot imagine that a German shepherd dog would
>> listen to that name. ;-)
>>
>> I recently read a story where British police men had to
>> be trained in German, because their German shepherd dogs
>> did not listen to English commands. ;-)
>
>Yeah, know what you mean. I've had a terrible time learning gaelic in
>order to command my mum's Scottish Terrior to stop trying to
>hump my leg!
>
>Now she is talking about getting a Chihuahua or chinese Foo Dog. I guess
>the Chihuahua will understand spanish, but what about the Foo dog? Do
>you think I'm better off with Mandarin or Cantonese? 
>
>Bloody irritating all these languages. Why can't everyone just settle on
>one - after all, they all seem pretty much functionally
>equivalent. Maybe we should go with latin, after all its one of the
>oldest and many languages have adopted many of its terms and there has
>been some really influencial knowledge and information communicated in
>it - we just have to get past all that FUD about age, lack of standard
>terms for modern concepts and excessive use of '*us' suffixes etc. At
>least it has a fairly simple and consistent syntax.
>
>Tim

Problem is, Latin is Eurocentric.  So for that matter are Esperanto
and Eurish, both of which could reasonably claim more applicability in
non-European (or descendant) countries than Latin.

We wouldn't be having this problem if Rome and Chin had not fallen.
Then there would be only two languages to learn - Latin and Chinese
(though I'm not sure what dialect was spoken in the western province
from which the first emperor came).

For better or worse, English seems to be the least common denominator.


As for dogs ... they already have a common language.  My Dalmatian
spoke Croatian, my Scottish terrier Scots Gaelic (not to be confused
with the Irish Gaelic or Welsh Gaelic other terriers speak), and my
Boston terrier spoke high English.  They appeared to have no trouble
communicating amongst themselves and they all obeyed my grandfather
who only spoke to them in German.

George
--
for email reply remove "/" from address
From: David Golden
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <JBRmj.24288$j7.450074@news.indigo.ie>
George Neuner wrote:


> my Scottish terrier Scots Gaelic (not to be confused
> with the Irish Gaelic or Welsh Gaelic other terriers speak),


Mind your Ps and Qs. ;-)

Welsh is not Gaelic, it's Brythonic.     (P-Celtic)
Irish/Scottish/Manx are the Gaelic ones. (Q-Celtic)

http://en.wikipedia.org/wiki/Goidelic
http://en.wikipedia.org/wiki/Brythonic_languages
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <nh0op35qc953pif6issrk0efe9jksbfakg@4ax.com>
On Sun, 27 Jan 2008 02:01:13 +0000, David Golden
<············@oceanfree.net> wrote:

>George Neuner wrote:
>
>
>> my Scottish terrier Scots Gaelic (not to be confused
>> with the Irish Gaelic or Welsh Gaelic other terriers speak),
>
>
>Mind your Ps and Qs. ;-)
>
>Welsh is not Gaelic, it's Brythonic.     (P-Celtic)
>Irish/Scottish/Manx are the Gaelic ones. (Q-Celtic)
>
>http://en.wikipedia.org/wiki/Goidelic
>http://en.wikipedia.org/wiki/Brythonic_languages

Whoops!  I apologize to all the Welsh here.

George
--
for email reply remove "/" from address
From: Don Geddis
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87hch2ihxq.fsf@geddis.org>
Xah Lee <···@xahlee.org> wrote on Thu, 24 Jan 2008:
>    The Sith Philosophy
>    Fear leads to anger.
>    Anger leads to hate.
>    Hatred leads to power.
>    Power leads to victory.
>    Let your anger flow through you.
>    Your hate will make you strong.
>    True power is only achieved through testing the limits of one's anger, passing through unscathed.
>    Rage channeled through anger is unstoppable.
>    The dark side of the Force offers unimaginable power.
>    The dark side is stronger than the light.
>    The weak deserve their fate.

Probably better as Yoda said it (from The Phantom Menace):
        I sense much fear in you.
        Fear is the path to the dark side.
        Fear leads to anger.
        Anger leads to hate.
        Hate leads to suffering.
which you can find at 1:30-1:40 here:
        http://youtube.com/watch?v=6iIzDJ1o0Ow

Or perhaps you prefer Darth Maul's version:
        Fear.
        Fear attracts the fearful.
        The strong.  The weak.
        The innocent.  The corrupt.
        Fear.
        Fear is my ally.
available here:
        http://youtube.com/watch?v=iPfBQ-_7sGE

_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
To me, clowns aren't funny.  In fact, they're kinda scary.  I've wondered where
this started, and I think it goes back to the time I went to the circus and a
clown killed my dad.  -- Deep Thoughts, by Jack Handey
From: Maciej Katafiasz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fnbafj$qg2$3@news.net.uni-c.dk>
Den Thu, 24 Jan 2008 06:22:57 -0800 skrev Xah Lee:

>    Rage channeled through anger is unstoppable.

Wait, you can "channel rage through anger" in English? Silly language 
that.

Cheers,
Maciej
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13ppqe7nsrlbm19@corp.supernews.com>
Xah Lee wrote:
> struct as in C's “record” data type? Classes as in Java? Mathematica
> doesn't have these things. For Class, it depends on what you mean. As a
> language's data type, no. As a function that contains data and inner
> functions one can implement in just about any well designed lang, yes.

I think one might validly claim that Mathematica does provide such
functionality though. Questions of the type system become trivial because
it has only a single type (expressions) and, therefore, there are no type
definitions to worry about. Inheritance is then nothing more than applying
one set of rewrite rules before another:

  f //. {derived, base}

What Mathematica lacks is any way to ensure adherence to an interface.
Perhaps you can implement such a test at run-time using pattern matching
but I doubt it would be as elegant as the above. However, is this a
requirement of OOP? e.g. does Lisp enforce interfaces and, if so, when do
they get checked?
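The rule-ordering idea can be sketched outside Mathematica too. Here is a minimal Python analogy; the `(predicate, action)` rule representation and all the names (`rewrite`, `base`, `derived`, the shape tags) are invented for illustration, not Mathematica's actual rule objects:

```python
# Sketch: "inheritance" as ordered rewrite rules, derived rules tried first.
# Rules are hypothetical (predicate, action) pairs; a real Mathematica rule
# is a pattern object, but the ordering principle is the same.

def rewrite(expr, rules):
    """Apply the first rule whose pattern matches; else return expr unchanged."""
    for pattern, action in rules:
        if pattern(expr):
            return action(expr)
    return expr

# "base" behaviour: any expression tagged "shape" has area 0
base = [(lambda e: e[0] == "shape", lambda e: 0)]

# "derived" behaviour: circles compute their own area
derived = [(lambda e: e[0] == "circle", lambda e: 3.14159 * e[1] ** 2)]

# Listing derived before base mimics  f //. {derived, base}
assert rewrite(("circle", 2.0), derived + base) == 3.14159 * 4.0
assert rewrite(("shape",), derived + base) == 0
```

The "override" falls out of list order alone: a derived rule shadows a base rule for the expressions it matches, and everything else falls through.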

> As i repeated many times now, programing in Mathematica does not have
> any concept of memory, reference, pointer, WHATSOEVER. There is NO
> behind-the-scene memory, object, abstract storage location, where some
> other entities “reference” to it.

That is a triumph of hope over reality, I'm afraid. Here are a couple of
counterexamples that I have experienced:

. You must learn about stack consumption and tail recursion because
Mathematica is extremely prone to segfaulting when the stack is exhausted.
Hiding this behind $RecursionLimit does not abstract away this problem but
merely provides a safe-looking facade compared to segmentation faults
(which happen if you raise $RecursionLimit).

. You must learn about the memory management of cyclic data structures if
you want to avoid leaks in your code because Mathematica is not properly
garbage collected.

The language might not explicitly expose storage concepts but the
implementation requires users to understand it. This might be fixed in the
future if Wolfram Research make the core rewriter tail recursive and
migrate to accurate garbage collection but I don't think that will happen
any time soon.
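The same trade-off is easy to reproduce in other languages. A small Python sketch, with CPython's recursion limit standing in for $RecursionLimit (the function names are invented for the example):

```python
import sys

# Deep recursion: fine mathematically, but bounded by the call stack.
def depth(n):
    return 0 if n == 0 else 1 + depth(n - 1)

sys.setrecursionlimit(1000)       # plays the role of a small $RecursionLimit
try:
    depth(100_000)
    blew_stack = False
except RecursionError:
    blew_stack = True             # the limit turns a crash into an exception
assert blew_stack

# Rewriting the loop iteratively sidesteps the stack entirely:
def depth_iter(n):
    acc = 0
    while n > 0:
        acc, n = acc + 1, n - 1
    return acc

assert depth_iter(100_000) == 100_000
```

Raising the limit only moves the cliff: CPython, like Mathematica with a raised $RecursionLimit, can still overflow the underlying C stack and crash, which is exactly the "safe-looking facade" point above.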

> The thought has come to my mind to write a few pages of tutorial of
> mathematica for lisp programers. Written in a way such that every
> mathematica construct has a exact analogous counter part in lisp, in
> both syntax and semantics, so that the tutorial actually also
> functions as a fun-oriented lisp programing excursion. (in fact, if
> you are a lisp expert, you can consider yourself already half of a
> Mathematica expert. (lisp's major concepts of fully functional nested
> syntax, forms, macros, lambda, lists, function application are just
> about the same set of fundamental ideas in Mathematica, almost token
> for token identical except few syntax quirks or when we get into hairy
> stuff of evaluation model exhibited in lisp as “quote”, “eval” ... or
> in Mathematica as Hold[], HoldForm[], Evaluate[], ReleaseHold[]...)

There are some important discrepancies though. You've even neglected one:
Mathematica's lists are arrays whereas Lisp's lists are real linked
lists. This has some important ramifications, e.g. cdr is O(1) in Lisp but
O(n) in Mathematica.
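The cost difference can be made concrete with a Python analogy: cons cells as nested 2-tuples versus the array model as list slicing (illustrative only, not either language's real implementation):

```python
# cons-cell linked list: "cdr" just follows a pointer, O(1).
def cons(head, tail):
    return (head, tail)

def cdr(cell):
    return cell[1]

linked = cons(1, cons(2, cons(3, None)))
assert cdr(linked) == (2, (3, None))    # no elements are copied

# Array-backed list (Mathematica's model): dropping the head
# means building a fresh list of the remaining n-1 elements.
arr = [0, 1, 2, 3, 4]
rest = arr[1:]                          # O(n) copy
assert rest == [1, 2, 3, 4]
assert arr == [0, 1, 2, 3, 4]           # original untouched; the tail was copied
```

The flip side, not shown here, is that the array representation gives O(1) indexed access, where walking a cons list to element k costs O(k).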

You've also got other problems like Lisp has better-defined asymptotic
complexities for its core operations whereas Mathematica has next to
nothing (what is the complexity of "<", for example?).

> Lisp can actually relatively easily do it, emulating the full
> Mathematica's systematic list/tree extraction/manipulation functions
> (getting rids of the cons problem for good), as well getting rid of
> the lisp “behind the scenes” object concept with some practical
> benefits, a full automatic code formatter (beyond emacs current
> offerings) that get rid of the endless manual “how you should indent
> parens” once for all (as in the benefit transparently offered by
> Python), and also layer on top of sexp a infix notation while
> maintaining absolutely the full sexp benefits, and even more, if lisp
> experts are interested in spending now quite a lot energy, go as far
> as making lisp a full 2D Mathematical notation display system, and a
> notebook system (think of it as a language feature so powerful that
> the SOURCE CODE actually transparenly functions like a Word Processor
> file format that can contain structures and images).

I would not call that "relatively easy". Just implementing Mathematica's
pattern matcher is a lot of work...
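To give a flavour of why: even a toy matcher handling only literal symbols and named wildcards needs real machinery, before sequence patterns, conditions, or orderless heads enter the picture. A hypothetical Python sketch (the `_x` wildcard convention is invented for this sketch; Mathematica's actual syntax is `x_`):

```python
def match(pattern, expr, bindings=None):
    """Toy structural matcher: strings starting with '_' bind anything,
    and a repeated wildcard must bind the same value each time."""
    if bindings is None:
        bindings = {}
    if isinstance(pattern, str) and pattern.startswith("_"):
        if pattern in bindings:                      # repeated wildcard
            return bindings if bindings[pattern] == expr else None
        return {**bindings, pattern: expr}
    if isinstance(pattern, tuple) and isinstance(expr, tuple):
        if len(pattern) != len(expr):
            return None
        for p, e in zip(pattern, expr):              # match element-wise
            bindings = match(p, e, bindings)
            if bindings is None:
                return None
        return bindings
    return bindings if pattern == expr else None     # literal comparison

# Plus[x_, x_] matches Plus[a, a] but not Plus[a, b]:
assert match(("Plus", "_x", "_x"), ("Plus", "a", "a")) == {"_x": "a"}
assert match(("Plus", "_x", "_x"), ("Plus", "a", "b")) is None
```

Even this ~25-line toy handles none of BlankSequence, Condition, attributes like Orderless/Flat, or default values, which is where the bulk of the work in a real Mathematica-style matcher lies.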

> This is a social and psychological problem of lisp mainly caused by
> the old timers. Besides fixing problems of the lisp the language
> proper, other problems and possible solutions such as frequent
> complain of its libraries are similarly thwarted and smiten.

I don't think that the social dysfunction of Lispers is caused by old
timers. Perhaps it is exacerbated by some university lecturers but,
ultimately, it is the result of users who haven't studied anything more
modern. I was the same when I was a kid: I liked writing rat's nest code
and didn't care a hoot for "a fascist type system" that might encourage me
to write something more maintainable.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pi94q8ek7td4b@corp.supernews.com>
Xah Lee wrote:
> struct as in C's ?record? data type? Classes as in Java? Mathematica
> doesn't have these things. For Class, it depend on what you mean. As a
> language's data type, no. As a function that contains data ond inner
> functions one can implement in just about any well designed lang, yes.

I think one might validly claim that Mathematica does provide such
functionality though. Questions of the type system become trivial because
it has only a single type (expressions) and, therefore, there are no type
definitions to worry about. Inheritance is then nothing more than applying
one set of rewrite rules before another:

  f //. {derived, base}

What Mathematica lacks is any way to ensure adherence to an interface.
Perhaps you can implement such a test at run-time using pattern matching
but I doubt it would be as elegant as the above. However, is this a
requirement of OOP? e.g. does Lisp enforce interfaces and, if so, when do
they get checked?

> As i repeated many times now, programing in Mathematica does not have
> any concept of memory, reference, pointer, WHATSOEVER. There is NO
> behind-the-scene memory, object, abstract storage location, where some
> other entities “reference” to it.

That is a triumph of hope over reality, I'm afraid. Here are a couple of
counter examples that I have experienced:

. You must learn about stack consumption and tail recursion because
Mathematica is extremely prone to segfaulting when the stack is exhausted.
Hiding this behind $RecursionLimit does not abstract away this problem but
merely provides a safe-looking facade compared to segmentation faults
(which happen if you raise $RecursionLimit)

. You must learn about the memory management of cyclic data structures if
you want to avoid leaks in your code because Mathematica is not properly
garbage collected.

The language might not explicitly expose storage concepts but the
implementation requires users to understand it. This might be fixed in the
future if Wolfram Research make the core rewriter tail recursive and
migrate to accurate garbage collection but I don't think that will happen
any time soon.

> The thought has come to my mind to write a few pages of tutorial of
> mathematica for lisp programers. Written in a way such that every
> mathematica construct has a exact analogous counter part in lisp, in
> both syntax and semantics, so that the tutorial actually also
> functions as a fun-oriented lisp programing excursion. (in fact, if
> you are a lisp expert, you can consider yourself already half of a
> Mathematica expert. (lisp's major concepts of fully functional nested
> syntax, forms, macros, lambda, lists, function application are just
> about the same set of fundamental ideas in Mathematica, almost token
> for token identical except few syntax quirks or when we get into hairy
> stuff of evaluation model exhibited in lisp as “quote”, “eval” ... or
> in Mathematica as Hold[], HoldForm[], Evaluate[], ReleaseHold[]...)

There are some important discrepancies though. You've even neglected one:
Mathematica's lists are arrays whereas Lisp's lists are real linked
lists. This has some important ramifications, e.g. cdr is O(1) in Lisp but
O(n) in Mathematica.
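The asymmetry is easy to see if cons cells are modelled as nested pairs; a Python sketch (an illustrative model only, not how any Lisp actually implements lists):

```python
# A cons cell is a pair; cdr just follows one pointer, so it is O(1).
def cons(a, d): return (a, d)
def car(p): return p[0]
def cdr(p): return p[1]

lst = cons(1, cons(2, cons(3, None)))   # the list (1 2 3)
assert car(lst) == 1
assert car(cdr(lst)) == 2               # cdr: one step, no copying

# An array-backed list must copy the remaining elements to take "the rest",
# so the same operation is O(n).
arr = [1, 2, 3]
rest = arr[1:]                          # allocates and copies a new array
assert rest == [2, 3]
```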

You've also got other problems like Lisp has better-defined asymptotic
complexities for its core operations whereas Mathematica has next to
nothing (what is the complexity of "<", for example?).

> Lisp can actually relatively easily do it, emulating the full
> Mathematica's systematic list/tree extraction/manipulation functions
> (getting rids of the cons problem for good), as well getting rid of
> the lisp “behind the scenes” object concept with some practical
> benefits, a full automatic code formatter (beyond emacs current
> offerings) that get rid of the endless manual “how you should indent
> parens” once for all (as in the benefit transparently offered by
> Python), and also layer on top of sexp a infix notation while
> maintaining absolutely the full sexp benefits, and even more, if lisp
> experts are interested in spending now quite a lot energy, go as far
> as making lisp a full 2D Mathematical notation display system, and a
> notebook system (think of it as a language feature so powerful that
> the SOURCE CODE actually transparenly functions like a Word Processor
> file format that can contain structures and images).

I would not call that "relatively easy". Just implementing Mathematica's
pattern matcher is a lot of work...

> This is a social and psychological problem of lisp mainly caused by
> the old timers. Besides fixing problems of the lisp the language
> proper, other problems and possible solutions such as frequent
> complain of its libraries are similarly thwarted and smiten.

I don't think that the social dysfunction of Lispers is caused by old
timers. Perhaps it is exacerbated by some university lecturers but,
ultimately, it is the result of users who haven't studied anything more
modern. I was the same when I was a kid: I liked writing rat's-nest code
and didn't care a hoot for "a fascist type system" that might encourage me
to write something more maintainable.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <0d6b3fbe-035a-4e5f-b28a-7eba85b691d4@q77g2000hsh.googlegroups.com>
Hi Jon,

Am responding to some parts of your post.

Xah wrote:
「As i repeated many times now, programing in Mathematica does not have
any concept of memory, reference, pointer, WHATSOEVER. There is NO
behind-the-scene memory, object, abstract storage location, where some
other entities “reference” to it.」

Jon wrote:
> That is a triumph of hope over reality, I'm afraid. Here are a couple of
> counter examples that I have experienced:
>
> . You must learn about stack consumption and tail recursion because
> Mathematica is extremely prone to segfaulting when the stack is exhausted.
> Hiding this behind $RecursionLimit does not abstract away this problem but
> merely provides a safe-looking facade compared to segmentation faults
> (which happen if you raise $RecursionLimit)
>
> . You must learn about the memory management of cyclic data structures if
> you want to avoid leaks in your code because Mathematica is not properly
> garbage collected.
>
> The language might not explicitly expose storage concepts but the
> implementation requires users to understand it. This might be fixed in the
> future if Wolfram Research make the core rewriter tail recursive and
> migrate to accurate garbage collection but I don't think that will happen
> any time soon.
>
> There are some important discrepancies though. You've even neglected one:
> Mathematica's lists are arrays where are Lisp's lists are real linked
> lists. This has some important ramifications, e.g. cdr is O(1) in Lisp but
> O(n) in Mathematica.

I disagree. There are hardware resource limitations, and there's
nothing a language can do about that.

What you are saying is that when a programer, say a Schemer, programs
in Mathematica, she would run into errors like $RecursionLimit,
whereas she wouldn't have such errors when programing in Scheme. You
attribute this to a problem of the Mathematica language: an inability
to abstract the underlying implementation away. (stacks, linked lists,
tail recursion, segfaults)

I disagree. Any computer language has characteristic behaviors
peculiar to its constructs. (the good ones are called idioms) You have
to learn the idioms of the lang. For example, in Scheme, you do
recursion happy-go-lucky. In Common Lisp, you'd be more careful about
that, because Common Lisp does not do recursion as well as Scheme. (in
tech geeking talk, we say that Common Lisp implementations do not
always have an optimal implementation for a certain type of recursive
construct) In non-lisp langs, recursion quickly gets you all sorts of
errors.

Similarly, you cannot just import a particular language's coding
habits into another and expect them to behave optimally, or even work.
(say, Haskell idioms, Java idioms, Mathematica idioms, Prolog idioms,
won't mesh well)

Likewise in Mathematica, how you program, or how exactly you do
recursion, has a lot to do with your knowledge of the language. Doing
it one way gets you a recursion error or a computation that never
halts, but change a structure or tweak a part, and you are good to go.

For example, you complain that Mathematica's list has the
computational behavior of being very slow to access or work with,
quickly running you into some hardware resource error. (i.e. it
behaves like lisp's “vector” datatype) My retort is that this is
simply a result of not knowing the language well. (not necessarily
that you don't) For example, if i'm a student, i might use lisp's
alist data type when i actually should use its hash map. The result
being, my program runs like a slug and quickly hits some hardware
resource limit. It would be wrong of me to blame lisp by claiming it
has a bad implementation of keyed lists.
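The alist-versus-hash-map point generalizes to any language; a small Python sketch of the two lookup costs (an association list is scanned linearly, a hash table is expected constant time):

```python
# Association list: a list of key/value pairs, searched linearly -- O(n).
def assoc(key, alist):
    for k, v in alist:
        if k == key:
            return v
    return None

pairs = [("a", 1), ("b", 2), ("c", 3)]
assert assoc("c", pairs) == 3
assert assoc("z", pairs) is None

# Hash table: expected O(1) per lookup, the right tool past a few entries.
table = dict(pairs)
assert table["c"] == 3
```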

As to the actual solution to your complaint about Mathematica's
implementation of lists: an expert Mathematica programer, in
situations where she knows she's gonna do a lot of prepending to grow
a list, would just nest the list instead of prepending, then Flatten
at the end if necessary.
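The nest-then-Flatten idiom works because wrapping an item in a new pair is constant time while prepending to an array copies it; a Python sketch of the trick (nested pairs standing in for Mathematica's nested lists):

```python
# Repeatedly prepending to an array copies the whole array each time,
# O(n^2) in total. Nesting instead wraps each new item in a pair, O(1)
# per step, and a single flatten pass at the end is O(n).
def flatten(nested):
    out = []
    while nested is not None:
        item, nested = nested
        out.append(item)
    return out

acc = None
for x in [1, 2, 3]:
    acc = (x, acc)                     # "prepend" without copying
assert acc == (3, (2, (1, None)))
assert flatten(acc) == [3, 2, 1]       # most recently prepended item first
```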

Let me give a reverse example then. In Mathematica, i can do:

In[1]:=
433333333333333333399999999999999999999999999999999999555+1

Out[1]=
433333333333333333399999999999999999999999999999999999556

In most languages, that's an ugly error. Can i then claim that all
these other languages do not have a good implementation of numbers?

Actually this is a good example of my concept of high level languages:
i.e., a protocol to specify computation.

In Mathematica, this simple example illustrates a most simple, basic,
elementary specification of a computation. Most other languages simply
cannot do it, or require some library, or require some datatype
declaration. All these are symptoms of the behind-the-scenes
implementation/pragmatics/speed/hardware issues surfacing to the
language level, which interfere with and complicate the language with
mumble jumble idiosyncratic to each language.

Perhaps some lisp hotshot fuckheads, will retort, that lisp can do the
above too. Then, can lisp do this?

In[1]:=
 Flatten@Outer[f,{3,4},{a,b},{x,y,z}]

Out[1]=
{f[3,a,x],f[3,a,y],f[3,a,z],f[3,b,x],f[3,b,y],f[3,b,z],f[4,a,x],f[4,a,y],
  f[4,a,z],f[4,b,x],f[4,b,y],f[4,b,z]}
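For readers without Mathematica: Outer applies f to every combination of one element drawn from each list, and the result is then flattened into a single list. A rough Python equivalent, with f as an uninterpreted tuple-builder:

```python
from itertools import product

def f(*args):
    # Stand-in for Mathematica's symbolic f[...]: just record the arguments.
    return ("f",) + args

# Every combination of one element from each list, in the same order
# Mathematica's Outer enumerates them.
result = [f(*combo) for combo in product([3, 4], ["a", "b"], ["x", "y", "z"])]

assert len(result) == 12
assert result[0] == ("f", 3, "a", "x")
assert result[-1] == ("f", 4, "b", "z")
```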

Can Lisp do this:

(* compute Pi to 12345 digits*)
N[Pi, 12345]

Lisp fuckheads may retort, that the above is specialized so-called
computer algebra or domain specific.

Do lispers think, that general languages in the next 10 or 20 years,
will still need to call special libraries, so-called “data types”, or
other behind-the-scenes mumble jumble in order to do the above?

Is there any lisper who still does not understand my idea of a high
level language that separates out hardware/speed/implementation/
pragmatics issues? Is it still justified to call me a troll, or moron,
or shove your lisp's object technicalities mumble jumble at me? Is it
me who didn't understand you, or you who wronged me?

If the majority of computer scientists and compiler morons actually
clearly saw the above high-level language idea (as opposed to
perpetually burying their heads in “types” and other “behind the
scenes” mumble jumble), by now we could have many high level
languages that are not only as beautiful and easy to use as
Mathematica, but diverse: some fast, some controlling hardware, some
free.

Now, if you do understand the idea of separating the pragmatics stuff
out of a computer language, then perhaps we can discuss the topic
further, fruitfully. I do not know compilers; many of you do have
compiler knowledge. Perhaps we can discuss to what extent such a high
level language is possible. (don't forget that Mathematica is already
an existing example. Don't bury your head just in lisp. And there are
Haskell, Erlang, F#... and there's also denotational semantics, which
i don't understand but i'm sure others here do) The goal of
discussion is to grow and cultivate ideas from people with diverse
interests and knowledge. Not to use your narrow expertise to shoot
others down.

Further readings

• What is Expressiveness in a Computer Language
 http://xahlee.org/perl-python/what_is_expresiveness.html

• The Concepts and Confusions of Prefix, Infix, Postfix and Fully
Functional Notations
 http://xahlee.org/UnixResource_dir/writ/notations.html

• Lisp's List Problem
 http://xahlee.org/emacs/lisp_list_problem.html

• Jargons And High Level Languages
 http://xahlee.org/emacs/jargons_high_level_lang.html

• Is Lisp's Objects Concept Necessary?
 http://xahlee.org/emacs/lisps_objects.html

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Pascal J. Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <7c8x2e9g6f.fsf@pbourguignon.anevia.com>
Xah Lee <···@xahlee.org> writes:
> [...]
> As to the actual solution to your complaint about mathematica's
> implementation of list, for a mathematica programer expert, in
> situations when she knew she's gonna do a lot preappending to grow a
> list, she'd just nest the list instead of prepend, then Flatten at the
> end if necessary.
>
> Let me give a reverse example then. In Mathematica, i can do:
>
> In[1]:=
> 433333333333333333399999999999999999999999999999999999555+1
>
> Out[1]=
> 433333333333333333399999999999999999999999999999999999556
>
> In most languages, that's ugly error. Can i then claim, that all these
> other languages, does not have a good implementation of numbers?

(1+ 433333333333333333399999999999999999999999999999999999555)
433333333333333333399999999999999999999999999999999999556 ; Pffew!  


> Perhaps some lisp hotshot fuckheads, will retort, that lisp can do the
> above too. Then, can lisp do this?
>
> In[1]:=
>  Flatten@Outer[f,{3,4},{a,b},{x,y,z}]
>
> Out[1]=
> {f[3,a,x],f[3,a,y],f[3,a,z],f[3,b,x],f[3,b,y],f[3,b,z],f[4,a,x],f[4,a,y],
>   f[4,a,z],f[4,b,x],f[4,b,y],f[4,b,z]}

Yep, no problem:

C/USER[89]> (COM.INFORMATIMAGO.COMMON-LISP.LIST:COMBINE '(f) '(3 4) '(a b) '(x y z))
((F 3 A X) (F 3 A Y) (F 3 A Z) (F 3 B X) (F 3 B Y) (F 3 B Z) (F 4 A X) (F 4 A Y) (F 4 A Z) (F 4 B X) (F 4 B Y) (F 4 B Z))


> Can Lisp do this:
>
> (* compute Pi to 12345 digits*)
> N[Pi, 12345]
>
> Lisp fuckheads may retort, that the above is specialized so-called
> computer algebra or domain specific.

Or they may retort that the situation has not changed since the last
time you asked: lisp implementations can still compute Pi to any
number of digits:

(progn (setf (EXT:LONG-FLOAT-DIGITS) 12345)
       (length (princ-to-string (* 4 (atan 1.0L0)))))
12353
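For comparison, arbitrary-precision Pi needs nothing exotic in other languages either. A sketch using only Python's standard decimal module and Machin's formula, pi = 16*arctan(1/5) - 4*arctan(1/239); the guard-digit count and the truncation of the final digits are my own illustrative choices:

```python
from decimal import Decimal, getcontext

def arctan_inv(x, digits):
    """arctan(1/x) for integer x > 1, via its Taylor series, in Decimal."""
    getcontext().prec = digits + 10           # working precision + guard digits
    eps = Decimal(10) ** -(digits + 10)
    power = Decimal(1) / x                    # 1 / x**(2k+1), starting at k = 0
    total = power
    k = 1
    while power > eps:
        power /= x * x
        term = power / (2 * k + 1)
        total = total + term if k % 2 == 0 else total - term
        k += 1
    return total

def pi_digits(digits):
    """String of Pi truncated to `digits` decimal places (Machin's formula)."""
    getcontext().prec = digits + 10
    pi = 16 * arctan_inv(5, digits) - 4 * arctan_inv(239, digits)
    return str(pi)[: digits + 2]              # "3." plus `digits` decimals

assert pi_digits(5) == "3.14159"
assert pi_digits(30) == "3.141592653589793238462643383279"
```

Here the guard digits make the kept prefix exact, sidestepping the binary-to-decimal rounding issue discussed below.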

-- 
__Pascal Bourguignon__
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pkedv5k3lsj9f@corp.supernews.com>
Pascal J. Bourguignon wrote:
> Or they may retort that the situation has not changed since the last
> time you asked, lisp implementations can still compute Pi to any
> number of digits:
> 
> (progn (setf (EXT:LONG-FLOAT-DIGITS) 12345)
>        (length (princ-to-string (* 4 (atan 1.0L0)))))
> 12353

ROTFL. That is wrong by an unknown amount. Hey Xah, you forgot to specify
that the answer should be "correct". :-)

PS: SBCL crashes and CLisp silently gives a much more wrong answer.
-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Ray Dillinger
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <479e3c73$0$36336$742ec2ed@news.sonic.net>
Jon Harrop wrote:

> Pascal J. Bourguignon wrote:

>> (progn (setf (EXT:LONG-FLOAT-DIGITS) 12345)
>>        (length (princ-to-string (* 4 (atan 1.0L0)))))
>> 12353

> ROTFL. That is wrong by an unknown amount. Hey Xah, you forgot to specify
> that the answer should be "correct". :-)

> PS: SBCL crashes and CLisp silently gives a much more wrong answer.

I think the problem with expressions like this is that you can get a 
wrong answer and not know how wrong it is.  More and more, I'm 
becoming an advocate of math on ranges.  You get the answer as an 
upperbound and lowerbound, and then you have some feedback when 
something of yours (or something you called) did catastrophic rounding 
or something. 

                                Bear
From: Pascal Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87y7a9r4y0.fsf@thalassa.informatimago.com>
Ray Dillinger <····@sonic.net> writes:

> Jon Harrop wrote:
>
>> Pascal J. Bourguignon wrote:
>
>>> (progn (setf (EXT:LONG-FLOAT-DIGITS) 12345)
>>>        (length (princ-to-string (* 4 (atan 1.0L0)))))
>>> 12353
>
>> ROTFL. That is wrong by an unknown amount. Hey Xah, you forgot to specify
>> that the answer should be "correct". :-)
>
>> PS: SBCL crashes and CLisp silently gives a much more wrong answer.

Well, the above form is misleading. (EXT:LONG-FLOAT-DIGITS) is not the
number of decimal digits of precision we get, but the number of binary
digits of precision, and it's taken into account only for the next
read, so to get the 12349 decimal digits as above,
(EXT:LONG-FLOAT-DIGITS) was actually set to a higher precision.  But
since the result was printed as 12353 characters, it was exact.

Clisp gives the exact decimals of PI, up to the set binary precision.  

C/USER[58]> (setf (EXT:LONG-FLOAT-DIGITS) (truncate (* 3322 12345) 1000))
41010
C/USER[59]> (length (princ-to-string (* 4 (atan 1.0L0))))
12353
C/USER[60]> (mismatch (princ-to-string (* 4 (atan 1.0L0))) gutenpi)
12350

With (EXT:LONG-FLOAT-DIGITS) set to 41010,  (* 4 (atan 1.0L0)) gives
you 12349 decimal digits of Pi.  What more do you want?  It is to
expect that a few decimal digits are printed wrong, in the conversion
from binary to decimal.
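The 3322/1000 factor in the transcript is just log2(10): each decimal digit costs about 3.32 bits of mantissa. A quick check of the arithmetic (Python, standard library only):

```python
import math

# One decimal digit carries log2(10) ~ 3.32193 bits of information.
bits_per_digit = math.log2(10)
assert abs(bits_per_digit - 3.32193) < 1e-5

# 12345 decimal digits therefore need about 41010 bits of mantissa,
# matching the value (EXT:LONG-FLOAT-DIGITS) was set to above.
assert math.ceil(12345 * bits_per_digit) == 41010
```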


gutenpi is a string containing a million decimals of pi take from
http://www.gutenberg.org/dirs/etext93/pimil10.txt


> I think the problem with expressions like this is that you can get a 
> wrong answer and not know how wrong it is.  

Clisp gives correct answers.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

This is a signature virus.  Add me to your signature and help me to live.
From: Nicolas Neuss
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87hch2uhpk.fsf@ma-patru.mathematik.uni-karlsruhe.de>
Xah Lee <···@xahlee.org> writes:

> Hi Jon,
>
> Am responding to some parts of your post.

Fascinating.  Xah Lee meets Jon Harrop at last.  The final showdown between
OCaml/F# and Mathematica is taking place.  Since I expect that this will
take some time, maybe you two could make it out between yourselves with
email and tell us the outcome, if an agreement has been reached.

Nicolas
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <660dab68-4123-4aeb-888d-c5f8fa7958f2@q39g2000hsf.googlegroups.com>
On Jan 25, 3:01 pm, Nicolas Neuss <········@math.uni-karlsruhe.de>
wrote:
> Xah Lee <····@xahlee.org> writes:
> > Hi Jon,
>
> > Am responding to some parts of your post.
>
> Fascinating.  Xah Lee meets Jon Harrop at last.  The final showdown between
> OCaml/F# and Mathematica is taking place.  Since I expect that this will
> take some time, maybe you two could make it out between yourselves with
> email and tell us the outcome, if an agreement has been reached.
>
> Nicolas

Settled behind the scenes via email? No way; the crowd has gathered
to see the show, and it had better be a good one.

cheers
Slobodan
From: Pascal J. Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <7c4pd1apiv.fsf@pbourguignon.anevia.com>
Nicolas Neuss <········@math.uni-karlsruhe.de> writes:

> Xah Lee <···@xahlee.org> writes:
>
>> Hi Jon,
>>
>> Am responding to some parts of your post.
>
> Fascinating.  Xah Lee meets Jon Harrop at last.  The final showdown between
> OCaml/F# and Mathematica is taking place.  Since I expect that this will
> take some time, maybe you two could make it out between yourselves with
> email and tell us the outcome, if an agreement has been reached.

No! Just bring some pop-corn, the spectacle will be grandiose!


-- 
__Pascal Bourguignon__
From: Damien Kick
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pnni2s8eh4m53@corp.supernews.com>
Nicolas Neuss wrote:

> Fascinating.  Xah Lee meets Jon Harrop at last.  The final showdown between
> OCaml/F# and Mathematica is taking place.  Since I expect that this will
> take some time, maybe you two could make it out between yourselves with
> email and tell us the outcome, if an agreement has been reached.

Kill files everywhere buckled from the stress...
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pkiv3dr0snrb5@corp.supernews.com>
Xah Lee wrote:
> I disagree. There are hardware resource limitations, and there's
> nothing a language can do about that.

Absolutely but stack overflows and memory leaks are not "hardware resource
limitations". If they were, implementations could not have already solved
these problems for us as SML/NJ did, for example.

> What you are saying, is that when a programer, say a Schemer, program
> in Mathematica, she would run into errors like $RecursionLimit,
> whereas she wouldn't have such error when programing in Scheme. You
> attribute this as a Mathematica's language problem as a inability to
> abstract the underlying implementation away. (stacks, linked list,
> tail recursion, segfault)

Not quite. The language might be fine but the implementation is not because
it requires the user to be aware of low-level things (like stack and cyclic
data structures) in order to write correct code. An implementation of a
high-level language should not impose such burdens on its users if
possible, and we know this is possible because SML/NJ already did it.

When you're talking about a single-implementation language like Mathematica
this can be a serious concern.

> I disagree. Any computer language, has characteristic behaviors
> peculiar to each language's constructs. (the good ones are called
> idioms) You have to learn the idioms of the lang. For example, in
> Scheme, you do recursion happy-go-lucky. In Common Lisp, you'd be more
> careful about that because Common Lisp does not do recursion as well
> as Scheme. (in tech geeking talk, we say that common lisp
> implementations does not always have a optimal implementation for a
> certain type of recursive construct) In non-lisp langs, recursion
> quickly gets you all sort of errors.

Then idiomatic C does not cause segmentation faults or leak memory just as
idiomatic Mathematica code also does not cause segmentation faults or leak
memory. Mathematica is no better than C (or most other languages) in these
respects.

> Similarly, you can not just import a particular language coding habit
> into another, and expect it to behave optimally or even work. (say,
> Haskell idioms, Java idioms, Mathematica idioms, Prolog idioms, won't
> mesh well)

Exactly. You must be aware of stack consumption in C just as you must be
aware of it in Mathematica. There is no difference.

> Likewise in Mathematica, how you program or how exactly you are doing
> recursion, has a lot to do with your knowldege of the language. Doing
> it one way, gets you recursion error or time halts, but changing a
> structure or tweak a part, you are good to go.

The same as C, yes.

> For example, you complain that Mathematica's list has the
> computational behavior such that it is very slow to access or work
> with and quickly gets you into some hardware resource error. (i.e. it
> behaves like lisp's “vector” datatype) My retort, is that this is
> simply a result of not knowing the language well.

The Mathematica language specification does not specify the complexities of
most core operations so it is not possible to work out the complexity of a
given function. So it doesn't matter how well you know the language, you
still cannot ascertain how long your function will take to run.

If you are intimately familiar with the current implementation of
Mathematica then you may or may not already know the complexity of a core
operation but that could change in the future and break your code.

> (not necessarily 
> that you don't) For example, if i'm a student, i might use lisp's
> alist data type when i actually should use its hash map. The result
> being, my program runs like a slug and quickly hits some hardware
> resource limit. I would be wrong of me, to blame lisp by claiming it
> has bad implementation of key'd list.

Absolutely but the complexities of those operations are probably well
specified and widely understood in Lisp. The same is not true of
Mathematica.

> As to the actual solution to your complaint about mathematica's
> implementation of list, for a mathematica programer expert, in
> situations when she knew she's gonna do a lot preappending to grow a
> list, she'd just nest the list instead of prepend, then Flatten at the
> end if necessary.

Actually you shouldn't do that in Mathematica because any function
that you write to act upon deep lists will hit a segmentation fault
and lose all of your data.

A better solution might be to use Reap and Sow. However, this is
Python-style imperative programming at its worst when the problem can be
solved trivially and much more efficiently using functional programming.

> Let me give a reverse example then. In Mathematica, i can do:
> 
> In[1]:=
> 433333333333333333399999999999999999999999999999999999555+1
> 
> Out[1]=
> 433333333333333333399999999999999999999999999999999999556
> 
> In most languages, that's ugly error. Can i then claim, that all these
> other languages, does not have a good implementation of numbers?

That solves a problem that I don't have.

I can easily find a converse example as well. Look at set implementations.
Not amazingly complicated you'd think but OCaml's set implementation is
asymptotically more efficient than Mathematica's so it solves my real
problems 10,000x faster. That is much more important to me than adding one
to a fifty digit integer.

You can benchmark a direct translation of the code from my book OCaml for
Scientists, the "n"th-nearest neighbour example:

  http://www.ffconsultancy.com/products/ocaml_for_scientists/complete/

The F# code is just this:

  let unions = Seq.fold Set.union Set.empty

  let nth nth n i =
    match n with
    | 0 -> Set.singleton i
    | 1 -> Set.map (displace i) system.vertices.[i.index].neighbors
    | n ->
        let s1, s2 = nth (n-1) i, nth (n-2) i
        unions(Seq.map (nth 1) s1) - s2 - s1

  let nth = memoize nth

I originally tried to write this in Mathematica and was unable to write
something that would terminate before my PhD did. At the time I resorted to
C++ but I now know that OCaml or F# would be much better options. They
require less code than Mathematica and they run orders of magnitude
faster without crashing.
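The recurrence in the F# snippet reads: the nth shell around vertex i is the union of the neighbours of the (n-1)th shell, minus the two inner shells. A Python re-sketch on a toy graph (the 6-cycle, the names, and the memoization strategy are all illustrative assumptions here, not the book's code):

```python
from functools import lru_cache

# Toy graph: a cycle of six vertices; NEIGHBORS[i] is the adjacency set.
NEIGHBORS = {i: frozenset({(i - 1) % 6, (i + 1) % 6}) for i in range(6)}

@lru_cache(maxsize=None)           # plays the role of `memoize`
def nth_neighbors(n, i):
    """Vertices at graph distance exactly n from vertex i."""
    if n == 0:
        return frozenset({i})
    if n == 1:
        return NEIGHBORS[i]
    ring = frozenset().union(*(nth_neighbors(1, j)
                               for j in nth_neighbors(n - 1, i)))
    # Drop vertices already seen in the two previous shells.
    return ring - nth_neighbors(n - 2, i) - nth_neighbors(n - 1, i)

assert nth_neighbors(0, 0) == {0}
assert nth_neighbors(1, 0) == {1, 5}
assert nth_neighbors(2, 0) == {2, 4}
assert nth_neighbors(3, 0) == {3}
```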

> Actually this is a good example of my concept of high level languages:
> aka, A protocol to specify computation.

Some would say that computation should be irrelevant in a high-level
language and that everything should be declarative. This is the Haskell way
but it leads to more performance woes (AFAIK).

I would say that performance is always relevant so I believe there is a
sweet spot between specifying computations and declarative styles.

> In Mathematica, this simple example illustrates a most simple, basic,
> elementary specification of a computation. In most other languages,
> they simply cannot do it, or requires some libraries, or requires some
> datatype declaration. All these, are symptoms of the behind-the-scenes
> implementation/practimatics/speeed/hardware issues surfacing to the
> language level, which interferes and complicates the language by
> mumble jumble idiosyncratic to each language.

But only for that one specific problem.

> ...
> Do lispers think, that general languages in the next 10 or 20 years,
> will still need to call special libraries, so-called “data types”, or
> other behind-the-scenes mumble jumble in order to do the above?

I would rather have the ability to call a library than the inability to
solve a problem at all, which is what Mathematica offers me. Mathematica's
performance for general programming puts serious limitations on what it is
useful for.

For example, you cannot compute set theoretic operations efficiently in
Mathematica. You either use the inefficient built-in implementations (that
cannot be improved without breaking their specification) or you write your
own and suffer the orders of magnitude slowdown caused by using a very
inefficiently interpreted language.

This doesn't just apply to set operations either, it actually applies to all
data structures that Mathematica doesn't happen to provide. Try
implementing finger trees or suffix trees or wavelet transforms or...

I believe the work done by the implementors of a language conveys about the
best you can possibly hope to achieve using their tool. If you look at the
Wavelet Explorer package for Mathematica, for example, you'll see that its
core transforms can be implemented both trivially and vastly more
efficiently in C, C++, OCaml, F# or almost any other compiled language. For
me, that represents the boundary where Mathematica ceases to be useful.
Modern statically typed functional programming languages have eaten so far
into that boundary that I no longer even use Mathematica. The only thing I
missed was the graphics but now you can buy far more powerful visualization
tools for OCaml than Mathematica will ever have, and for a fraction of the
price.

> If, the majority of computer scientist and compiler morons, actually
> clearly see the above high-level language idea (as opposed to
> perpetually bury their heads into “types” and other “behind the
> scenes” mumble jumble), by now we could have many high level
> languages, that are not only as beautiful and easy to use as
> Mathematica, but diverse, some will be fast, control hardware, or
> free.

I've spent many years working with a variety of different languages and the
only feature that I would steal from Mathematica to put into a general
purpose language is its notebook front-end with integrated typesetting and
visualization.

> Now, if you do understand the idea of separating the pragmatics stuff
> out of a computer language, then perhaps we can discuss further the
> topic fruitfully. I do not know compilers, many of you do have
> compiler knowledge. Perhaps, we can discuss, to what extend such high
> level language is possible. (don't forget that Mathematica is already
> a existing example. Don't bury your head just in lisp. And, there are
> Haskell, erlang, f#... and there's also denotational semantics which i
> don't understand but i'm sure others here do) The goal of discussion,
> is to grow and cultivate ideas from people with diverse interests and
> knowledges. Not, using your narrow expertise to shot others down.

Well, I've written Mathematica compilers as well... :-)

In the context of compilation, Mathematica is basically in the same boat as
Lisp. If you want to run general purpose code efficiently then you've got a
real uphill struggle on your hands. There are some gallant attempts for
Lisp, like SBCL, but they will never compete with statically typed
languages because acquiring the information the compiler needs in order to
optimize the code completely erodes useful features like dynamic linking,
separate compilation and so forth. Furthermore, I find static typing an
essential tool for writing non-trivial programs correctly.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <95b57fd2-84a2-40e6-bcbf-77996e66b964@d4g2000prg.googlegroups.com>
Jon Harrop wrote:
«Exactly. You must be aware of stack consumption in C just as you must
be aware of it in Mathematica. There is no difference.»

Well, in my programing experience since 1992, i don't know what a
stack is other than its mathematical concept of “first in last out”,
but i wrote a lot of programs. In my programing in Mathematica, Perl,
Python, PHP, emacs lisp, SQL, i don't think there's been a moment i
thought i needed to understand stacks beyond their mathematical
concept.

I venture to say, more than 80% of today's professional programers
don't know what a stack is other than a vague idea of its mathematical
properties. (“professional programer” = people whose income is from
coding.)

i think, once a programer has some expertise in compilers, then
everything's a stack or some such mumble jumble. A language to specify
computation does not require the concept, unless you are talking about
manipulating hardware.

In our context of recursion errors... You characterise it as not
being aware of the language's stack consumption. I see it just as not
being familiar with the language. There is no need to speak of stacks,
just as there is no need to speak of the CPU's architecture, or of
electronics and electricity.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13plj4kq68iau15@corp.supernews.com>
Xah Lee wrote:
> Well, in my programing experiences since 1992...
>
> In our context of recursion error ...  You characterise it as not
> being aware of the language's stack consumption. I see it just as not
> being familiar with the language. There is no need to speak of stacks,
> just as there is no need to speak of the CPU's architecture, or
> electronics and electricity.

Are you familiar with $RecursionLimit in Mathematica?
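For readers who haven't met it: $RecursionLimit caps how deep Mathematica's evaluator will recurse before aborting, and most language runtimes expose an equivalent knob. A minimal sketch of the same idea in Python (the function and the limit value here are illustrative only, not from the thread):

```python
import sys

def depth(n):
    # naive recursion: each call consumes one call-stack frame
    if n == 0:
        return 0
    return 1 + depth(n - 1)

old_limit = sys.getrecursionlimit()
sys.setrecursionlimit(100)      # artificially small, like $RecursionLimit = 100
try:
    depth(10_000)               # far deeper than the limit allows
except RecursionError:
    print("recursion limit hit")
finally:
    sys.setrecursionlimit(old_limit)
```

The point of contention in the thread is exactly this: whether hitting such a limit can be understood without reference to the call stack behind it.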

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <91mnp3t598cpj4d1hovngo2m8bkkio9v38@4ax.com>
On Fri, 25 Jan 2008 20:36:34 -0800 (PST), Xah Lee <···@xahlee.org>
wrote:

>
>Jon Harrop wrote:
>«Exactly. You must be aware of stack consumption in C just as you must
>be aware of it in Mathematica. There is no difference.»
>
>Well, in my programing experiences since 1992, i don't know what a
>stack is other than its mathematical concept of “first in last out”, but
>i wrote a lot of programs. In my programing in Mathematica, Perl, Python,
>PHP, emacs lisp, SQL, i don't think there's a moment i thought i needed to
>understand stacks more than its mathematical concepts.
>
>I venture to say, more than 80% of today's professional programers
>don't know what a stack is other than a vague idea of its mathematical
>properties. (“professional programer” = people whose income is from
>coding.)

I'd bet money that 99.44% of "professional programmers" know what a
stack is (for certain somewhere there's an idiot who doesn't) - both
in the abstract and in the various realizations they routinely
encounter.  

I'd also bet money that 80% of _all_ programmers know about stacks.
Certainly anyone who ever studied computer architecture or basic
algorithms or who learned assembler or any of the common procedural
languages knows what a stack is.  Very few programmers know nothing of
how the underlying computer works and, IME, most _want_ to know.

IMO, to go a significant amount of time without having an
understanding of such a basic data structure, a programmer would have
to be living under a rock and I would not employ such an ignorant
person to do any sort of programming regardless of the language used.


>i think, once a programer has some expertise in compilers, then
>everything's a stack or some such mumbo jumbo. A language to specify
>computation does not require the concept, unless you are talking about
>manipulating hardware.

You have absolutely no idea what you are talking about.


>In our context of recursion error ...  You characterise it as not
>being aware of the language's stack consumption. I see it just as not
>being familiar with the language. There is no need to speak of stacks,
>just as there is no need to speak of the CPU's architecture, or
>electronics and electricity.

Do you also see pink elephants?  Languages do not compute - computers
compute.

George
--
for email reply remove "/" from address
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <078a3fe6-ba7d-4a71-8156-5b51429bf775@d21g2000prf.googlegroups.com>
Xah Lee wrote:

«Well, in my programing experiences since 1992, i don't know what a
stack is other than its mathematical concept of “first in last out”, but
i wrote a lot of programs. In my programing in Mathematica, Perl, Python,
PHP, emacs lisp, SQL, i don't think there's a moment i thought i needed to
understand stacks more than its mathematical concepts.»

«I venture to say, more than 80% of today's professional programers
don't know what a stack is other than a vague idea of its mathematical
properties. (“professional programer” = people whose income is from
coding.)»

Now, there are two idiots who go by the names of Tim X and George
Neuner, who challenged my statement about the number of pro coders
who understand stacks.

Pls excuse me for calling them idiots, because i can no longer hold
back from calling idiots idiots.

i dunno who they are, but the Tim X guy always sounds to me like a
college student, and the George Neuner guy always sounds to me like some
programing hobbyist who crawled out of the woodwork.

Nevertheless, the question is, to what degree professional
programers know the term stack. I claimed:

«I venture to say, more than 80% of today's professional programers
don't know what a stack is other than a vague idea of its mathematical
properties. (“professional programer” = people whose income is from
coding.)»

and the other 2 idiots claimed:

George: «I'd bet money that 99.44% of "professional programmers" know
what a stack is (for certain somewhere there's an idiot who doesn't) -
both in the abstract and in the various realizations they routinely
encounter.»

----------------------

it is my recommendation for the idiots to do their homework.

the subject here is: to what degree does a professional programer
understand the term Stack.

As usual in student affairs, first of all they need to find some
definitions. To prevent idiotic students from thinking Professional
Programer means their buddies and cronies or the morons who slave in
comp.lang.* newsgroups, i've given that “professional programer” means
people whose primary income is from coding.

Now, what is a stack? At what degree of understanding of the
word can we say they understand it or don't understand it? To clarify,
let me give this definition as the cutting line: A stack is the
mathematical abstraction of inserting ping pong balls into a pussy,
and the first inserted will come out last.

Now, a programer who understands stacks no more than the above, we'll
consider as not knowing what a stack is.
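Crude imagery aside, the behavior being defined is just first-in-last-out, which takes a few lines to demonstrate (a minimal Python sketch, not part of the original discussion):

```python
# A minimal sketch of the stack ADT: the first item pushed
# is the last one popped (first in, last out).
stack = []
for ball in ["first", "second", "third"]:
    stack.append(ball)                        # push

popped = [stack.pop() for _ in range(len(stack))]
print(popped)   # the ball pushed first comes out last
```

The thread's dispute is not about this abstraction, but about whether working programers also understand its concrete realization as the call stack.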

So, now, idiots like Tim and George should start their research. I
recommend spending at least 1 hour on the question of: what
percentage of pro programers actually understand stacks.

How to go about this research? That's a good question. It is
exactly what college education teaches you, but perhaps most of you
forgot. I can give you some guidance.

First of all, where do we start?

• visit a library. Libraries usually have a reference section that holds
all types of references and statistical info. Librarians are also
often helpful. If you are short on ideas, you can simply ask the
librarian to help you figure out where to start.

• use the web for research. Sure, web search is deemed un-academic,
but the fact is that it helps, and as time goes on, it is perhaps a
critical resource actually better than most organized or printed
resources. How to use web search for this question? I dunno. Perhaps
you can start by searching for the word “stack”, and see what common
things come out of programers' mouths about the word. You could
spend some time examining how programers discuss it. That should give
some clue as to how much they understand the term.

• if you work at a company, you can start asking your coworkers. Sure,
usually you probably don't want to ask your coworkers. It might
embarrass them and strain the relationship you perhaps already don't
have. But, because they are your coworkers, you perhaps already have a
good idea how much they know. Remember, everyone in your company's
tech department is a pro coder. So, you should consider the question
asked of all of them, not just _some_ of your cronies who you discuss
lisp with. Also, remember the question's population is all pro coders,
not just your company. So, if your company is one of the elite like
Apple, Google, Microsoft, Intel, AMD, Amazon, youtube, youporn, ebay...
or the Fortune 1000, then your sample is perhaps highly biased.

• in general, perhaps you need to take a course in descriptive
statistics. It will teach you, in general, all the techniques for
carrying out a survey for a statistical report. In the above, i only
gave some tips, which are totally not representative of a full course
on statistical research. So, if you forgot what college taught you
about surveying, perhaps in a stat course, or intro to science course,
or psychology or sociology course, you'll have to refresh it.

-----------------

As it so happens, George mentioned that he's willing to bet money on
it.

George wrote: «I'd also bet money that 80% of _all_ programmers [as
opposed to some non-professionals] know about stacks.»

So, George, after you've done your research, i recommend you
automatically send me $10 USD via paypal using
···@xahlee.org. ($10 is all i ask.) I am not actually interested in what
exactly you find out. I'm just interested in your honesty, and in
receiving the money due me.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <p5s3q358jrjor4l1fqou5vsbdfeor3vtd9@4ax.com>
On Thu, 31 Jan 2008 04:37:40 -0800 (PST), Xah Lee <···@xahlee.org>
wrote:

>Xah Lee wrote:
>
>«Well, in my programing experiences since 1992, i don't know what a
>stack is other than its mathematical concept of “first in last out”, but
>i wrote a lot of programs. In my programing in Mathematica, Perl, Python,
>PHP, emacs lisp, SQL, i don't think there's a moment i thought i needed to
>understand stacks more than its mathematical concepts.»

Few "professional" programmers code in Mathematica at all and almost
no one writes deliverable applications in it.  Where it is used -
which is mostly in scientific computing and signal processing - it is
a tool to figure out algorithms before the application is coded in a
real language that can process a real data set before entropy destroys
the universe.

Similarly, Perl is used mainly by IT for scripting - not a whole lot
for application programming.  And Emacs Lisp is used for quick hacks
and to write extensions for Emacs - AFAIK there are no independent
applications written in it.

Of the languages you list, only SQL, and of late PHP and Python, are
used by real programmers in writing real applications.  And of them,
only Python is a general purpose language.


>«I venture to say, more than 80% of today's professional programers
>don't know what a stack is other than a vague idea of its mathematical
>properties. (“professional programer” = people whose income is from
>coding.)»
>
>Now, there are two idiots who go by the names of Tim X and George
>Neuner, who challenged my statement about the number of pro coders
>who understand stacks.
>
>Pls excuse me for calling them idiots, because i can no longer hold
>back from calling idiots idiots.

There's no excuse for you.  I don't give a rat what you call me ...
you are still wrong.


>i dunno who they are, but the Tim X guy always sounds to me like a
>college student, and the George Neuner guy always sounds to me like some
>programing hobbyist who crawled out of the woodwork.

I don't know Tim's background.  

George began developing software in 1977.  George has MSCS specialized
in database implementation, BSCS with minors in EE and Mathematics,
and completed course work but not dissertations for two additional
Masters in OS and programming languages.  George worked through
college as a paralegal reviewing patents and patent applications, and
occasionally drafting new patent applications.  George knows well C,
C++, Java, Scheme, SQL, Pascal and several assembler languages; knows
enough Common Lisp, Quel and Modula3 to be considered dangerous; is
learning Haskell and has forgotten much of the BASIC, Fortran and SML
he once knew through lack of use.  George has been employed developing
commercial real time and high performance applications for 14 years.
George has designed algorithms for image processing and character
recognition.  George has designed operating software for a USFDA
approved medical diagnostic imaging system; designed operating
software for a DSP+FPGA DA/DP system; and collaborated on a
programming language, compiler and runtime for writing FPGA
accelerated applications.  

The Xah guy always sounds to me like a typical grad student - a self
appointed expert whose knowledge is miles deep and inches wide.


>Nevertheless, the question is, to what degree, professional
>programers, knows the term stack. I claimed:
>
>«I venture to say, more than 80% of today's professional programers
>don't know what a stack is other than a vague idea of its mathematical
>properties. (“professional programer” = people whose income is from
>coding.)»
>
>and the other 2 idiots claimed:
>
>George: «I'd bet money that 99.44% of "professional programmers" know
>what a stack is (for certain somewhere there's an idiot who doesn't) -
>both in the abstract and in the various realizations they routinely
>encounter.»

And I stand by it.  I've been to developer conferences and I've met,
corresponded with and argued on Usenet with hundreds of developers
from different application areas.  I've never encountered anyone who
seriously developed software and didn't have clue 1 about how the CPU
worked.  I've never actually asked anyone if they understood stacks
abstractly because it was already apparent that they understand them
concretely.


>----------------------
>
>it is my recommendation, for the idiots to do homework.

It's my recommendation that you get out of your Ivory Tower and get a
job in the real world.


>the subject here is: to what degree, a professional programer,
>understand the term Stack.

	:
<snipped - a bunch of meaningless drivel>
	:


>• if you work at a company, you can start asking your coworkers. Sure,
>usually you probably don't want to ask your coworkers. It might
>embarrass them and strain the relationship you perhaps already don't
>have. But, because they are your coworkers, you perhaps already have a
>good idea how much they know. Remember, everyone in your company's
>tech department is a pro coder. So, you should consider the question
>asked of all of them, not just _some_ of your cronies who you discuss
>lisp with. Also, remember the question's population is all pro coders,
>not just your company. So, if your company is one of the elite like
>Apple, Google, Microsoft, Intel, AMD, Amazon, youtube, youporn, ebay...
>or the Fortune 1000, then your sample is perhaps highly biased.

The first halfway intelligent idea here - take a poll.  But why not
include the whole programming world instead of just your self selected
ignorant corner of it?

I don't consider the majority of IT people to be software developers.
Only some are full time application developers - for most, programming
is a secondary byproduct of their primary administration work.  I
don't know offhand what percentage of IT works full time in
development but I would expect the number to be fairly low.

Including IT would only hurt your position because most IT people have
a better understanding of CPU hardware than you appear to - which
means a large percentage of them would likely know what a stack is.

I also don't consider Joe salesperson who knows a little Visual BASIC
and whips up little apps at lunchtime to be a "professional" software
developer.


>• 

<snipped - more meaningless drivel>


>-----------------
>
>As it so happens, George mentioned that he's willing to bet money on
>it.
>
>George wrote: «I'd also bet money that 80% of _all_ programmers [as
>opposed to some non-professionals] know about stacks.»
>
>So, George, after you've done your research, i recommend you
>automatically send me $10 USD via paypal using
>···@xahlee.org. ($10 is all i ask.) I am not actually interested in what
>exactly you find out. I'm just interested in your honesty, and in
>receiving the money due me.
>
>  Xah
>  ···@xahlee.org
>∑ http://xahlee.org/


You're obviously not very confident if all you want to bet is $10.

Even so I'm not your grad student and I won't do your scut work.  You
made the initial claim - you need to show your proof first and then I
get a chance to refute it.

Provide data to back up your claim that

  "more than 80% of today's professional programmers don't
   know what a stack is other than a vague idea of its
   mathematical properties"

or, equivalently, of the converse: that fewer than 20% of professional
programmers understand concretely what a stack is.

I nominate the members of these forums as peer reviewers.

George
--
for email reply remove "/" from address
From: Slobodan Blazeski
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <08731122-323b-47f9-bda0-bba7ffbd7c75@s13g2000prd.googlegroups.com>
On Jan 31, 9:10 pm, George Neuner <·········@/comcast.net> wrote:
>  George knows well C,
> C++, Java, Scheme, SQL, Pascal and several assembler languages; knows
> enough Common Lisp, Quel and Modula3 to be considered dangerous;
ROTFL

cheers
Slobodan
From: Marshall
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <cb630a50-fe0f-41df-9731-3a35e1e5e179@h11g2000prf.googlegroups.com>
On Jan 31, 12:10 pm, George Neuner <·········@/comcast.net> wrote:
>
> The Xah guy always sounds to me like a typical grad student - a self
> appointed expert whose knowledge is miles deep and inches wide.

Reading this paragraph got me to wondering, so I decided to check
his website to see if I could read about his background. I found this:

http://xahlee.org/PageTwo_dir/Personal_dir/xah.html

I have to say it explains a lot.


Marshall
From: John
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <4807b03d-eb05-4279-9a23-86fdb60ff7d8@e25g2000prg.googlegroups.com>
On Feb 1, 12:45 am, Marshall <···············@gmail.com> wrote:
> On Jan 31, 12:10 pm, George Neuner <·········@/comcast.net> wrote:
>
>
>
> > The Xah guy always sounds to me like a typical grad student - a self
> > appointed expert whose knowledge is miles deep and inches wide.
>
> Reading this paragraph got me to wondering, so I decided to check
> his website to see if I could read about his background. I found this:
>
> http://xahlee.org/PageTwo_dir/Personal_dir/xah.html
>
> I have to say it explains a lot.
>
> Marshall

Schizoid, homeless... I could have told you that. He's only about 30
miles from where I'm working... if I ever want to learn the uber-
language Mathematica, I guess I know where to go.
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <f459c9f1-8f31-4605-9e4d-6b2e281f6f19@v29g2000hsf.googlegroups.com>
John <··········@gmail.com> wrote:
«He's only about 30 miles from where I'm working... if I ever want to
learn the uber- language Mathematica, I guess I know where to go.»

I read here that there are lispmeister gatherings in the Bay Area? We
can arrange this. You treat me to free dinner, i teach and explain the
problems of high-level language design. My services can also include
entertainment in the form of writing.

If there are rich lispers here, you can sponsor me and i'll live in
your household. Like a mohist in China's Spring and Autumn period. You
give me food, i provide services of teaching and guidance. This can be
made into a 3 or 6 month contract. You'd be my feudal lord. (though
as part of the agreement, my freedom of expression, including
criticizing you at times for your own good, cannot be stifled.)

The George Neuner character has introduced himself. Who are you
though?

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄

On Jan 31, 10:57 pm, John <··········@gmail.com> wrote:
> On Feb 1, 12:45 am, Marshall <···············@gmail.com> wrote:
>
> > On Jan 31, 12:10 pm, George Neuner <·········@/comcast.net> wrote:
>
> > > TheXahguy always sounds to me like a typical grad student - a self
> > > appointed expert whose knowledge is miles deep and inches wide.
>
> > Reading this paragraph got me to wondering, so I decided to check
> > his website to see if I could read about his background. I found this:
>
> >http://xahlee.org/PageTwo_dir/Personal_dir/xah.html
>
> > I have to say it explains a lot.
>
> > Marshall
>
> Schizoid, homeless... I could have told you that. He's only about 30
> miles from where I'm working... if I ever want to learn the uber-
> language Mathematica, I guess I know where to go.
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <a9895338-f584-4680-b61b-031578972934@1g2000hsl.googlegroups.com>
dear Georgy boy,

did you know you are a dork?

Yes you are. You, and a few others here, including in particular that
Rainer Joswig character, always seem to me the biggest dorks in
comp.lang.lisp. (there are quite a lot here, but i don't want to make
this a listing) You are the most typical idiot in the IT industry, of
the kind i call tech geeking morons.

I don't know how to reply to your message. Where do i start?

Should i take a sincere approach, and reply to your post politely and
matter-of-factly, so as to actually make you see how your claims and
thinking are wrong? I thought about this and its consequences. In
general, i might win some admiration, but the reply will in general be
ignored (possibly with you replying with hints at some
reconciliation). Note that there are countless tech geeker morons
slaving in tech newsgroups before you and after you, such that
educating them on an individual basis is silly.

Should i take a flamboyant approach, and write a reply that proves your
idiocy in an unkind fashion, to satisfy my writing itch? As you
know, this is my usual style, but after a while the satisfaction wanes
too. (think of teaching lisp in comp.lang.perl. You'll be swarmed by an
army of antagonistic idiots; each of your points'd be shot down by 10
driveling posts.)

Should i maintain a kind and sincere approach to discussions in
newsgroups, so that i garner admirers and respect even with my unusual
ideas? I'm not sure. For one, it is not my style, and as you know,
hatred drives me. For two, i wish to change society, in part by
spreading the idea of truth and love sans clothing, and sans
diplomacy.

Am i out of my mind, to actually hope to get some quality discourse in
newsgroups? LOL. George, LOL.

Nice resume btw.

In this thread, we discussed the issue of how many professional
programers understand the word stack. You challenged me on this. In my
previous message, i gave some definitions to validate this question.
In particular, i gave a definition of what's meant by “professional
programer”, and what's meant by “understanding stacks”. In your reply,
you seem to have issues with my definition but aren't explicit about
it. Do you agree with my definition for this question we're gonna bet on?

I think $10 is good amount. It's small and fun.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <47a36d19$0$6336$607ed4bc@cv.net>
George Neuner wrote:
>>George: «I'd bet money that 99.44% of "professional programmers" know
>>what a stack is (for certain somewhere there's an idiot who doesn't) -
>>both in the abstract and in the various realizations they routinely
>>encounter.»
> 
> 
> And I stand by it.  I've been to developer conferences and I've met,
> corresponded with and argued on Usenet with hundreds of developers
> from different application areas.  I've never encountered anyone who
> seriously developed software and didn't have clue 1 about how the CPU
> worked. 

Things change dramatically when you get off the elevator on one of the 
five floors dedicated to IT at MegaCorp and you start polling the 
Dilberts of the world. I once drew a blank on the difference between 
compiled and interpreted, then got mocked for being a geek cuz I 
expected them to know. Their cubicle mate just wanted to talk about late 
night infomercials on how to get rich buying and selling real estate. 
And so it goes...

kenny
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <2be9q3he02nj962c8dkmpiv10qh3qqjqfh@4ax.com>
On Fri, 01 Feb 2008 14:03:51 -0500, Ken Tilton
<···········@optonline.net> wrote:

>
>
>George Neuner wrote:
>>>George: «I'd bet money that 99.44% of "professional programmers" know
>>>what a stack is (for certain somewhere there's an idiot who doesn't) -
>>>both in the abstract and in the various realizations they routinely
>>>encounter.»
>> 
>> 
>> And I stand by it.  I've been to developer conferences and I've met,
>> corresponded with and argued on Usenet with hundreds of developers
>> from different application areas.  I've never encountered anyone who
>> seriously developed software and didn't have clue 1 about how the CPU
>> worked. 
>
>Things change dramatically when you get off the elevator on one of the 
>five floors dedicated to IT at MegaCorp and you start polling the 
>Dilberts of the world. I once drew a blank on the difference between 
>compiled and interpreted, then got mocked for being a geek cuz I 
>expected them to know. Their cubicle mate just wanted to talk about late 
>night infomercials on how to get rich buying and selling real estate. 
>And so it goes...

I've had similar experiences.  But as I said to Xah, I don't consider
most IT people to be professional *programmers*.  Some do spend all
their time developing applications but most program only as part of
their job.

George
--
for email reply remove "/" from address
From: Marshall
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <7d4e2b95-8ca3-4c5f-a04f-f53697ccec0c@h11g2000prf.googlegroups.com>
On Jan 26, 5:39 pm, George Neuner <·········@/comcast.net> wrote:
>
> I'd bet money that 99.44% of "professional programmers" know what a
> stack is (for certain somewhere there's an idiot who doesn't) - both
> in the abstract and in the various realizations they routinely
> encounter.

Oh, sure. In 25+ years in the industry I don't think I've ever
encountered someone who didn't know what a stack was. At my last
company a number of the standard interview questions were in terms of
stacks.
The Java standard library had a Stack class in its first release. It's
a very common abstraction, and of course with C being as dominant
as it is ...


> I'd also bet money that 80% of _all_ programmers know about stacks.

I wonder what the percentage is for Forth programmers? :-)


Marshall
From: Andrew Reilly
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <60f1sdF1qc3vvU1@mid.individual.net>
On Thu, 31 Jan 2008 08:05:02 -0800, Marshall wrote:

> I wonder what the percentage is for Forth programmers?

Fortran and COBOL programmers (old-school, anyway), might bias the result 
very slightly.  There probably aren't any so exclusively focused any 
more, though...

-- 
Andrew
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <che9q3pi1oekui97hspvm0uk743f416dnp@4ax.com>
On 31 Jan 2008 22:54:05 GMT, Andrew Reilly
<···············@areilly.bpc-users.org> wrote:

>On Thu, 31 Jan 2008 08:05:02 -0800, Marshall wrote:
>
>> I wonder what the percentage is for Forth programmers?
>
>Fortran and COBOL programmers (old-school, anyway), might bias the result 
>very slightly.  There probably aren't any so exclusively focused any 
>more, though...

COBOL would certainly skew the results, but I'm not so sure about
Fortran.  Many old Fortran programs implement manual recursion with
arrays.  I wouldn't hazard a guess at the percentage, but I suspect
Fortran programmers were more knowledgeable than most people today
suspect.

George
--
for email reply remove "/" from address
From: Andrew Reilly
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <60kj7gF1r8hvjU1@mid.individual.net>
On Sat, 02 Feb 2008 13:54:03 -0500, George Neuner wrote:

> On 31 Jan 2008 22:54:05 GMT, Andrew Reilly
> <···············@areilly.bpc-users.org> wrote:
> 
>>On Thu, 31 Jan 2008 08:05:02 -0800, Marshall wrote:
>>
>>> I wonder what the percentage is for Forth programmers?
>>
>>Fortran and COBOL programmers (old-school, anyway), might bias the
>>result very slightly.  There probably aren't any so exclusively focused
>>any more, though...
> 
> COBOL would certainly skew the results, but I'm not so sure about
> Fortran.  Many old Fortran programs implement manual recursion with
> arrays.  I wouldn't hazard a guess at the percentage but I suspect more
> Fortran programmers were more knowledgable than most people today
> suspect.

Yes, you're right.  Maths/physics outlook, etc.  Many of the good matrix 
algorithms are intrinsically recursive, etc.  It was just the absence of 
recursion support in the language (and the corresponding absence of an 
argument stack in some implementations) that I was thinking of.  So make 
that "very" bold-italic in the case of Fortran...

Cheers,

-- 
Andrew
From: Tim X
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87wspvn86s.fsf@lion.rapttech.com.au>
Xah Lee <···@xahlee.org> writes:

> Jon Harrop wrote:
> «Exactly. You must be aware of stack consumption in C just as you must
> be aware of it in Mathematica. There is no difference.»
>
> Well, in my programing experiences since 1992, i don't know what a
> stack is other than its mathematical concept of “first in last out”, but
> i wrote a lot of programs. In my programing in Mathematica, Perl, Python,
> PHP, emacs lisp, SQL, i don't think there's a moment i thought i needed to
> understand stacks more than its mathematical concepts.
>

Then I would suggest your programming experience has been quite limited
and the problems you have worked on fairly trivial. 

> I venture to say, more than 80% of today's professional programers
> don't know what a stack is other than a vague idea of its mathematical
> properties. (“professional programer” = people whose income is from
> coding.)

A stack is one of the most basic of all the ADTs, and for a programmer
not to know it or never use it is ridiculous. However, I think that many
of the references in this thread refer to the call stack rather than the
more generic stack ADT. I would agree that, due to improved
hardware, issues associated with the call stack are less frequent than
they used to be. At the same time, I think programmers do need to know
what the call stack is and the role it plays, because an abstract
understanding of this makes it a lot easier to make the right decision
when implementing an algorithm. I also think it helps in understanding
other aspects of programming, such as scope, closures etc.

>
> i think, once a programer has some expertise in compilers, then
> everything's a stack or some such mumbo jumbo. A language to specify
> computation does not require the concept, unless you are talking about
> manipulating hardware.

Disagree. The stack and its role in a specific algorithm have an impact
on the way you will implement the algorithm. For example, in C you may
decide to implement a function using call by reference rather than call
by value because you are dealing with large data structures, and call by
value will result in a lot of memory allocation and increased stack
sizes that will make things slower and use more memory.
>
> In our context of recursion error ...  You characterise it as not
> being aware of the language's stack consumption. I see it just as not
> being familiar with the language. There is no need to speak of stacks,
> just as there is no need to speak of the CPU's architecture, or
> electronics and electricity.
>

How would you explain the language's limitation on recursion without
reference to the stack? The bottom line: the stack is the underlying
cause, and you can't understand the cause without reference to it. You
could just find out that the language is limited in some way wrt
recursion, but how would you understand what this limitation is and
where else it may have an impact without having some understanding of
the stack and how it works? When you run into the recursion limitation,
how will you know what solution to apply in order to get around the
problem if you don't understand the underlying cause? For example, how
do you understand how tail recursion and recursion optimisation work if
you don't understand the role of the stack?
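The practical consequence can be made concrete with a hypothetical sketch (Python here, not a language from the thread): the same tree sum written naively, where each call burns one call-stack frame and a deep enough input raises an error, and rewritten with an explicit stack on the heap, where the call-stack limit no longer applies:

```python
def sum_tree_recursive(node):
    # node is (value, list_of_children); each call uses one call-stack
    # frame, so a very deep tree overflows the recursion limit
    value, children = node
    return value + sum(sum_tree_recursive(c) for c in children)

def sum_tree_iterative(node):
    # same computation with an explicit stack kept on the heap:
    # depth is bounded by available memory, not by the call stack
    total, stack = 0, [node]
    while stack:
        value, children = stack.pop()
        total += value
        stack.extend(children)
    return total
```

Knowing why the first version fails on a 50,000-deep tree (and the second doesn't) is exactly the kind of call-stack understanding under discussion.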

But why do you have such an issue with mentioning the stack? It is the
reason the language has the limitation. I also disagree with your
assertion that you don't need to know about the CPU. Assertions like
these only hold true if you are doing very trivial programming. If you
are doing something less trivial, then it is essential information.

A simple example. You are writing a program and have to work out the
algorithm for a specific part. You need to make a choice regarding the
basic data structure to use. There are two possible choices, both with
their own pros and cons. One is to use a hash and the other is to use
some form of tree. The pros and cons relating to the different
properties of a hash and a tree are fairly balanced given the current
problem, so we look at other issues that may have an impact. One such
issue is the efficiency of accessing the different data types. In some
cases, there may be large differences in this area due to the CPU and
the type and size of cache it has. It is quite possible that the hash
may be more efficient because of a higher likelihood that subsequent
lookups are already in the cache. The tree structure, on the other hand,
is slower because of a higher number of cache misses: the memory layout
of the tree means that subsequent accesses are less likely to be in the
cache, and disk seek time adds to the overhead, etc. Knowing this
information about the CPU, cache, RAM and disk, I might decide to still
use a tree structure, but I may implement it in such a way that I have
an increased likelihood of subsequent accesses being found in the cache.
For example, maybe I will implement it so that it's actually built on
top of an array and therefore all of the tree nodes are in a contiguous
block of memory rather than being scattered through memory on different
pages, or maybe I will grab a single block of memory and map the tree
onto that.
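The array-backed layout mentioned above is the standard implicit-tree trick, sketched here in Python. (The contiguity argument properly applies to a C array; a Python list holds object pointers, but the indexing scheme is the same one you would use over a single malloc'd block.)

```python
# Implicit binary tree in one contiguous block: the node at index i
# has its children at 2*i + 1 and 2*i + 2, so no child pointers are
# stored and all nodes sit in a single allocation, in level order.
tree = [1, 2, 3, 4, 5, 6, 7]   # root, its two children, their children

def left(i):  return 2 * i + 1
def right(i): return 2 * i + 2

assert tree[left(0)] == 2 and tree[right(0)] == 3
assert tree[left(left(0))] == 4          # root -> left -> left
```

Because a whole subtree's nodes are near each other in the array, walking down the tree touches adjacent memory, which is exactly the cache-friendliness argument made above.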

-- 
tcross (at) rapttech dot com dot au
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <293638d8-f5b9-4ab6-bd4e-0b7236daa94c@i3g2000hsf.googlegroups.com>
Jon Harrop wrote:
«The Mathematica language specification does not specify the
complexities of most core operations so it is not possible to work out
the complexity of a given function. So it doesn't matter how well you
know the language, you still cannot ascertain how long your function
will take to run.»

Interesting.

well... but what could be some examples? For example, for complex math
functions such as Integrate or Solve, you can't know the time behavior
in advance. And for simpler things like Sort, basically one can assume
it has the time behavior of the current state of knowledge.

thinking about this... i don't remember Perl, Python, PHP, emacs lisp,
or Javascript's manuals ever talking about a function's time complexity.
(maybe once or twice)

(For Integrate, Solve, or other complex functions, one could in
principle know the time complexity since there's the source code, but
my guess is that when the function is sufficiently complex, it's
impractical to actually know. I mean, say a function that is thousands
of lines or calls other libs ... when was the last time you actually
did an analysis to know a big subroutine's complexity? When you use a
complex regex in perl, do you actually do mathematical analysis just so
you know what your function's time complexity is?

Basically, if you coded it, you deal with the complexity issue at the
time of creating an algorithm; then, after that, if the program runs
well on any reasonable tests, it's good to go. That's how software is
practically developed.)

Do you want to give more detail about this lacking complexity spec
issue in Mathematica?

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13pljrthrnfho16@corp.supernews.com>
Xah Lee wrote:
> Interesting.
> 
> well... but what could be some examples?

Sure. The last time I was helping someone to debug a mysterious 100x
performance degradation in their Mathematica code it turned out to be a
numerical routine calling Fourier that was underflowing machine-precision
floats. When Mathematica notices this it silently moves to
arbitrary-precision floats throughout the FFT and everything grinds to a
halt.

When I implemented this commercial product:

  http://www.ffconsultancy.com/products/CWT/

part of the internals required an intricate understanding of exactly how and
when the Mathematica implementation chooses to reevaluate large arrays.
When you're dealing with a gigabyte of data that is an overwhelming concern and
the product could never have worked without this knowledge. When I worked
at Wolfram Research I described this to them and it turned out they were
not even aware of those performance characteristics themselves.

> For example, for complex math 
> functions such as Integrate or Solve, you can't know the time behavior
> in advance.

Sure.

> And for simpler things like Sort, basically one can assume
> it has the time behavior of the current state of knowledge.

Yes.

Consider:

  First2[h1_, h2_, t___] := {h1, h2}

What is the asymptotic complexity of that function? The last time I looked,
Mathematica eagerly copied the entire array "t" only to dispose of it
immediately. So that function is O(n) when you would expect it to be O(1).
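For what it's worth, the same trap exists outside Mathematica. A Python analogue (hypothetical function name, mirroring Jon's First2): the rest-arguments are packed into a fresh tuple before being thrown away, so the call does O(n) work even though only two elements are used.

```python
def first2(h1, h2, *t):
    # Python materialises *t as a new tuple on every call, so this
    # O(1)-looking function actually costs O(n) in the argument count,
    # analogous to the Mathematica behavior described above.
    return (h1, h2)

assert first2(*range(1000)) == (0, 1)
```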

> thinking about this... i don't remember Perl, Python, PHP, emacs lisp,
> Javascript's manuals ever talk about a function's time complexity.
> (maybe once or twice)

The C++ STL does. Our products certainly do:

  http://www.ffconsultancy.com/products/signal_processing_net/

> (For Integrate, Solve, or other complex functions, one should know the
> time complexity of course since there's the source code, but my guess
> is that when the function is sufficiently complex, it's impractical to
> actually know. I mean, say a function that are thousands of lines or
> calls other libs ... when is the last time you actually do analysis to
> know a big subroutine's complexity?

Yes, I do that all the time. You have to do that to write production-quality
code.

> When you use a complex regex in 
> perl, do you actually do mathematical analysis just so you know what's
> your function's time complexity?

I've never used regular expressions in performance-critical code but I use
pattern matching daily and I absolutely know my functions' time
complexities when its relevant, yes.

> Basically, if you coded it, you deal with the complexity issue at the
> time of creating a algorithm, then after that if the program runs well
> of any reasonable tests, it's good to go. That's how software are
> practically developed.)

When you say "deal with the complexity issue" you mean "optimize the code".
You can't optimize code if you can't predict performance, i.e. you must
know the performance characteristics of the language.

This is well known. Look at the literature on why ML tried to avoid
non-linear patterns, for example.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <386a2aa2-6867-4343-9370-54a83b951479@u10g2000prn.googlegroups.com>
Dear Jon Harrop,

You say that the Mathematica lang has a problem of not specifying its
functions' time complexity.

My response was that it is in general unnecessary and impractical to
document a function's complexity. For simple functions, the complexity
is well known. For complex functions, such as Simplify, Solve,
Integrate, it is impractical to give a time complexity. I also
illustrated the general problem with the example of perl code that uses
a complex regex.

Unless your program particularly needs to scale, such as a webserver or
a massively multiplayer online role-playing game, in general i think
the time complexity is not a theoretical concern but a practical issue.
You just run the program with as much input as is realistic, and if it
works in reasonable time, it's fine.

You retorted with several points. I attempt a summary here:

• How Mathematica's automatic arbitrary-precision feature slowed down a
programer's code by orders of magnitude, when he didn't expect it to
kick in.

• Your own experiences and your work, which are scientific computation
on top of Mathematica, where you cite that you needed better detail
about different constructs' time complexity and it was frustrating to
find out in Mathematica.

Jon wrote:
«
  First2[h1_, h2_, t___] := {h1, h2}

What is the asymptotic complexity of that function? The last time I
looked,
Mathematica eagerly copied the entire array "t" only to dispose of it
immediately. So that function is O(n) when you would expect it to be
O(1).
»

Any language, especially in constructs of the type of pattern matching,
will always be inefficient on some particular examples, i assert.

In fact, this is probably true for any construct, not just pattern
matching. A good example is tail recursion: a conformant Scheme
compiler is required to optimize tail calls so they run in constant
stack space, but most other compilers do not optimize them that way.
Especially not in imperative languages.

So, that is one simple, common example, where something that should be
O(1) in time or space becomes O(n) or worse in a particular
implementation.

Do you understand this?

If you understand this example, then you can guess that there are quite
a lot of other examples where a simple construct that in theory should
cost O(1) turns out to be an order more when you use a particular
implementation. As an exercise, can you come up with some examples, in
different languages, for this? (we already have tail recursion in
Scheme/Lisp and Mathematica pattern matching as examples)

Further, i assert, in pattern matching it is perhaps theoretically
impossible to create a compiler that will always reach the theoretical
optimal time complexity for all possible constructs.

In general, this symptom grows with the complexity of the function
itself. For example, Mathematica's pattern matching, or functions
Solve, Simplify, Integrate, or other mathematical functions. In other
areas, we can conceive of complex recursions (say, a lang that focuses
on recursive computation), parsers, compilers, AI solving functions,
image recognition ... etc.

Jesus, that just reminded me... What about the Halting Problem? Jon,
do you think a lang should specify its time complexity?

-------------------------

It is true that when you build large programs in Mathematica, in
general it is slow. And you cannot control or tweak much of it as you
can with languages like Haskell, Lisp, C, Java.

But as i said, that's the nature of high-level-ness. As hardware
resources grow exponentially, inevitably langs are just getting higher
and higher level. Witness Perl, Python, Ruby, PHP. Witness C, C++,
Java.

A lang with garbage collection saves human time, but you lose the
ability to tweak memory. A lang with high-level hash tables (e.g. Perl,
Python, PHP) saves you the time to implement them yourself, but you
lose control over fine details such as when to increase the number of
slots. (Emacs lisp provides an intermediate here)

So Mathematica, containing over a thousand advanced functions and
algorithms, is designed to save your time; it is not designed for you
to tweak, or to insist on a particular time complexity.

-------------------------------

The question here is how it can be possible to create a lang that's
both high-level and yet fully speedy...

I started to write about this in response to your previous message. I
already posted 2 replies and am still writing the 3rd and/or 4th,
addressing different issues in your message, but you already shot back.
The complete content of one of them is this: “Are you familiar with
$RecursionLimit in Mathematica?”.

LOL. Jon, you made me LOL.

------------------------------------

> When you say "deal with the complexity issue" you mean "optimize the code".
> You can't optimize code if you can't predict performance, i.e. you must
> know the performance characteristics of the language.

No, i don't mean optimize the code. I mean, when you write a function,
you have to pick or work out an algorithm. At that time, if you are a
good programer, you would have some idea of your time/resource
complexity.

> This is well known. Look at the literature on why ML tried to avoid
> non-linear patterns, for example.

I have time on the newsgroup to spar with you. I don't have time to
read whatever fucking paper a joe brandishes. Tell me i'm beautiful,
and carefully and humbly explain the detail of this, and provide a good
url, then i'll consider it.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Tim X
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <873asjoo7k.fsf@lion.rapttech.com.au>
Xah Lee <···@xahlee.org> writes:

> Jon Harrop wrote:
> «The Mathematica language specification does not specify the
> complexities of most core operations so it is not possible to work out
> the complexity of a given function. So it doesn't matter how well you
> know the language, you still cannot ascertain how long your function
> will take to run.»
>
> Interesting.
>
> well... but what could be some examples? For example, for complex math
> functions such as Integrate or Solve, you can't know the time behavior
> in advance. And for simpler things like Sort, basically one can assume
> it has the time behavior of the current state of knowledge.
>

I don't think that's true. Different languages implement routines in
different ways and have different orders of complexity. You cannot
assume a language has used the most efficient algorithm - you may hope
they have, but you cannot guarantee it. There have been many times I've
found the sort routine of a language to be poorly implemented or based
on an 'easy' implementation rather than one that was efficient.

> thinking about this... i don't remember Perl, Python, PHP, emacs lisp,
> Javascript's manuals ever talk about a function's time complexity.
> (maybe once or twice)
>
> (For Integrate, Solve, or other complex functions, one should know the
> time complexity of course since there's the source code, but my guess
> is that when the function is sufficiently complex, it's impractical to
> actually know. I mean, say a function that are thousands of lines or
> calls other libs ... when is the last time you actually do analysis to
> know a big subroutine's complexity? When you use a complex regex in
> perl, do you actually do mathematical analysis just so you know what's
> your function's time complexity?

Agreed, you may not know the complexity or efficiency of something prior
to using it. However, if you find the performance is not sufficient, you
will need to work out or at least estimate that complexity in order to
know how to optimise it. Without this critical bit of information, your
optimisation is going to be hit and miss.

>
> Basically, if you coded it, you deal with the complexity issue at the
> time of creating a algorithm, then after that if the program runs well
> of any reasonable tests, it's good to go. That's how software are
> practically developed.)
>

And if it isn't, you examine the algorithm and work out how to reduce
its time/space complexity to make it more efficient. 

I don't disagree with the premise that programming languages should, as
far as possible, remove the need for the programmer to know about many
of the low-level implementation aspects. However, I don't agree with
your argument that discussions and investigation/analysis of things
like types, call stacks, the memory heap etc. are irrelevant and just
'tech geek jargon and rubbish'. It is exploration, experimentation and
theoretical discussion/debate in these areas that extend our
understanding and may, in time, allow us to develop programming
languages that further reduce or remove the need for the programmer to
know about the lower-level aspects of the language they are using.

There is an analogy here with other occupations. For example, years ago
when I used to do a lot of woodwork/carpentry, I needed to know about
the various properties and characteristics of different types of wood.
It wasn't sufficient for me to just know how to use wood working tools
well. Likewise, to be a good programmer, it's not sufficient just to
know how to use your language tool in an abstract sense. You need to
know about the platform you are working on to understand how the
language has been implemented on that platform. The type of CPU, its
registers, instruction set, cache, amount of RAM etc. can all impact
the performance of the program. Without this information, you cannot
know the best optimisation strategies.

Having said all of that, I should also point out that these days the
vast majority of programs people are writing rarely push the boundaries
of modern platforms. When I first started programming, CPU speeds and
available RAM were slow/small enough that it wasn't uncommon to have to
do things like unroll loops and code them in assembler. C was very
popular at this time because you could more easily get down and dirty
and write very efficient code using pointer arithmetic and customized
memory management that was tweaked to better suit the specific
requirements or usage pattern of the application you were writing.

These days, this isn't as necessary. This is partly due to improvements
in hardware, but also due to improvements in compiler design and
optimization - improvements that have largely been achieved through
research, debate and experience with all that tech geek stuff you seem
to think was only introduced to try and make the whole field somehow
elusive or exclusive and mystify things for the non-initiated. While in
some areas there has been a tendency to define new terms and concepts
when existing ones would have been sufficient, there are also plenty of
areas where no existing terminology was sufficient to clearly
describe/explain a concept.

Of course, the easy way to prove your point would be to actually
develop a language that avoids all the unnecessary terms and concepts
which seem to frustrate you. If you're right, the language will take
off and revolutionise the field.

Tim



-- 
tcross (at) rapttech dot com dot au
From: vanekl
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <fndsn6$o8i$1@aioe.org>
Jon Harrop wrote:
[snip]
> In the context of compilation, Mathematica is basically in the same boat as
> Lisp. If you want to run general purpose code efficiently then you've got a
> real uphill struggle on your hands. There are some gallant attempts for
> Lisp, like SBCL, but they will never compete with statically typed
> languages because acquiring the information the compiler needs in order to
> optimize the code completely erodes useful features like dynamic linking,
> separate compilation and so forth. Furthermore, I find static typing an
> essential tool for writing non-trivial programs correctly.


These over-generalizations used to apply. Now, not so much.

Whenever I had a really sticky problem, I would first work out the algorithm
in Lisp and then port it to C/C++ so I could get that last bit of
efficiency that was sometimes required. This technique served me well until two
weeks ago. To my astonishment, the SBCL version ran faster than the C++ -O3 executable.
The most impressive thing was, I wasn't even trying to optimize the
Lisp code (there were zero type declarations), yet the Lisp code consistently outran
the C++/STL code by a significant margin on the same machine. (I/O did not play a
major role.) This type of behavior may not happen every time for every program, but
it's certainly evident to me that CL is no longer sitting at the children's table.

My congratulations to the SBCL team. Even if this just means that C++/STL is
getting more bloated rather than that CL is getting faster, in either event,
it was an eye opener and very much appreciated.

L. Vanek
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13piffvrtlaej58@corp.supernews.com>
Kaz Kylheku wrote:
> According to you, anyway. If you understand Mathematica as well as you
> do Lisp, then I'd take all that with a grain of salt. Maybe there is a
> way to quote List[1, 2, 3] to get it as data, only you haven't figured
> it out.

Xah's point is that List[1, 2, 3] is already data and does not need to be
quoted.

>> This is a high-level beauty of Mathematica.
> 
> Not giving the programmers access to the data structure which
> represents the program source is simply a myopic limitation. A myopic
> limitation doesn't give rise to a higher level language.

Mathematica only gives you access to program source.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Pascal Costanza
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <5vtvssF1nvck1U1@mid.individual.net>
Xah Lee wrote:

> The Mathematica language is very similar to lisp. From a Mathematica
> expert's point of view (me), the first thing that seems odd of lisp is
> the low-level-ness.

It's not so much about being low level, but rather about being able to
change the programming language from within a program itself. This
requires revealing implementation details of the language and providing
hooks to actually make effective changes.

If you believe that it's possible to come up with the 'right'
programming language(s) that don't require tweaking anymore, then this
probably appears as gratuitous to you. However, if you believe that an
essential part of solving a problem in a certain domain is to come up
with a domain-specific language, then this isn't so far off anymore.

Lisp was one of the first languages that enabled such tweaks in a more
systematic way. Yes, there were mistakes made in early Lisp dialects,
and some of those mistakes still live on in current dialects (including
Common Lisp and Scheme). But it is also clear that it is very hard to do
substantially better than Lisp. Some of the oddities in Lisp actually
seem to be necessary to be able to perform certain kinds of tweaks.

> Perhaps in order of the low level impression:
> 
> 1. The cons business. (lists are made up of cons cells and it is
> necessary for the programer to understand this to write any non-
> trivial lisp program)
> 
> 2. No highlevel treatment of lists. (In Mathematica, there are a
> system of functions to extract or manipulate (nested) lists considered
> as trees. e.g. getting all elements at a level, or arbitrary set of
> nodes of a tree. All these based on a tree index concept. (i.e. first
> branch's second branch's third's branch will be 1,2,3.) In lisp,
> programers uses cons, car, cdr, caadr, etc. That is bizarre and
> downright fucking stupid.)

This is an example of providing low-level details so that you can change
the behavior at a higher level. If you provide a list abstraction that
doesn't let you access low-level details, then you're stuck with the
design choices and API that the original designers have decided, and
can't do much about them (except by starting from scratch).

> 3. The syntax is a bit quirky. In particular, a mathematica programer
> sees that sometimes a list is written as “(list a b c)”, but sometimes
> there's this oddity “'(a b c)” (which is syntactically equivalent to
> “(quote (a b c))”). And, when mapping a function, sometimes the
> programer also needs to put the apostrophe in front. (A Mathematica
> programer would think, if a function (e.g. “map”) takes another
> function as argument, why would the function require the programer to
> put the apostrophe? why couldn't the function itself be designed to
> not evaluate that argument.)
> 
> 4. A behind-the-scenes model of computation. Namely, lisp the language
> deals with the concept of “lisp objects”, and there's a “print syntax”
> that represents these objects. (and a “read syntax” that reads code
> into these objects)

Yes, it's an acknowledged issue in Lisp that has its drawbacks, but at
the same time gives you important expressive power: By being able to
refer to runtime objects in source code directly you can actually write
programs about such objects (aka metaprogramming and reflection).

Brian Smith attempted to solve the problems with Lisp in 2-Lisp - see
his work on reflection and 3-Lisp, for which 2-Lisp was an intermediate
step. There are also two nice overview papers: "Reflection in logic,
functional and object-oriented programming: a Short Comparative Study"
by Demers and Malenfant, and "M-LISP: A Representation-Independent
Dialect of LISP with Reduction Semantics" by Muller.

I think 2-Lisp is pretty close to a good solution (but not
substantially better enough to warrant changing all existing Lisp and
Scheme source code to match the 2-Lisp model). M-LISP seems closer to
what you would actually prefer, I think (but I haven't checked in
detail).



Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb04-005@yahoo.com>
> From: Xah Lee <····@xahlee.org>
> The cons business. (lists are made up of cons cells and it is
> necessary for the programer to understand this to write any non-
> trivial lisp program)

IMO the way to write a program is to design an abstract data type
then to use it. Only the person hand-coding the implementation of
that ADT need worry about low-level details such as cons cells.
Most of the time the programmer can use mapping functions (for
lists, or for association lists, or for property lists, or for hash
tables, or for various binary trees such as AVL trees) and not
really care that these various structures are really built of cons
cells. But the fact that structures (except hash tables and arrays)
are directly built out of cons cells allows different objects to
share common structure. For example, large binary trees can share
all but the changes, thereby allowing update of a tree
non-destructively to take log(n) time, keeping both old and new
around simultaneously taking only log(n) extra space beyond a
single copy.
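The structure sharing Robert describes can be sketched as a non-destructive binary-search-tree insert (a minimal Python sketch; Lisp cons-based trees share in exactly the same way):

```python
class Node:
    __slots__ = ("key", "left", "right")
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(node, key):
    # Non-destructive insert: allocate new nodes only along the search
    # path (O(log n) of them in a balanced tree), sharing every
    # untouched subtree with the old tree.
    if node is None:
        return Node(key)
    if key < node.key:
        return Node(node.key, insert(node.left, key), node.right)
    return Node(node.key, node.left, insert(node.right, key))

old = insert(insert(insert(None, 5), 2), 8)
new = insert(old, 9)            # descends right; left subtree untouched
assert new.left is old.left     # shared structure, not a copy
assert old.right.right is None  # the old tree is unchanged
```

Both versions of the tree remain usable afterwards, which is the "keeping both old and new around" property mentioned above.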

> No highlevel treatment of lists. (In Mathematica, there are a
> system of functions to extract or manipulate (nested) lists
> considered as trees. e.g. getting all elements at a level, or
> arbitrary set of nodes of a tree. All these based on a tree index
> concept. (i.e. first branch's second branch's third's branch will
> be 1,2,3.) In lisp, programers uses cons, car, cdr, caadr, etc.

Why 1,2,3 instead of 0,1,2? That point is too petty to fuss over.
Otherwise, you're just plain wrong. Look up the function NTH,
which indexes down a list by a given number.
Then imagine how trivial it is to write a function that uses a list
of successive indexes to specify the element number at successive
levels within a list, calling NTH at each level.
You can even offset by one if you really really need that.
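The trivial function Robert describes might look like this (a hypothetical `nth_path`, modeled on nested lists; zero-indexed like NTH, so add an offset of one if you want Mathematica-style indices):

```python
def nth(lst, n):
    # Like Common Lisp's NTH: zero-indexed element of a list.
    return lst[n]

def nth_path(tree, path):
    # Descend into a nested list level by level, calling nth at each
    # step, as sketched above.
    for i in path:
        tree = nth(tree, i)
    return tree

tree = ["a", ["b", ["c", "d"]], "e"]
assert nth_path(tree, [1, 1, 0]) == "c"
```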
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <b5391b9b-6eb0-4a30-a446-e8a5b8dd0401@k39g2000hsf.googlegroups.com>
Robert Maas, wrote:

> IMO the way to write a program is to design an abstract data type
> then to use it. Only the person hand-coding the implementation of
> that ADT need worry about low-level details such as cons cells.

Exactly. The problem with lisp's cons business is that it surfaces at
the language level in a way that a programer cannot ignore.

i.e. a lisp programer cannot practically code in lisp using
higher-level list abstractions and functions without understanding the
cons business. Because, in practice, the use of cons is common. You
will encounter its use in just about every lisp program. And, due to
the cons business at the language level, there is the problem of
“proper list” and “improper list”. So, you can't pretend that cons
doesn't exist and use only high-level list functions. They won't work.
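The proper-vs-improper distinction can be made concrete with a two-field pair (a minimal Python model of cons cells, with invented helper names; a generic Lisp list function walking a dotted list runs into the same problem):

```python
def cons(a, d): return (a, d)
def car(p): return p[0]
def cdr(p): return p[1]

proper   = cons(1, cons(2, cons(3, None)))  # ends in the empty list
improper = cons(1, cons(2, 3))              # ends in a "dotted" pair

def length(lst):
    # High-level list functions assume a proper list; an improper list
    # breaks them, which is the point made above.
    n = 0
    while lst is not None:
        if not isinstance(lst, tuple):
            raise TypeError("improper list")
        n += 1
        lst = cdr(lst)
    return n

assert length(proper) == 3
try:
    length(improper)
    broke = False
except TypeError:
    broke = True
assert broke
```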

> ...
> Otherwise, you're just plain wrong.

Don't be a moron. Thank me for educating you. I'm not trying to be
rude. I'm trying to be truthful. This post is an education for you and
your cohorts. (no insult)

Using garish remarks to try to get me to reply or answer a question,
usually won't work with me.

> Look up the function NTH,
> which indexes down a list by a given number.
> Then imagine how trivial it is to write a function that uses a list
> of successive indexes to specify the element number at successive
> levels within a list, calling NTH at each level.

The issue here is not whether it is easy to code it.

First of all, as i gave the reason above, the cons business in the
language stops any possibility of a uniform interface of
list-manipulating functions over the underlying cons cells.

Secondly, there is a big practical benefit to having functions
built-in, even if they are trivial to code. (but the fact is, those
list-manipulating functions that exist in Mathematica are actually Not
that trivial to code.)

For Common Lispers to easily see the benefit of built-ins, just think
of the oft-made criticism of Scheme by Common Lispers. Yeah, Scheme
programers can fairly easily code any of the built-in functions that
exist in CL out of the box, but then every programer will have to code
them, and we see different versions and incompatibilities... etc.

Also, the availability of a large number of built-in functions has a
tremendous impact on the ease of coding in the language. When you
don't have the set A, it may be that it's trivial to create A by
coding it yourself. But if you have A built in, it's trivial to build
set B. But without A built in, B is now a bit difficult. The gist is
that you can't build a 2nd story without a 1st story.

Mathematica comes with a thousand (or a few thousand) built-in
functions (and high-quality ones, not just a web collection of Joes
like Perl's CPAN). Similarly Java. And, to a lesser degree, Python and
PHP. These large numbers of built-in functions contribute to their
success.

(note: “built in” may be in the form of a lang function or a library;
different langs use different terminology for this. e.g. regex is
“built-in” in perl as part of the lang, but in Python it's a default
module; both are considered built in here. In java many things are
considered “packages” even though they are there out of the box. The
gist of the notion “built in” here is in the sense of ready-to-use out
of the box)

As i mentioned before, for a language to be popular and practically
useful, a huge amount of built-in functionality is one of the main
contributors. The language features typically talked about by computer
scientist morons, such as all the details and variations of types,
inference, higher-level/first-class functions, currying, closures,
multiple inheritance, dispatch, lazy evaluation, pattern matching, tail
recursion, call-with-continuation, ... and many other jargons, really
have _little_ significance to daily practice.

Every studious moron with a computer science degree can create a
language (like that of Larry Wall, Guido van Rossum, and now Paul
Graham). But to create a high-quality set of a huge number of libraries
(as in Java, Mathematica) is extremely difficult. (suppose you have a
super-natural power, that instantaneously you can suck all knowledge of
any 2 greatest computer scientists into your brain. So, let's suppose
you possess the knowledge of Donald Knuth and, say, any other lang
creator. By yourself, in your lifetime, you can probably only produce 5
or so high-quality libraries specific to a few problems. You would need
a team of tens of people, each an expert in a particular field, working
for years, to create anything like the built-ins of Java or
Mathematica.)

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄




On Feb 4, 10:37 am, ·······@yahoo.com (Robert Maas, see http://tinyurl.com/uh3t)
wrote:
> > From:XahLee<····@xahlee.org>
> > The cons business. (lists are made up of cons cells and it is
> > necessary for the programer to understand this to write any non-
> > trivial lisp program)
>
> IMO the way to write a program is to design an abstract data type
> then to use it. Only the person hand-coding the implementation of
> that ADT need worry about low-level details such as cons cells.
> Most of the time the programmer can use mapping functions (for
> lists, or for association lists, or for property lists, or for hash
> tables, or for various binary trees such as AVL trees) and not
> really care that these various structures are really built of cons
> cells. But the fact that structures (except hash tables and arrays)
> are directly built out of cons cells allows different objects to
> share common structure. For example, large binary trees can share
> all but the changes, thereby allowing update of a tree
> non-destructively to take log(n) time, keeping both old and new
> around simultaneously taking only log(n) extra space beyond a
> single copy.
>
> > No highlevel treatment of lists. (In Mathematica, there are a
> > system of functions to extract or manipulate (nested) lists
> > considered as trees. e.g. getting all elements at a level, or
> > arbitrary set of nodes of a tree. All these based on a tree index
> > concept. (i.e. first branch's second branch's third's branch will
> > be 1,2,3.) In lisp, programers uses cons, car, cdr, caadr, etc.
>
> Why 1,2,3 instead of 0,1,2? That point is too petty to fuss over.
> Otherwise, you're just plain wrong. Look up the fuction NTH,
> which indexes down a list by a given number.
> Then imagine how trivial it is to write a function that uses a list
> of successive indexes to specify the element number at successive
> levels within a list, calling NTH at each level.
> You can even offset by one if you really really need that.
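
The approach Robert describes — a list of successive indexes fed to
NTH level by level — can be sketched in a few lines of Common Lisp
(the helper name TREE-NTH is made up here for illustration):

```lisp
;; Index into a nested list by a list of successive (zero-based)
;; positions, calling NTH once per level.
(defun tree-nth (indexes tree)
  (if (null indexes)
      tree
      (tree-nth (rest indexes) (nth (first indexes) tree))))

;; (tree-nth '(1 2 0) '(a (b c (d e)) f))  =>  D
```

Subtracting 1 from each index before calling NTH would give the
Mathematica-style 1-based convention.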
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13qjtgd970v6q55@corp.supernews.com>
Xah Lee wrote:
> Mathematica comes with a thousand (or few thousands) of built in
> functions (and high-quality ones, not just web collection of Joes like
> Perl's CPAN).

That is not my experience:

During my PhD I needed to convolve two "lists" and I found a serious bug in
Mathematica's ListConvolve function where it was silently destroying its
input (my data!). Had Mathematica been an open source tool I could have
fixed it myself but, instead, I had to explain the problem and solution to
the guys at WRI and wait for a bug fix release to become available and then
pay for it. Now I can write a convolution in one line of OCaml code that is
not only correct but is 4x faster than Mathematica's thanks to FFTW.

I also wanted to use Mathematica's Compile function to improve the
performance of a function but discovered that it hung on recursive
functions (!). Mathematica 6 addresses this problem by rejecting all
recursive definitions. Can you imagine if Lisp's EVAL hung on or rejected
recursive functions?!

My PhD was largely on wavelets so I obtained Wolfram Research's own ($595!)
WaveletExplorer add-on only to discover that they deal solely with discrete
wavelets and completely ignore continuous wavelets. I suggested they change
the name to DiscreteWaveletExplorer and commercialized my own Mathematica
library as an alternative:

  http://www.ffconsultancy.com/products/CWT/?xl

Overall I was a happy Mathematica user and willing to pay £80 for a student
license at the time but I would definitely not say that its standard
library is higher quality than the next language's.

The last useful functionality of Mathematica for me was graphics but Smoke
is orders of magnitude faster, built upon a better language and completely
free:

  http://www.ffconsultancy.com/products/smoke_vector_graphics/?xl

> As i mentioned before, for a language to be popular and practically
> useful, a huge number of built-in functionality is one of the main
> contribution.

Mathematica is not really a popular language: it is about as popular as
OCaml or Haskell and nothing like as popular as Java, C++ or even MATLAB.

Also, I needed textbook data structures and algorithms during my PhD and
found that Mathematica not only lacked them but could not be used to
implement them efficiently. Given that an efficient set union is impossible
for Mathematica users, for example, I would not say that its standard
library is comprehensive even in the context of mathematics, let alone
general-purpose computing.

> Language's features typically talked about by computer 
> scientist morons, such as all the details and variations of types,
> inferences, higher-level/first-class function, curry, closure, multi-
> inheritance, dispatch, lazy evaluation, pattern matching, tail
> recursion, closure, call-with-continuation, ... and many other
> jargons, really have _little_ significance to daily practice.

Putting everything into the standard library is only useful if you do not
have a package manager.

For example, Debian users can get an FFT implementation that is much faster
and more accurate than Mathematica's with:

  apt-get install fftw3-dev

> By yourself in your life time you can probably only produce 5
> or so high-quality libraries specific to few problems. You will need a
> team of tens of people, each a expert in particular field, working for
> years, to create anything like the built-in's of Java or Mathematica.)

LAPACK and FFTW cover the needs of 99% of Mathematica users. All you need is
a language that provides a decent interface to them and there are dozens of
such languages freely available for Linux.

The only aspect of Mathematica that is far more advanced than anything else
I've ever seen is its GUI "notebook" interface.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <c421aae2-776b-4759-bd3d-ae4b3a381b9f@c4g2000hsg.googlegroups.com>
Xah Lee wrote:
«Mathematica comes with a thousand (or few thousands) of built in
functions (and high-quality ones, not just web collection of Joes like
Perl's CPAN).»


Jon Harrop wrote:
«That is not my experience: ...»

Jon, you are just picking bones. You do this to lisp, to haskell. You
are widely accused of being a “troll” because of behavior like this.
(so am i labeled a troll)

Now, from what i see, you do provide interesting facts and details,
and in general do NOT behave like the XYZ lang fanatics that slave in
comp.lang.XYZ. After all, you are knowledgeable (you know a few
functional langs and have published books on them), so i like your
posts and reply to them.

------------------------------------

Jon wrote:
> During my PhD I needed to convolve two "lists" and I found a serious bug in
> Mathematica's ListConvolve function where it was silently destroying its
> input (my data!). Had Mathematica been an open source tool I could have
> fixed it myself but, instead, I had to explain the problem and solution to
> the guys at WRI and wait for a bug fix release to become available and then
> pay for it. Now I can write a convolution in one line of OCaml code that is
> not only correct but is 4x faster than Mathematica's thanks to FFTW.
>
> I also wanted to use Mathematica's Compile function to improve the
> performance of a function but discovered that it hung on recursive
> functions (!). Mathematica 6 addresses this problem by rejecting all
> recursive definitions.

I don't know whether your claims are true. But i assume they are.
These are bugs, Jon. Common Lisp has bugs, Haskell has bugs; i see
them discussed in newsgroup posts. Perl has bugs. Javascript has bugs.
Mathematica has bugs. PHP has bugs. Python has bugs. I know concrete
examples of these personally.

Here's the passage i wrote, and you quoted:

«Mathematica comes with a thousand (or few thousands) of built in
functions (and high-quality ones, not just web collection of Joes like
Perl's CPAN).»

Do you mean to say that, because of the problems you encountered above,
you refute my assertion, in my original message's context, that in
general Mathematica comes with a few thousand built-in functions/
libraries that are high quality?



> My PhD was largely on wavelets so I obtained Wolfram Research's own ($595!)
> WaveletExplorer add-on only to discover that they deal solely with discrete
> wavelets and completely ignore continuous wavelets. I suggested they change
> the name to DiscreteWaveletExplorer and commercialized my own Mathematica
> library as an alternative:
>
>  http://www.ffconsultancy.com/products/CWT/?xl

Maybe if you kissed people's asses in your attitude, they might be
selling your package by now. This is the key to business, Jon.

> The last useful functionality of Mathematica for me was graphics but Smoke
> is orders of magnitude faster, built upon a better language and completely
> free:
>
>  http://www.ffconsultancy.com/products/smoke_vector_graphics/?xl

I looked at your site. It seems Smoke is a 2D vector graphics lib.
2D! Jesus, 2D graphics libs are a dime a dozen today. There's Flash,
and there are interactive geometry programs, literally dozens of them,
free. There are also easy-to-use dedicated langs for 2D graphics/
animation. Have a whiff of my website:

http://xahlee.org/PageTwo_dir/MathPrograms_dir/plane_geometry.html

-----------------

Since we share an interest in graphics programing, i have a question.
I'm looking for an easy-to-use 3D library that allows me to do live
rotation and animation. Any suggestions?


Here's the article describing features i like:

Requirements For A Visualization Software System For 2020
http://xahlee.org/3d/viz.html

Here's my recent blog about how painful it is:

Graphics Programing Pains
http://xahlee.org/3d/graphics_programing_pain.html

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13qkj6ter3g1pb2@corp.supernews.com>
Xah Lee wrote:
> «Mathematica comes with a thousand (or few thousands) of built in
> functions (and high-quality ones, not just web collection of Joes like
> Perl's CPAN).»
> 
> Do you mean to say, that because the problems you encountered above,
> you refute my assertion in my original message's context, that in
> general Mathematica comes with few thousands built-in functions/
> libraries that are high quality?

Yes. I used Mathematica for many years and uncovered many serious bugs. I've
used OCaml for longer and uncovered no serious bugs at all. So I would say
that Mathematica does indeed have a large standard library, as you say, but
it is not high-quality.

>> My PhD was largely on wavelets so I obtained Wolfram Research's own
>> ($595!) WaveletExplorer add-on only to discover that they deal solely
>> with discrete wavelets and completely ignore continuous wavelets. I
>> suggested they change the name to DiscreteWaveletExplorer and
>> commercialized my own Mathematica library as an alternative:
>>
>>  http://www.ffconsultancy.com/products/CWT/?xl
> 
> Maybe if you kissed people's ass in your attitudes, they might sell
> your package by now. This is the key to business Jon.

I offered to sell the package to WRI for £10k but they weren't interested.
Mathematica has so few users (e.g. compared to MATLAB) that Mathematica
libraries are not commercially viable. I've contacted several other vendors
and they all drew the same conclusion: Mathematica libraries just don't
sell.

In contrast, I've started writing C# and F# libraries and they not only sell
comparatively well but there is a lot of low-hanging fruit on .NET.

>> The last useful functionality of Mathematica for me was graphics but
>> Smoke is orders of magnitude faster, built upon a better language and
>> completely free:
>>
>>  http://www.ffconsultancy.com/products/smoke_vector_graphics/?xl
> 
> I looked at the your site. It seems smoke is a 2D vector graphics lib.
> 2D! Jesus, 2D graphics libs are a dime a dozen today. There's flash,
> and there's interactive geometry softwares, literally dozens of them,
> free. There are also easy to use dedicated langs for 2D graphics/
> animation. Have a whiff of my website:
> 
> http://xahlee.org/PageTwo_dir/MathPrograms_dir/plane_geometry.html

None of them support high-performance rendering at arbitrary zoom as Smoke
already does. Try replicating our tiger demo in any of those other systems,
for example.

> Since we share interest in graphics programing. I have a question. I'm
> looking for easy-to-use, 3D library that allows me to do live rotation
> and animation. Any suggestion you have?

F# for Visualization is the gold standard:

  http://www.ffconsultancy.com/products/fsharp_for_visualization/?xl

;-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb09-005@yahoo.com>
> From: Xah Lee <····@xahlee.org>
> Jon Harrop wrote ... Jon, you are just picking bones.

I side with Jon on this. Not supporting recursion is a *killer* in
any modern programming language. It's just not acceptable to
silently not have it working, then upon discovery simply disallow
it!! It wouldn't even have been acceptable if it had been openly
and honestly disallowed from the start, although at least that
would have been less despicable.

> I don't know whether your claims are true. But i assume they are.
> These are bugs, jon. Common Lisp has bugs, Haskell has bugs, i
> see them discussed in newsgroup posts. Perl has bugs. Javascript
> has bugs. Mathematica has bugs. PHP has bugs. Python has bugs. I
> knew concrete examples of these personally.

There are degrees of bugs:
- Mathematica is totally missing recursion. It's a total loser.
- CMUCL (on FreeBSD Unix) has only two or three bugs I ever discovered:
  - (sleep <noninteger.geq.1>) doesn't sleep at all. It was
     simple for me to write a patch to obtain correct behaviour:
     (defun cl-sleep (sec)
       (cond ((and (not (integerp sec)) (>= sec 1.0))
              (multiple-value-bind (int frac) (floor sec)
                (sleep int) (sleep frac)))
             (t (sleep sec))))
  - FORMAT with ~newline to ignore until first non-white on next
     line didn't work, so I wrote an ugly workaround to delete all
     such ~newline+white sequences before passing the format
     string, but that was fixed in new version a few years ago so I
     can call FORMAT directly now.
  - format strings in [C]ERROR had the same problem, fixed at the same time
- Both XLisp and PowerLisp are so grossly deficient and/or broken
   that they aren't worth using. I haven't yet found a free CL for
   Macintosh that's really usable.
- PHP doesn't have integers, only floats. That seems to me an awful deficiency.
- I haven't used Perl or JavaScript enough to know how serious
   their bugs may be. Would somebody offer a brief ballpark
   estimate+example?

> Maybe if you kissed people's ass in your attitudes, they might
> sell your package by now. This is the key to business Jon.

That may be a fact of the business world, but that's no way to
write software now in the days of InterNet. Better to do it the
Amazon.com way of book/product reviews or newsgroup equivalent.
What's the best newsgroup for discussing the relative merits of
computer-mathematics systems? comp.soft-sys.math isn't active.
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <02f074da-fdc5-41b6-8263-84c4778e0b40@e6g2000prf.googlegroups.com>
Robert Maas wrote:
«I side with Jon on this. Not supporting recursion is a *killer* in
any modern programming language. It's just not acceptable to silently
not have it working, then upon discovery simply disallow it!! It
wouldn't even have been acceptable if it had been openly and honestly
disallowed from the start, although at least that would have been less
despicable.»

Mathematica doesn't support recursion?

Dear Robert, your messages are degenerating into stupidity.
Mathematica supports recursion far better than any of your fucking
Common Lisp, Scheme, Haskell, or any motherfucking advanced functional
langs.

Motherfucking stupid morons slaving in newsgroups. Fuck you.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t599vdl8ut4oq5@pandora.alfanett.no>
On Sun, 10 Feb 2008 00:21:54 +0100, Robert Maas, see
http://tinyurl.com/uh3t <·······@yahoo.com> wrote:

>> From: Xah Lee <····@xahlee.org>
>> Jon Harrop wrote ... Jon, you are just picking bones.
>
> I side with Jon on this. Not supporting recursion is a *killer* in
> any modern programming language. It's just not acceptable to
> silently not have it working, then upon discovery simply disallow
> it!! It wouldn't even have been acceptable if it had been openly
> and honestly disallowed from the start, although at least that
> would have been less despicable.
>

I think you misunderstand here. Mathematica does have recursion.
The problem comes when you try to compile the code.
But compiling code in Mathematica is unusual.
It involves wrapping the code in a Compile[code...] expression.
Mathematica doesn't have functions in the traditional sense.
It matches the left side of the expression for the closest match using
unification and pattern matching.
This feature is essential for parsing algebras etc. But as it isn't a
simple function, it confuses Compile.
Compile is primarily meant to speed up numerical calculations.


--------------
John Thingstad
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb11-005@yahoo.com>
> From: "John Thingstad" <·······@online.no>
> Mathematica does have recursion.
> The problem comes when you try to compile the code.
> But compiling code in Mathematica is unusual.

Somebody said if the code isn't compiled, it runs a thousand times
slower. For any major program component, it would seem essential to
compile it to give anywhere near decent speed. How would we feel
about Lisp if the compiler didn't work on anything recursive, for
example if READ always had to be run interpreted? Would Lispers
tolerate that??

> Compile is primarily meant to speed up numerical calculations.

So if you define a function that performs some sort of butterfly to
efficiently perform arithmetic such as modular exponentiation on
very large integers, and of course it's recursive, you lose???

Does Mathematica provide support for (unix) inter-process streams
(pipes), so that you could have a Lisp sitting alongside it to
perform calculations that require recursion?
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r3j0aa5q9cr5e@corp.supernews.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>> From: "John Thingstad" <·······@online.no>
>> Mathematica does have recursion.
>> The problem comes when you try to compile the code.
>> But compiling code in Mathematica is unusual.
> 
> Somebody said if the code isn't compiled, it runs a thousand times
> slower. For any major program component, it would seem essential to
> compile it to give anywhere near decent speed. How would we feel
> about Lisp if the compiler didn't work on anything recusrive, for
> example READ had to always be run interpreted? Would Lispers
> tolerate that??

Exactly.

>> Compile is primarily meant to speed up numerical calculations.
> 
> So if you define a function that performs some sort of butterfly to
> efficiently perform arithmetic such as modular exponentiation on
> very large integers, and of course it's recursive, you lose???

You might be able to manually translate it into a loop. Also, compiling code
changes its evaluation semantics in Mathematica and, in particular, you
lose access to all arbitrary-precision stuff.

> Does Mathematica provide support for (unix) inter-process streams
> (pipes), so that you could have a Lisp sitting alongside it to
> perform calcuations that required recursion?

Yes. Mathematica also has .NET Link which allows you to write programs
in .NET languages that can use and be used by Mathematica programs. I
provided an example in my forthcoming book "F# for Scientists" of a small
F# program that evaluates Mathematica expressions several times faster than
Mathematica does.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Joachim Durchholz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <1202676191.6879.6.camel@kurier>
Robert Maas, see http://tinyurl.com/uh3t:
> - PHP doesn't have integers, only floats.

PHP has lots of bugs and design problems, but it does have an integer
type.

Regards,
Jo
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb11-006@yahoo.com>
> From: Joachim Durchholz <····@durchholz.org>
> PHP has lots of bugs and design problems, but it does have an
> integer type.

Hmm, I missed that. Thanks for the correction.
Could you give me PHP code example snippets of how to:
- Assign a variable from an integer literal of 60 digits, such as:
    438924389065827658792346589265897236589716598723658743265872
- Parse a string which contains a sequence of such digits to yield
   an integer value, and assign a variable from that value
If integers can't be of arbitrary size in PHP, how large can they be?
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t6elhvvxut4oq5@pandora.alfanett.no>
On Tue, 12 Feb 2008 08:46:03 +0100, Robert Maas, see
http://tinyurl.com/uh3t <·······@yahoo.com> wrote:

>> From: Joachim Durchholz <····@durchholz.org>
>> PHP has lots of bugs and design problems, but it does have an
>> integer type.
>
> Hmm, I missed that. Thanks for correction.
> Could you give me PHP code example snippets of how to:
> - Assign a variable from an integer literal of 60 digits, such as:
>     438924389065827658792346589265897236589716598723658743265872

C integers. Use a PHP bignum extension (BCMath or GMP) for this sort of thing.

> If integers can't be of arbitrary size in PHP, how large can they be?

The size of an integer is platform-dependent, although a maximum value of  
about two billion is the usual value (that’s 32 bits signed).
PHP does not support unsigned integers.

> - Parse a string which contains a sequence of such digits to yield
>    an integer value, and assign a variable from that value

same name as the C function (hint) - sscanf
list($month, $day, $year) = sscanf($mandate, "%s %d %d")

--------------
John Thingstad
From: lisp linux
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <ouCdnW4sOplS8yfanZ2dnUVZ_o-mnZ2d@comcast.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>> From: Joachim Durchholz <····@durchholz.org>
>> PHP has lots of bugs and design problems, but it does have an
>> integer type.
> 
> Hmm, I missed that. Thanks for correction.
> Could you give me PHP code example snippets of how to:
> - Assign a variable from an integer literal of 60 digits, such as:
>     438924389065827658792346589265897236589716598723658743265872
> - Parse a string which contains a sequence of such digits to yield
>    an integer value, and assign a variable from that value
> If integers can't be of arbitrary size in PHP, how large can they be?
Not entirely related to the post being replied to.

I thought this may be useful info to share, for comparison with PHP or just for fun.
From the PHP manual:
http://www.php.net/manual/en/language.types.integer.php
quote
Integer overflow

If you specify a number beyond the bounds of the integer type,
it will be interpreted as a float instead. Also, if you perform
an operation that results in a number beyond the bounds of the
integer type, a float will be returned instead.

endquote

I actually ran into this *feature* with some PHP code that looked like a literal
translation of C code. But the C code depended on the overflow (for some
key-distribution-across-buckets kind of stuff). The PHP code failed in a way
that was hard to find. This was some memcached client written in PHP.
I forget the details.

BTW, of the languages I have worked with, PHP seems to be unique in this *feature*.
Anyone know of other examples?

-Antony
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13rkuuheq07jg6e@corp.supernews.com>
lisp linux wrote:
> BTW, of the languages I have worked with, php seems to be unique in this
> *feature* Anyone know of other examples ?

Nearest I can think of is Mathematica promoting underflowed floats to
arbitrary precision. Everything was already approximate so it doesn't break
code, but it does make it 100x slower for no discernible reason. :-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t538w5u2ut4oq5@pandora.alfanett.no>
>
> The only aspect of Mathematica that is far more advanced that anything  
> else
> I've ever seen is its GUI "notebook" interface.
>

Well said. That is my experience too.

--------------
John Thingstad
From: John
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <c35c112a-af36-461d-8860-d366476cba95@l1g2000hsa.googlegroups.com>
On Feb 6, 10:42 am, "John Thingstad" <·······@online.no> wrote:
> > The only aspect of Mathematica that is far more advanced that anything
> > else
> > I've ever seen is its GUI "notebook" interface.
>
> Well said. That is my experience too.
>
> --------------
> John Thingstad

Apparently it's really great if you can't handle the concept of a
linked list of pointers, those confusing cons thingies ;)
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb09-004@yahoo.com>
> From: Jon Harrop <······@jdh30.plus.com>
> During my PhD I needed to convolve two "lists" and I found a
> serious bug in Mathematica's ListConvolve function where it was
> silently destroying its input (my data!).

Does the documentation clearly say it's a destructive operation? If
not, I agree it's a *horrible* bug, so that function needs to be
avoided (until the bug is fixed). But ...

> I also wanted to use Mathematica's Compile function to improve
> the performance of a function but discovered that it hung on
> recursive functions (!). Mathematica 6 addresses this problem by
> rejecting all recursive definitions. Can you imagine if Lisp's EVAL
> hung on or rejected recursive functions?!

IMO that's a showstopper, a complete language rejector! The only
language I ever used that didn't support recursion was FORTRAN
("with format" and "2d") circa 1964-67 and possibly also FORTRAN IV
a few years later. The reason was that it stored the return address in
an impure cell contiguous with the function instead of on a separate
stack, so each function had only one such place to hold *its* *own*
return address. (First *-* for emphasis, like bold. Second *-* to
denote the computer science jargon "own variables", which are privately
accessible but statically bound.) FORTRAN IV on VM/CMS (as recently as
1991 when I last worked at Stanford) had a similar problem: IBM
linkage, used by FORTRAN, has all registers saved and restored by
means of a block of memory, one such block per function. (Note that
even if a function actually changes only a couple registers, all
sixteen of them are saved and restored, for simplicity, not
efficiency.)

> My PhD was largely on wavelets so I obtained Wolfram Research's
> own ($595!) WaveletExplorer add-on only to discover that they deal
> solely with discrete wavelets and completely ignore continuous
> wavelets.

Why didn't you ask around on newsgroups for consumer feedback, like
the book reviews on Amazon.com, before making such a significant
purchase, for which you'd need a requisition rather than taking it
(shaking it?) out of the petty cash jar (piggy bank?)?

> Also, I needed textbook data structures and algorithms during my
> PhD and found that Mathematica not only lacked them but could not
> be used to implement them efficiently. Given that an efficient set
> union is impossible for Mathematica users, for example, I would not
> say that its standard library is comprehensive even in the context
> of mathematics, let alone general-purpose computing.

If what you say is true, that's pretty awful. What low-level
primitive is missing such that you can't even efficiently roll your
own? What do you classify as "efficient" for set union? Do you have
in mind using lists (with duplicates already eliminated) for
emulating each set, and a hash table for registering all the
elements in each set, and thereby being able to efficiently detect
which elements are in both sets (for example by mapping down the
elements of the smaller set checking if they are also in the
hashtable of the larger set), hence need to be put just once in the
union?
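
That hash-table scheme can be sketched in Common Lisp (an illustrative
helper, HASH-UNION, is made up here; it assumes both inputs are
duplicate-free lists):

```lisp
;; Union of two duplicate-free lists in roughly O(m+n) time:
;; register the larger list's elements in a hash table, then keep
;; only those elements of the smaller list not already present.
(defun hash-union (a b &key (test #'equal))
  (when (< (length a) (length b))
    (rotatef a b))                       ; make A the larger list
  (let ((seen (make-hash-table :test test)))
    (dolist (x a) (setf (gethash x seen) t))
    (append (remove-if (lambda (x) (gethash x seen)) b) a)))

;; (hash-union '(1 2 3) '(3 4))  =>  (4 1 2 3)
```

The point is that nothing beyond hash tables and lists is needed to
roll this yourself, which is why their absence or inefficiency in a
language is so limiting.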

> Putting everything into the standard library is only useful if
> you do not have a package manager.

That's a good point. Actually the word "package" doesn't mean the
same thing in different languages, so more needs to be said here.
First, there are two primary meanings of the term:
- A namespace (that's what "package" means in Common Lisp, and at a
   higher level what it means in Java, where a package is the next
   level above a class but a class is really the namespace of
   interest except when doing import package.*)
- A single file of utilities to be loaded

In languages such as C which have only one namespace, the second
meaning is used. #include "stdio.h" causes a single file to be
included in the source, causing (usually) a single compiled file to
be loaded at load time. A "package manager" for C is nothing more
than the collection of header files to pick and choose for inclusion in
source, and corresponding documentation so that application
programmers know which header, hence library, to include.

In languages like Lisp, where there are many packages, but there's
a USER package for developing new code, and programmers do most
development work piecemeal within a single invocation of the
runtime environment, loading source files or even individual
functions one by one, there's a "package manager" within the core
language to deal with keeping track of which packages are loaded
and what symbols are in each package, but also another kind of
"package manager" consisting of whatever method (such as asdf, or
undefined-function-autoload-hook) is used to load individual files
as needed.

I take it from what you say that Mathematica has only one namespace
and no way to incrementally load new files as it's discovered
that some function/method within them is needed for the task in
progress (triggered by encountering an undefined function) or even
the next task to start (invoked by the user manually)?
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13qtkj4pq1n3k4f@corp.supernews.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>> From: Jon Harrop <······@jdh30.plus.com>
>> During my PhD I needed to convolve two "lists" and I found a
>> serious bug in Mathematica's ListConvolve function where it was
>> silently destroying its input (my data!).
> 
> Does the documentation clearly say it's a destructive operation? If
> not, I agree it's a *horrible* bug, so that function needs to be
> avoided (until the bug is fixed). But ...

Exactly. The documentation explicitly states that it is a non-destructive
operation but it was destroying its input.

>> I also wanted to use Mathematica's Compile function to improve
>> the performance of a function but discovered that it hung on
>> recursive functions (!). Mathematica 6 addresses this problem by
>> rejecting all recursive definitions. Can you imagine if Lisp's EVAL
>> hung on or rejected recursive functions?!
> 
> IMO that's a showstopper, a complete language rejector! The only
> language I ever used that didn't support recursion was FORTRAN
> ("with format" and "2d") circa 1964-67 and possibly also FORTRAN IV
> a few years later. The reason was that it stored the return address in
> an impure cell contiguous with the function instead of on a separate
> stack, so each function had only one such place to hold *its* *own*
> return address. (First *-* for emphasis, like bold. Second *-* to
> denote computer science jargon "own variables" which are privately
> accessible but static bound.) FORTRAN IV on VM/CMS (as recently as
> 1991 when I last worked at Stanford) had a similar problem: IBM
> linkage, used by FORTRAN, has all registers saved and restored by
> means of a block of memory, one such block per function. (Note that
> even if a function actually changes only a couple registers, all
> sixteen of them are saved and restored, for simplicity, not
> efficiency.)

To be fair, this only affects compiled anonymous functions. Mathematica can
still interpret recursive functions but it is very slow (~1,000x slower
than a compiled language in general).

>> My PhD was largely on wavelets so I obtained Wolfram Research's
>> own ($595!) WaveletExplorer add-on only to discover that they deal
>> solely with discrete wavelets and completely ignore continuous
>> wavelets.
> 
> Why didn't you ask around on newsgroups for consumer feedback, like
> the book reviews on Amazon.com, before making such a significant
> purchase for which you'd need a requisition rather than take
> (shake?) out of the petty cash jar (piggy bank?)?

Probably wouldn't have done much good, as it happens. Computer scientists
seem to be blissfully unaware of continuous wavelets and often teach and
learn that only discrete wavelets are significant.

>> Also, I needed textbook data structures and algorithms during my
>> PhD and found that Mathematica not only lacked them but could not
>> be used to implement them efficiently. Given that an efficient set
>> union is impossible for Mathematica users, for example, I would not
>> say that its standard library is comprehensive even in the context
>> of mathematics, let alone general-purpose computing.
> 
> If what you say is true, that's pretty awful. What low-level
> primitive is missing such that you can't even efficiently roll your
> own?

Compilation. :-)

I ported my ray tracer benchmark to Mathematica for a laugh and it runs 30x
slower than OCaml's interpreted bytecode, let alone its optimizing
native-code compiler.

> What do you classify as "efficient" for set union? Do you have 
> in mind using lists (with duplicates already eliminated) for
> emulating each set, and a hash table for registering all the
> elements in each set, and thereby being able to efficiently detect
> which elements are in both sets (for example by mapping down the
> elements of the smaller set checking if they are also in the
> hashtable of the larger set), hence need to be put just once in the
> union?

I would copy the OCaml implementation and use immutable balanced binary
trees.
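
The idea behind that approach, sketched here in Python rather than OCaml (and with the balancing logic omitted for brevity, so this is an unbalanced BST, not Jon's actual implementation): every "update" allocates new nodes only along the search path and shares every untouched subtree, so the old tree remains valid.

```python
# Persistent (immutable) binary search tree sketch. Balancing is
# omitted, so worst-case depth is O(n) rather than O(log n); the point
# is the structure sharing, not the balance scheme.

class Node:
    __slots__ = ("left", "key", "right")
    def __init__(self, left, key, right):
        self.left, self.key, self.right = left, key, right

def insert(t, k):
    """Return a NEW tree containing k; the old tree t is untouched."""
    if t is None:
        return Node(None, k, None)
    if k < t.key:
        return Node(insert(t.left, k), t.key, t.right)   # share t.right
    if k > t.key:
        return Node(t.left, t.key, insert(t.right, k))   # share t.left
    return t  # key already present: share the whole tree

def union(a, b):
    """Fold the elements of b into a, sharing unchanged structure."""
    if b is None:
        return a
    return union(insert(union(a, b.left), b.key), b.right)

def to_list(t):
    """In-order traversal, i.e. the set's elements in sorted order."""
    return [] if t is None else to_list(t.left) + [t.key] + to_list(t.right)
```

Because `insert` never mutates, a caller can keep the pre-union trees around for free, which is exactly the property the "backup for rollback" discussion later in this thread relies on.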

> I take it from what you say that Mathematica has only one name
> space and no way to incrementally load new files as it's discovered
> that some function/method within them is needed for the task in
> progress (triggered by encountering an undefined function) or even
> the next task to start (invoked by the user manually)?

Mathematica does have something like that: hierarchical namespaces and it
can recognise packages when they're stored in the correct location on disk.
That's how our wavelet transform package integrates neatly into
Mathematica. Look at our examples:

  http://www.ffconsultancy.com/products/CWT/HTML/tutorial.html

However, Mathematica lacks a decent number of third-party packages (Debian
has ~100,000 packages but Mathematica has more like 100) and a repository
for them.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb15-001@yahoo.com>
> >> ... Mathematica 6 addresses this problem by
> >> rejecting all recursive definitions. ...

OK, I took this to mean that Mathematica 6 rejects all recursive
definitions, absolutely, not just its compiler (which had the *bug*
in earlier versions), presumably so that interpreted and compiled
behaviour will be consistent. On that basis I wrote:

> > IMO that's a showstopper, a complete language rejector!

> From: Jon Harrop <······@jdh30.plus.com>
> To be fair, this only affects compiled anonymous functions.
> Mathematica can still interpret recursive functions but it is
> very slow (~1,000x slower than a compiled language in general).

Are you talking about the earlier versions that merely had a bug,
or the new major-version 6 which "rejects all recursive
definitions" in the words of the other poster?

> >> My PhD was largely on wavelets so I obtained Wolfram Research's
> >> own ($595!) WaveletExplorer add-on only to discover that they deal
> >> solely with discrete wavelets and completely ignore continuous
> >> wavelets.
> > Why didn't you ask around on newsgroups for consumer feedback, like
> > the book reviews on Amazon.com, before making such a significant
> > purchase for which you'd need a requisition rather than take
> > (shake?) out of the petty cash jar (piggy bank?)?
> Probably wouldn't have done much good, as it happens. Computer
> scientists seem to be blissfully unaware of continuous wavelets and
> often teach and learn that only discrete wavelets are significant.

Well, with nearly six hundred dollars to spend, you should at least
have *tried* to get some useful info prior to purchase. Try both
newsgroups or other public sources (has anybody purchased Wolfram
Research's WaveletExplorer, and if so have you checked whether it
handles continuous wavelets), and asking sales representatives at
Wolfram Research itself (get something in writing, even if by
e-mail, that says it does indeed handle continuous wavelets as an
informal guarantee that it'll be of use to you). Since you got
burned, have you tried to get your money refunded?

> >> Also, I needed textbook data structures and algorithms during my
> >> PhD and found that Mathematica not only lacked them but could not
> >> be used to implement them efficiently. Given that an efficient set
> >> union is impossible for Mathematica users, for example, I would not
> >> say that its standard library is comprehensive even in the context
> >> of mathematics, let alone general-purpose computing.
> > If what you say is true, that's pretty awful. What low-level
> > primitive is missing such that you can't even efficiently roll your
> > own?
> Compilation. :-)
> I ported my ray tracer benchmark to Mathematica for a laugh and
> it runs 30x slower than OCaml's interpreted bytecode, let alone its
> optimizing native-code compiler.

I take it your benchmark is recursive? Is there any reason you
couldn't have re-written it slightly to emulate stacks by linked
lists, thereby avoiding recursion, so that you could have gotten it
compiled?

Note that there are multiple levels of interpreting. References to
Lisp, Java, and BASIC are useful here:
- Source-text-level interpreting, as with the original BASIC
   interpreter for the 8080 CPU: The raw text is re-parsed every
   time a line of code is re-executed. The only things that help at
   all are that every line of code starts with a keyword such as
   LET or GOSUB etc. allowing dispatching to special-purpose
   parsers for each type of statement, every variable name is a
   single letter, optionally followed by a type-suffix character
   (such as $ for strings), and there is no syntax extending beyond
   single lines of text.
- Parse-tree interpreting, such as what was done by Lisp EVAL
   (hence within the Read-Eval-Print loop, all code called by it
   except anything that was previously compiled) before the advent
   of JIT (Just In Time) compilation by the REP loop. Parsing was
   done just once at READ time. Function definitions as well as
   toplevel forms were all parse trees (CONS-cell nested lists)
   rather than raw text strings.
- Byte-code dispatching/coding, such as the JVM (Java Virtual
   Machine), where a rather efficient direct-dispatching mechanism
   dominates the "interpreting" algorithm. On some architectures,
   byte-code software actually runs faster than machine code
   because both the commonly used parts of the byte interpreter and
   the byte-code version of the active parts of the application
   reside in *fast*cache* most of the time, reducing the number of
   times main-memory (RAM) needs to be used. This works because
   byte-code is much more compact than machine code, because byte
   code needs to handle only the primitives of that one high-level
   language rather than also all the alternatives provided by the
   design team of the CPU itself.
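
The dispatch mechanism that third bullet describes can be sketched in a few lines. This is a toy stack machine with made-up opcodes, not the JVM's instruction set; it only illustrates the shape of the loop whose handlers and compact code stream can both stay cache-resident:

```python
# Toy byte-code interpreter: a dispatch loop over (opcode, argument)
# pairs driving a small operand stack. Opcodes here are illustrative.

def run(code):
    """Execute a list of (op, arg) pairs; return the top of stack."""
    stack = []
    pc = 0
    while pc < len(code):
        op, arg = code[pc]
        pc += 1
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "HALT":
            break
        else:
            raise ValueError("unknown opcode: %r" % op)
    return stack[-1]

# (2 + 3) * 4 encoded as byte-code
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 4), ("MUL", None), ("HALT", None)]
```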

Consequently the "let alone ..." remark may be out of line. If
byte-code is the fastest available for large software programs
where cache performance dominates over instruction execution per
se, then perhaps the benchmark is running only 30x as slow as the
fastest possible on that architecture? Do you have a direct
comparison of byte-code and native-code (both with OCaml) to tease
out the full truth of these comparisons? (Just curious.)

> > What do you classify as "efficient" for set union? Do you have
> > in mind using lists (with duplicates already eliminated) for
> > emulating each set, and a hash table for registering all the
> > elements in each set, and thereby being able to efficiently detect
> > which elements are in both sets (for example by mapping down the
> > elements of the smaller set checking if they are also in the
> > hashtable of the larger set), hence need to be put just once in the
> > union?
> I would copy the OCaml implementation and use immutable balanced
> binary trees.

I presume you mean the algorithms where trees are updated by
creating new structure for all the parts that changed and sharing
any sub-trees that haven't changed? Thus you can collate two trees
(for all the usual purposes: all the nontrivial boolean functions:
and ior xor eqv nand nor andc1 andc2 orc1 orc2; all the nontrivial
predicates: equal notequal subset notsubset backsubset
notbacksubset eithersubset neithersubset) making just one pass over
most/all of the nodes (in some cases you can skip a whole sub-tree
in one of the given trees if there's a gap between adjacent
elements in the other tree)?

I suppose in theory you can re-write those algorithms to emulate
the stack via linked lists, then you will be able to compile the
code in Mathematica? That would be a big pain of course, but has
anybody tried it?

It would actually be of general interest to compare true
machine-stack recursion against emulation of recursion via pushing
and popping on linked lists, where both modes are available for use
by compiled code, to see whether there's any significant speed
advantage to true machine-stack recursion. It would be especially
interesting to try the comparison on systems that have multiple
processors, where the application programmer has control over when
thread-splitting may occur in the algorithm:
For example, with a recursive algorithm (defun myfun (tree) ...)
that at one point says:
(cons (myfun (car tree)) (myfun (cdr tree)))
it might instead say:
(thread-split-let ((val1 (myfun (car tree)))
                   (val2 (myfun (cdr tree))))
  (cons val1 val2))
Likewise an iterative version of the algorithm that looked like this:
(let ((todostack (list tree)) (resultsstack (list)))
  (loop ... ;various code for pushing incomplete second branches
            ; onto todostack and later pushing unused return values
            ; onto resultsstack, something like that anyway, I haven't
            ; written this kind of code in like forever
    ))
might be converted for multiple threads by not using a todostack at
all, instead always calling thread-split-let to descend deeper into
the two branches of the given tree in parallel.
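
One concrete rendering of that todostack/resultsstack scheme (a sketch in Python, not anyone's actual code from this thread): the recursive (cons (myfun (car tree)) (myfun (cdr tree))) pattern becomes a loop over an explicit work stack, so no machine-stack recursion occurs at all.

```python
# Recursion emulated with explicit stacks: "visit" items descend into
# subtrees, "combine" items pair up two finished results. Trees are
# nested 2-tuples (cons cells); anything else is a leaf.

def map_leaves(tree, f):
    """Apply f to every leaf of a nested-pair tree, iteratively."""
    todo = [("visit", tree)]      # work items still to process
    results = []                  # finished subtree values
    while todo:
        tag, x = todo.pop()
        if tag == "visit":
            if isinstance(x, tuple):          # an internal "cons" node
                todo.append(("combine", None))  # runs after both halves
                todo.append(("visit", x[1]))    # cdr
                todo.append(("visit", x[0]))    # car (popped first)
            else:
                results.append(f(x))          # a leaf: transform it
        else:  # "combine": pop the two finished halves, push their pair
            right = results.pop()
            left = results.pop()
            results.append((left, right))
    return results[0]
```

Since `todo` is last-in-first-out, the car's entire subtree is finished before the cdr's is started, which is the same order the recursive version would produce.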

Threads are a standard part of Java. Maybe I should try writing
this in Java? First I need to implement thread-split-let of course!

For my benchmark, I need an algorithm where traversing the two
trees, not performing calculations, dominates the process. Thus my
divide-and-conquer version of factorial would *not* be a good
example. Maybe something like give two large sets (as trees), or
other large collection of data stored in a tree, which are almost
identical, find all the (very small number of) differences. To make
reasonable data, I'd read in a corpus of text, building a
left-context tree from it, then read in a little bit more text,
which includes just a few additional left contexts, then compare the
old and new tree to show what new contexts were added? Only that
final comparison (find all differences) would be timed.

In any case, four comparisons on any given machine architecture:
- Normal recursion
- Thread-splitting recursion
- Stack emulation of recursion, single thread
- Stack emulation of recursion, splitting threads
Does anybody have access to a quad-core or larger machine running
the JVM which could be applied to this benchmark, where all cores
are kept busy all the time whenever enough threads are active, if I
ever get around to writing it in Java?

> Mathematica does have something like that: hierarchical
> namespaces and it can recognise packages when they're stored in
> the correct location on disk.

Ah, that sounds somewhat like the Java model for dealing with
hierarchical namespace.
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13rek046g8fsb25@corp.supernews.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>> >> ... Mathematica 6 addresses this problem by
>> >> rejecting all recursive definitions. ...
> 
> OK, I took this to mean that Mathematica 6 rejects all recursive
> definitions, absolutely, not just its compiler (which had the *bug*
> in earlier versions), presumably so that interpreted and compiled
> behaviour will be consistent. On that basis I wrote:

Sorry, I should have been more explicit when I wrote that.

>> > IMO that's a showstopper, a complete language rejector!
> 
>> From: Jon Harrop <······@jdh30.plus.com>
>> To be fair, this only affects compiled anonymous functions.
>> Mathematica can still interpret recursive functions but it is
>> very slow (~1,000x slower than a compiled language in general).
> 
> Are you talking about the earlier versions that merely had a bug,
> or the new major-version 6 which "rejects all recursive
> definitions" in the words of the other poster?

Previous versions hung, the latest version bails with an error message
stating that Compile does not support recursion (AFAIK).

>> >> My PhD was largely on wavelets so I obtained Wolfram Research's
>> >> own ($595!) WaveletExplorer add-on only to discover that they deal
>> >> solely with discrete wavelets and completely ignore continuous
>> >> wavelets.
>> > Why didn't you ask around on newsgroups for consumer feedback, like
>> > the book reviews on Amazon.com, before making such a significant
>> > purchase for which you'd need a requisition rather than take
>> > (shake?) out of the petty cash jar (piggy bank?)?
>> Probably wouldn't have done much good, as it happens. Computer
>> scientists seem to be blissfully unaware of continuous wavelets and
>> often teach and learn that only discrete wavelets are significant.
> 
> Well, with nearly six hundred dollars to spend, you should at least
> have *tried* to get some useful info prior to purchase...

That's exactly what I did. I never actually bought the software. I used
WRI's documentation to find an obscure file name from its distro and
Googled for a user who had been stupid enough to export their copy of the
software. Then I downloaded it and installed it in order to try it out.
Finding out that it didn't provide any of the wavelet-related features I
required, I deleted it and didn't buy it.

>> I ported my ray tracer benchmark to Mathematica for a laugh and
>> it runs 30x slower than OCaml's interpreted bytecode, let alone its
>> optimizing native-code compiler.
> 
> I take it your benchmark is recursive? Is there any reason you
> couldn't have re-written it slightly to emulate stacks by linked
> lists, thereby avoiding recursion, so that you could have gotten it
> compiled?

The benchmark is recursive but Mathematica actually spends a lot of its time
evaluating ray-sphere intersections, which is a straightforward block of
code (no recursion). Mathematica's Compile does work on that and it does
increase performance a bit but still nothing like as fast as a native-code
executable.

> Note that there are multiple levels of interpreting. Reference to
> Lisp and Java and Basic are useful here:
> - Source-text-level interpreting, as with the original BASIC
> ...
> - Parse-tree interpreting, such as what was done by Lisp EVAL
> ...
> - Byte-code dispatching/coding, such as the JVM (Java Virtual
> ...

Mathematica's generic term rewriting is most closely approximated by what
you just called "Parse-tree interpreting", also known as "term-level
interpreting".

OCaml bytecode is an example of interpreted bytecode using an optimized
interpreter written in C.

> Consequently the "let alone ..." remark may be out of line. If
> byte-code is the fastest available for large software programs
> where cache performance dominates over instruction execution per
> se, then perhaps the benchmark is running only 30x as slow as the
> fastest possible on that architecture? Do you have a direct
> comparison of byte-code and native-code (both with OCaml) to tease
> out the full truth of these comparisons? (Just curious.)

Sure, I've still got the code here. OCaml's native code compiler is 17x
faster than OCaml's bytecode interpreter on this benchmark. So Mathematica
is ~500x slower than native-code compiled OCaml.

>> I would copy the OCaml implementation and use immutable balanced
>> binary trees.
> 
> I presume you mean the algorithms where trees are updated by
> creating new structure for all the parts that changed and sharing
> any sub-trees that haven't changed? Thus you can collate two trees
> (for all the usual purposes: all the nontrivial boolean functions:
> and ior xor eqv nand nor andc1 andc2 orc1 orc2; all the nontrivial
> predicates: equal notequal subset notsubset backsubset
> notbacksubset eithersubset neithersubset) making just one pass over
> most/all of the nodes (in some cases you can skip a whole sub-tree
> in one of the given trees if there's a gap between adjacent
> elements in the other tree)?

Exactly, yes.

> I suppose in theory you can re-write those algorithms to emulate
> the stack via linked lists, then you will be able to compile the
> code in Mathematica? That would be a big pain of course, but has
> anybody tried it?

Ugh. Yes, I think you're right. I've never seen that done though.

> It would actually be of general interest to compare true
> machine-stack recursion against emulation of recursion via pushing
> and popping on linked lists, where both modes are available for use
> by compiled code, to see whether there's any significant speed
> advantage to true machine-stack recursion.

I'm actually interested in something similar because I'd like to generate
garbage collected programs using LLVM. They provide a "shadow stack"
algorithm that does exactly that. I still haven't found the time to play
with it but LLVM is an awesome project...

> It would be especially 
> interesting to try the comparison on systems that have multiple
> processors, where the application programmer has control over when
> thread-splitting may occur in the algorithm:
> ...
> Threads are a standard part of Java. Maybe I should try writing
> this in Java? First I need to implement thread-split-let of course!

That would be interesting but I'd reach for F# rather than Java (assuming
you have a Windows box). You might also be interested in the SML/NJ
compiler for SML which does something similar (transformation to
continuation passing style) to eliminate all system stack usage.

> For my benchmark, I need an algorithm where traversing the two
> trees, not performing calculations, dominates the process. Thus my
> divide-and-conquer version of factorial would *not* be a good
> example. Maybe something like give two large sets (as trees), or
> other large collection of data stored in a tree, which are almost
> identical, find all the (very small number of) differences. To make
> reasonable data, I'd read in a corpus of text, building a
> left-context tree from it, then read in a little bit more text,
> which includes just a few additional left contexts, then compare the
> old and new tree to show what new contexts were added? Only that
> final comparison (find all differences) would be timed.

Yes, I think that would be an awesome benchmark for such things. I was
actually talking to Simon Peyton-Jones about this recently, in the context
of benchmarking GHC's parallel performance.

There is a practically-relevant program that uses this technique from my
book "OCaml for Scientists", the "n"th-nearest neighbor code:

  http://www.ffconsultancy.com/products/ocaml_for_scientists/complete/?cll

That would be much more compelling than Fibonacci or factorial numbers. You
can even do a funky interactive real-time 3D visualization of the results
as I do in my new book "F# for Scientists". I'll upload the code to this
page ASAP:

  http://www.ffconsultancy.com/products/fsharp_for_scientists/?cll

> In any case, four comparisons on any given machine architecture:
> - Normal recursion
> - Thread-splitting recursion
> - Stack emulation of recursion, single thread
> - Stack emulation of recursion, splitting threads
> Does anybody have access to a quad-core or larger machine running
> the JVM which could be applied to this benchmark, where all cores
> are kept busy all the time whenever enough threads are active, if I
> ever get around to writing it in Java?

The biggest multicore I can buy is on my shopping list... ;-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb07-001@yahoo.com>
> > IMO the way to write a program is to design an abstract data type
> > then to use it. Only the person hand-coding the implementation of
> > that ADT need worry about low-level details such as cons cells.
> From: Xah Lee <····@xahlee.org>
> Exactly. The problem with lisp's cons business, is that it
> surfaced to the language level in a way that a programer cannot
> ignore it.

So it's accessible to any random programmer, rather than being
accessible only to the wizards deep inside the vendor. Is that a
good or a bad thing? What are the alternatives to the Lisp way,
assuming you're going to have a linked-list data type, with methods
or functions that do ordinary things with them such as traversing
and mapping and reading and printing, in the first place?

First, why would you *want* a linked-list data type?
-1- So that you can use it as a stack, in o(1) time to push or pop an
     item, without requiring allocation of a block of contiguous
     memory with fixed size (or royal pain to re-size if the stack
     gets too large).
-2- So that tails can be shared, to conserve memory and reduce CPU
     time when copying a list with only a small change at the start,
     while sharing side-effects (if any) on all the shared part.
     (Special case: If the list is readonly after construction,
      i.e. no side effects whatsoever on existing list elements.)
-3- So that you can read a large list into memory without knowing at
     the outset how large it may be and without needing to pre-allocate
     how large you guess it may be and re-copy the entire old allocation
     into a new larger allocation if it turns out larger than guessed.

Now purposes -1-,-3- can be satisfied by an OO-style object with
limited actions allowed: make, push, pop, isEmpty, peekTop.
Reverse and nreverse can be included if you ever need them.
(READ is higher level, calling make once, then calling push many times,
 then calling reverse or nreverse once.)
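
That limited object can be sketched directly (in Python here; names like `peek_top` are illustrative, not from any library). It hides the cons cells behind the make/push/pop/isEmpty/peekTop interface and also maintains a count, giving the o(1) length mentioned further on:

```python
# Stack ADT over hidden linked cells. Each cell is a (item, rest)
# pair, so push and pop are O(1) with no contiguous allocation.

class Stack:
    def __init__(self):                  # "make"
        self._head = None                # front cell, or None if empty
        self._len = 0

    def push(self, item):                # O(1): allocate one new cell
        self._head = (item, self._head)
        self._len += 1

    def pop(self):                       # O(1): drop the front cell
        if self._head is None:
            raise IndexError("pop from empty stack")
        item, self._head = self._head
        self._len -= 1
        return item

    def is_empty(self):
        return self._head is None

    def peek_top(self):
        if self._head is None:
            raise IndexError("peek at empty stack")
        return self._head[0]

    def __len__(self):                   # O(1): tracked dynamically
        return self._len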
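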

Now for READ of multi-level lists, you need to decide whether there
will be a single toplevel Object, with all ordinary lisp CONS-cell
structure under it, so that NTH will need to make a new Object
whenever the element is itself a linked list, or whether READ will create a
new Object immediately for every sub-list at every level, so then
NTH doesn't need to do anything special. Do you have a preference?

Head and tail pointers can be kept in the Object if you want,
whereupon you can allow pushEnd also (but not popEnd).
(Then READ can call pushEnd instead of push, and doesn't need reverse
 or nreverse at very end. But then you are forced to build a whole
 new Object at each level of list structure, because READ is a
 high-level program that can't know about the internals of the
 linked-list class, right?)
But then it's impossible to safely share tails between two
different linked-list Objects.

Furthermore, the Object can keep track of the length dynamically,
giving length in time o(1) instead of o(n). So if that's all you
want, then I suggest you have somebody implement a CLOS-style
object like that and then you never have to mess with car/cdr/cons
ever again. NTH, and SETF of NTH, can be provided in o(n) time
also. NTHCDR can be provided in o(n) time plus the overhead of
making a whole new object, and you have to decide whether the
implementation shares structure or copies all tails entirely. SETF
of NTHCDR can be provided too if you want, time o(n) but underlying
car/cdr/rplacd hidden from user, except then you lose the ability
to keep track of length dynamically and thereby deliver it in o(1)
time.

For all that plus having NTH and related methods available in
o(log(n)) time, but at the cost of push and pop o(log(n)) instead
of o(1) time, keep same API but switch to some kind of binary tree
internally. But that's not what you really want!!! The whole point
of using linked lists is that push and pop are o(1) time (and READ
doesn't have to know the length of input in advance). So back to
linked lists as the underlying implementation ...

But if you really need -2- done extensively, I see no really good
idea for implementation except to have linked-list cells directly
visible, i.e. car/cdr/cons. You *could* use the OO implementation
where it builds a whole new Object every time you do anything that
gives you a new "starting point" into the linked list, i.e.
whenever you do NTHCDR (probably renamed NTHTAIL), while still
sharing the underlying structure, but if you are doing a *lot* of
such operations, sharing tails among a very large number of
different linked-list Objects, that's a lot of overhead just to
hide the underlying CONS cells from the programmer. Is that what
you really want?

My thought here is that if the application programmer really wants
the linked-list type of data, then he/she needs to understand that
some sort of pairs (cells with down and right pointers) exists, and
would be frustrated at not being able to get at them directly.

Does any intelligent application programmer *really* want an API
that secretly uses linked-list representation without revealing
that fact, where the application programmer must read the timing
specs (i.e. o(1) for push/pop, o(n) for NTH, etc.) for the API and
wonder why it doesn't just come out and say it's a linked-list
representation, bringing us back to the frustration of the previous
paragraph?

> a lisp programer, can not practically code in lisp by using
> higher level list abstractions and functions without
> understanding the cons business.

Why would any application programmer, who has deliberately chosen
to use a linked-list ADT for purpose of constant-time push&pop and
the other advantages I described above, want to *not* understand
that a linked-list has cells with links pointing down to elements
and across to other linked-list cells??? What's the advantage of
not understanding the basic idea of the underlying representation
of the ADT that you have chosen to use??

> Because, in practice, the using cons is common.

If you're using linked lists to emulate stacks, then CONS is
actually non-destructive PUSH. Why wouldn't an application
programmer using lists to emulate stacks, and wishing to keep a
backup of the old value in case a rollback is needed, want to know
how to do a PUSH while *not* destroying the old value? It seems to me
that it'd be harder to understand that you can SETQ the old value
to a new value then PUSH one of the two variables and the *other*
variable won't be affected, because in OO programming if you assign
the same object to two variables and modify the object then *both*
variables are pointing at the modified object. It seems to me that
it makes more sense to directly offer CONS (which can be renamed
PUSH-NONDESTRUCTIVE if you want).

If you're building binary trees, then CONS is a way to build a new
tree out of two old trees, and CAR/CDR are ways to recover one or
the other of the original lists. Why wouldn't a person building
trees out of parts, and decomposing them back to parts, want to
know how to combine two trees into a larger tree and how to recover
the two sub-trees later??? (Of course you'd rename all three
functions, for example MAKE-TREE-FROM-PARTS LEFT-SUBTREE and
RIGHT-SUBTREE.)

For virtually every other application of linked lists, you do *not*
need car/cdr/cons directly at all. You use LIST to directly build
lists with a fixed set of parts. You use the mapping functions to
transform an old list to a new list elementwise. You use NTH to
select single elements by index, and SETF of NTH to change such an
element. You use SUBSEQ to extract ranges of elements from an old
list. You use APPEND to combine two lists into a list of all the
elements from both in the obvious way. You use READ and
PRINT/FORMAT to convert between internal lists and external print
representation, etc. None of those require the application
programmer to directly deal with cons/car/cdr operations. All you
need to understand is how some operations take longer than others
because they require copying of structure, but you need to know
that anyway even if you hide all the implementation details within
an ADT.

> You will encounter its use in just about every lisp program.

No. Most commonly you PUSH and POP without requiring a backup kept
around, or you don't do list-implementing-stack or
cons-implementing-tree at all in your application. You define
higher-level ADTs of lists-implementing-structures via NTH and SETF
of NTH, not by direct CAR/CDR.

> And, due to the cons business at the language level, there is the
> problem of "proper list" and "improper list". So, you can't pretend
> that cons doesn't exist and use only high-level list functions.
> They won't work.

You aren't *required* to use any improper lists in your
application, so you don't *need* to have this problem around in the
first place.

> > Look up the function NTH,
> > which indexes down a list by a given number.
> > Then imagine how trivial it is to write a function that uses a list
> > of successive indexes to specify the element number at successive
> > levels within a list, calling NTH at each level.

> The issue here is not whether it is easy to code it.

Then what *is* the issue? If you want a trivial extra layer of
software, and it's very very easy to implement it, and there's no
performance penalty compared to the wizards implementing it, then
why make such a fuss that somebody else didn't already write it for
you?

> First of all, as i gave the reason above, the cons business in the
> language stops any possibility of a uniform interface of list
> manipulating functions to the underlying cons.

That comment makes no sense to me. I gave above a sketch of a
uniform interface for using a linked-list as a stack, hiding the
implementation within a CLOS class if you want. The uniform
interface for mapping and reading/printing and
appending/subsequencing and for choosing/modifying element by index
already exist in CL, and you could embed that in a CLOS layer too
if you really wanted to hide the implementation. What more could
you want?

> Secondly, there is a big practical benefit of having functions
> built-in, even if trivial to code.

I agree. But IMO CL already has a very large set of useful
functions, both the kinds of stuff that wizards need to implement
because it's too hard or inefficient for the application
programmer to try to do it, and stuff that J. Random application
programmer could have done him/herself but having it built-in
makes it more convenient. But there has to be a limit to what's
already built into the core language. Maybe there are five or ten
additional functions
that would be really nice to also include, such as your proposed
(defun map-indexes-down-list-levels (listOfIndexes multiLevelList) ...)
and the HDDL mechanism for running ASSOC down multiple levels of
tagged lists that Tryg Ager implemented in PSL about twenty years
ago, etc. So maybe you can write a nice add-on package to include
these functions you feel just about everyone would want, and market
it? This is in addition to the special-purpose add-on packages
such as support for regular expressions, XML, HTTP/HTML, etc.

But if you are the only person in the world who wants your proposed
function, then why should somebody else have implemented it and
fixed it into the core CL standard???

> Also, the availability [of a] large number of built-in functions
> has tremendous impact on the ease of coding with the language.

Only if it's easy to find the function you want when you have a
need for its functionality. See for example my attempt to
re-organize the commonly useful functions to process the
commonly-available data types across six common programming
languages (Common Lisp, Java, C, C++, Perl, and PHP) that are easy
to use with CGI.
<http://www.rawbw.com/~rem/HelloPlus/CookBook/Matrix.html>

> When you don't have the set A, it may be that it's trivial to
> create A by coding it yourself. But if you have A built in, it's
> trivial to build set B. But without A built in, B is now a bit
> difficult. The gist is that, you can't build a 2nd story without
> a 1st story.

That's why various add-on packages have become popular. But the key
is to have an information retrieval system that lets you find the
appropriate 1st-story package that you'll need for the de novo
2nd-story you intend to craft. Would you be willing to survey all the
existing CL add-on packages that are available on the net, discard
the obscure very-specific or badly-designed ones, and nicely index
the non-obscure well-designed ones that ought to be available to
application programmers?

> Mathematica comes with a thousand (or few thousands) of built in
> functions (and high-quality ones, not just web collection of Joes
> like Perl's CPAN). Similarly, is Java. And to a lesser degree,
> Python, PHP. These large number of built-in functions contribute to
> their success.

It's my understanding that CL has a comparable number of functions,
of overall quality comparable to Java's standard release of J2SE.
Some Java stuff is missing in CL, and some CL stuff is missing in
Java, but I don't think CL is overall deficient compared to Java.

> ... for a language to be popular and practically useful, a huge
> number of built-in functionality is one of the main contribution[s].

Which both Common Lisp and Java have, so what's your complaint?

> Language's features typically talked about by computer scientist morons

Oh lookey, you're being all derogatory. (I snipped when you
complained that I said something you interpreted as derogatory, but
this *is* derogatory/insulting, no question about it.)

> such as all the details and variations of types,

All the essential types, except one, are in Common Lisp:
- integers of size limited only by available memory, present in
   Java only if you switch to a special object type that doesn't
   work well with the rest of the language, not present in any
   other core language that I know of (for example PHP doesn't have
   integers at all, it uses floats for everything)
- rationals with size of numerator and denominator limited only by
   available memory, not present in any other general purpose
   language that I know of, present in MacSyma (which is based on
   MacLisp), probably present in other special-purpose math
   packages I've never had access to
- floating-point approximations to real numbers (present everywhere)
- complex numbers
- interval arithmetic not present in *any* mainstream language, not
   even Common Lisp
- general-purpose pairs, not present in Java at all unless you
   emulate pairs by two-element arrays of element-type java.lang.Object
   and always use class wrappers for numbers whenever they might be
   included in pairs
- support for linked lists using those general-purpose pairs, no
   support whatsoever for such in Java
- **symbols**, no support whatsoever in any other language, the
   closest you can get in Java is barebones hash tables, then you
   have to re-invent the wheel if you want an object in the hash
   table to have a print name and a value cell and a function cell
   and a property list and a link back to the hash table it's
   cataloged in
- streams, limited support in Common Lisp for streams connected to
   files in the system's filesystem, and to stdio, and to strings
   acting as if files; this is where Java shines, where every
   stream is a full Object, and anyone can build a stream connected
   to anything and then wrap the usual BufferedReader etc. around it
   for normal I/O. MacLisp had filearrays which allowed application
   programmers to define new types of streams that worked just like
   regular ones, but Common Lisp dropped that as not anything the
   general community would want, which I still consider to have
   been a serious mistake
- closures fully wonderful, merely hacked in Java
- user-defined classes in CLOS better in Common Lisp than in Java
   because they are runtime-polymorphic on *every* parameter instead
   of runtime-polymorphic on just one special pseudo-parameter and
   only compiletime-polymorphic on all the regular parameters
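
A minimal sketch of that last point (class and method names are
invented for illustration): in CLOS, *both* arguments participate
in method selection at runtime, where Java would dispatch on the
receiver alone.

```lisp
;; Multiple dispatch: the applicable method depends on the runtime
;; classes of BOTH a and b.
(defclass rock  () ())
(defclass paper () ())

(defgeneric beats-p (a b))
(defmethod beats-p ((a rock)  (b paper)) nil)
(defmethod beats-p ((a paper) (b rock))  t)

;; (beats-p (make-instance 'paper) (make-instance 'rock)) => T
```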

It seems to me that all other data types are application specific.
If you do network stuff, you might want XML parser, or HTML parser,
or HTTP client and server, or RPC/RMI client and server, etc.,
depending on what specific kind of network stuff you wish to do.
If you write GUI applications, then of course you need the usual
assortment of windows/panes/panels and controls (close/resize/minimize,
textfield, textarea, checkbox, radiobutton, regular button, etc.).
If you want to process text in multiple non-USASCII languages, or
in any single language that has more than 256 characters, you
probably need support for Unicode, including input/output of UTF-8.
(I'll have to say lack of built-in support for Unicode is, for
 non-American users, the number one deficiency in Common Lisp.
 The major vendors have filled this gap, but not in a standard
 portable way.)

> inferences

Please define what you mean there. It sounds like a special
application area to me, not anything the average application
programer will crave for if it's missing.

> higher-level/first-class function

This is what Common Lisp has, right? Or do you mean something more?

> curry

(defun foo (x y z) ...)
(defun curry-foo-x (x)
  (function (lambda (y z) (foo x y z))))
(defun curry-foo-y (y)
  (function (lambda (x z) (foo x y z))))
(defun curry-foo-z (z)
  (function (lambda (x y) (foo x y z))))
;is that right?
You can pick whichever parameter you want to curry.
You aren't limited to currying only the first or last depending on
which was built into the language.

You can even write generic currying meta-functions, for example:
(defun curry-3-1 (f123 p1)
  (function (lambda (p2 p3) (funcall f123 p1 p2 p3))))

You can even write a totally generic macro for generating currying
functions on the fly (just a sketch here):
(defmacro curry-k-of-n (f n k)
  "f is a function object
   0 <= k < n = number of parameters of f
   curry the kth parameter of f, leaving all the rest as parameters of result"
   ...)
;(curry-k-of-n #'foo 3 0) ==> the curry-foo-x from above

So what else do you want??
Should curry-k-of-n be built into the delivered system, including
correct handling of optional/keyword parameters off the end after
the first n parameters?
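
As a hedged alternative to the macro sketch, the same generality
is available at runtime with nothing but closures and APPLY
(function name mine):

```lisp
;; Freeze the Kth parameter (0-based) of any function F, returning
;; a function of the remaining parameters.
(defun curry-kth (f k arg)
  (lambda (&rest rest)
    (apply f (append (subseq rest 0 k) (list arg) (subseq rest k)))))

;; (funcall (curry-kth #'list 1 'mid) 'a 'b) => (A MID B)
```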

> closure, multi-inheritance

In CL already, as I said above.

> dispatch

What do you mean?
CASE/SELECT statements/expressions?
Runtime polymorphism of objects?
Handling interrupts/events (both device-generated and software-generated)?
Please specify which you want.

> lazy evaluation

This has not yet been reduced to standard practice, to where there
is any single specification that we can all agree upon. For
example, in recent years I have been advocating a system for
drawing a dataflow diagram for interval arithmetic where lazy
evaluation is used to compute approximate intervals narrower and
narrower as needed to produce a meaningful decision/output. The
general idea is to use a mathematical identity to yield a directed
computation which forms a stable feedback loop. For example, given
N known non-negative, you compute the square root of N via Newton's
method expressed as a feedback loop. Whenever you need to narrow
the interval, you track the dataflow backwards until you reach a
place where you can get more accuracy, then you compute forwards
again to see how narrow your output interval is now.
<http://www.rawbw.com/~rem/IntAri/>
That's doing the dataflow manually to illustrate how the interval
arithmetic itself works. Nobody expressed any interest in this
work, so there's no funding, so I haven't developed the idea
further except in my mind. Maybe you'd like to show some interest?

> pattern matching

This is a totally open-ended subject. Until you specify what kinds
of patterns you want to match (patterns of characters in a string,
patterns of trends in an economy, patterns of facial
characteristics, patterns of debris in high-energy atomic
collisions, patterns of sub-graphs in large mathematical graphs,
patterns in signals received from deep space which might possibly
be from ETI, patterns in whale "songs" which might represent a
meaningful grammar and/or meaningful discourse about ocean
conditions, patterns in brain activity which might be useful for
detecting honesty vs. deceit, etc. etc. etc.), you haven't a chance
of even getting started at the research project much less a
standard algorithm/heuristic.

> tail recursion

This is not as useful as it's hyped up to be. Sure if you write a
factorial recursively to emulate iteration, a compiler can convert
it back to the equivalent iterative algorithm to avoid stack
overflow. Whooptie-doo. True recursion isn't tail recursion in the
first place so you don't need this case handled specially. For
example, the right way to compute factorial recursively is:
(defun prod (m n)
  (if (< (+ m 1) n)
      (let ((mid (floor (+ m n) 2)))
        (* (prod m mid) (prod mid n)))
      m))
(defun fact (n) (prod 1 (+ 1 n)))

The whole point of recursion is divide-and-conquer, especially if
you have multiple processors which can run in parallel on sub-tasks.
Always spawning one sub-task to do a trivial case and the other
sub-task to do all the work except that one trivial case, whereby
you actually compute the trivial case directly and tail-recurse on
all the rest, is outright stupid.

> closure

Didn't we already cover this? Lisp has it.

> call-with-continuation

I'm not familiar with that jargon, but it sounds like something to
do with co-routines. I'll do a Google search ...

<http://jist.ece.cornell.edu/jist-user/node12.html>
   ... For example, an entity that models a file-transfer is
   more readily encoded as a process than as a sequence of events.
   Specifically, one would rather use a tight loop around a blocking send
   routine than dispatch send events to some transport entity ...
Yeah, co-routines. Two processes each think they are calling the other,
and they take turns waiting for the co-routine linkage to return.

<http://www.pluralsight.com/blogs/dbox/archive/2005/04/27/7780.aspx>
   continuation passing style (CPS) ... is based on the
   notion that instead of returning a value from a function, the value is
   passed to the code that will continue the computation.
Is that anything like what you have in mind?

<http://www.ibm.com/developerworks/java/library/os-lightweight9/index.html>
          In Ruby, you use callcc, which means call with continuation.
          Call with continuation gives you a code block, and the
          continuation (or a saved call stack, with instance variables),
          which you assign to a variable. Think of a continuation as all
          of the program that hasn't executed yet.
<same question>

By the way, I've come to the line of thinking that the "business
logic" (actually the business calculations mostly) should be
written as an explicit state-of-computation object together with
functions/methods which perform individual transformation steps of
the business algorithm. Once that's done, it's relatively trivial
to write a toplevel wrapper that has one process call the other, or
vice versa, or have the two processes act as co-routines. Any time
you need to parse an input stream, you design the parser explicitly
as a state machine (using a parser-compiler to convert from BNF or
equivalent expression of the grammar to the state machine). So if
we want to standardize this methodology, we need to agree upon a
standard representation of the state machine which can then be
automatically processed by the main application-builder. Do you
know of any good results in this area?
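
To illustrate the methodology (a toy sketch, all names invented):
the state-of-computation is an explicit object, and each
transformation step is an ordinary function, so any toplevel
driver, thread, or co-routine arrangement can advance it.

```lisp
;; Explicit state object plus a one-step transition function.
(defstruct calc (state :start) (total 0))

(defun step-calc (c input)
  ;; perform one transformation step of the "business algorithm"
  (ecase (calc-state c)
    (:start   (setf (calc-state c) :running))
    (:running (incf (calc-total c) input)))
  c)

;; A driver just calls STEP-CALC as inputs arrive; the state
;; machine itself never cares who is driving it.
```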

> and many other jargons, really have _little_ significance to
> daily practice.

Is that meant to be derogatory towards the daily programmers who
don't know beans about powerful concepts that could be of great use
to them but which unfortunately aren't provided by their favorite
programming languages or aren't tutorialized well enough, or
derogatory towards the people who envision these wonderful ideas
which turn out to be useless in practice?? I seriously believe
dataflow feedback loops with interval arithmetic is the *right* way
to go with complicated real-metric numeric calculations. I am
totally sure that lexical closures are useful. (I use them "all the
time" when I program applications!!)

> Every studious moron with a computer science degree, can create a
> language (like that of Larry Wall, Guido van Rossum, and now Paul
> Graham). But to create a high-quality set of huge number of
> libraries (as in Java, Mathematica), is extremely difficult.

I really can't believe that Mathematica has a huge set of
general-purpose libraries, the way that Java and Common Lisp do.
I thought that Mathematica was a special purpose system for doing
mostly the same sorts of things that MacSyma used to do, only
better and more commercially, i.e. closed-form mathematical
expression manipulation such as calculus and infinite-series
summation, and graphing the results. Am I mistaken??
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13qphamemuf7p61@corp.supernews.com>
I think you're explaining why CL has things from Xah's list of things that
he does not value, e.g. Mathematica lacks many of the features that Xah
enumerated. It is interesting to me, but probably not to Xah.

Robert Maas, see http://tinyurl.com/uh3t wrote:
>> lazy evaluation
> 
> This has not yet been reduced to standard practice, to where there
> is any single specification that we can all agree upon. For
> example, in recent years I have been advocating a system for
> drawing a dataflow diagram for interval arithmetic where lazy
> evaluation is used to compute approximate intervals narrower and
> narrower as needed to produce a meaningful decision/output. The
> general idea is to use a mathematical identity to yield a directed
> computation which forms a stable feedback loop. For example, given
> N known non-negative, you compute the square root of N via Newton's
> method expressed as a feedback loop. Whenever you need to narrow
> the interval, you track the dataflow backwards until you reach a
> place where you can get more accuracy, then you compute forwards
> again to see how narrow your output interval is now.
> <http://www.rawbw.com/~rem/IntAri/>
> That's doing the dataflow manually to illustrate how the interval
> arithmetic itself works. Nobody expressed any interest in this
> work, so there's no funding, so I haven't developed the idea
> further except in my mind. Maybe you'd like to show some interest?

Ironically, Mathematica already does that (and it is a good idea, yes). :-)

>> pattern matching
> 
> This is a totally open-ended subject. Until you specify what kinds
> of patterns you want to match (patterns of characters in a string,
> patterns of trends in an economy, patterns of facial
> characteristics, patterns of debris in high-energy atomic
> collisions, patterns of sub-graphs in large mathematical graphs,
> patterns in signals received from deep space which might possibly
> be from ETI, patterns in whale "songs" which might represent a
> meaningful grammar and/or meaningful discourse about ocean
> conditions, patterns in brain activity which might be useful for
> detecting honesty vs. deceit, etc. etc. etc.), you haven't a chance
> of even getting started at the research project much less a
> standard algorithm/heuristic.

I assume Xah is referring to pattern matching over algebraic datatypes.

>> tail recursion

Note that Mathematica doesn't implement tail calls and (probably) doesn't
even implement mutual tail recursion.

>> call-with-continuation
> 
> I'm not familiar with that jargon, but it sounds like something to
> do with co-routines. I'll do a Google search ...

Callcc can be useful and is missing in all of the languages I use (and
Common Lisp).

>> Every studious moron with a computer science degree, can create a
>> language (like that of Larry Wall, Guido van Rossum, and now Paul
>> Graham). But to create a high-quality set of huge number of
>> libraries (as in Java, Mathematica), is extremely difficult.
> 
> I really can't believe that Mathematica has a huge set of
> general-purpose libraries, the way that Java and Common Lisp do.
> I thought that Mathematica was a special purpose system for doing
> mostly the same sorts of things that MacSyma used to do, only
> better and more commercially, i.e. closed-form mathematical
> expression manipulation such as calculus and infinite-series
> summation, and graphing the results. Am I mistaken??

The whole of Mathematica is designed to make numeric and symbolic
computation easy. In the context of general-purpose programming,
Mathematica's standard library is grossly deficient. For example, you
cannot create efficient implementations of any tree-based data structures
(RB, AVL, finger trees etc.).

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb09-006@yahoo.com>
> From: Jon Harrop <······@jdh30.plus.com>
> I think you're explaining why CL has things from Xah's list of
> things that he does not value, e.g. Mathematica lacks many of the
> features that Xah enumerated.

As I said, his wording was unclear as to why he said "... jargons,
really have _little_ significance to daily practice", since he
included some really essential features I use all the time. Was he
saying daily practice fails to use these wonderful features because
people treat those features with undue disrespect, or because he
(Xah) disrespects them himself and is saying they aren't needed (in
his opinion)? Is he a total idiot who can't understand why closures
and a few other things are generally useful in programming? I was
giving him the benefit of doubt that he was saying that normal
Programming Dummies aren't using those good features, rather than
that those features aren't good in the first place.

By the way, one of the most common uses for closures is to curry
(freeze) one or more of the parameters to a function so that it can
be passed to map the one remaining parameter down a list:
For example: constant c, and list l, lexically visible here:
  (mapcar #'(lambda (x) (+ x c)) l)

> It is interesting to me, but probably not to Xah.

Closures *should* be interesting/useful to just about *anyone*
wanting to write serious software.

> > ... in recent years I have been advocating a system for
> > drawing a dataflow diagram for interval arithmetic where lazy
> > evaluation is used to compute approximate intervals narrower and
> > narrower as needed to produce a meaningful decision/output. The
> > general idea is to use a mathematical identity to yield a directed
> > computation which forms a stable feedback loop. For example, given
> > N known non-negative, you compute the square root of N via Newton's
> > method expressed as a feedback loop. Whenever you need to narrow
> > the interval, you track the dataflow backwards until you reach a
> > place where you can get more accuracy, then you compute forwards
> > again to see how narrow your output interval is now. ...
> Ironically, Mathematica already does that (and it is a good idea, yes). :-)

Double irony that they don't even support recursion. It must take
some presence of mind to make sure you never do recursion when
backtracking dataflow around a loop during lazy evaluation!

Do you know what this is *called* there, so that I can do a Google
search to find a description of their methodology to compare with
my own ideas? Trying Google anyway:  Mathematica interval arithmetic dataflow
   Did you mean: Mathematica interval arithmetic data flow
   CONDOR: Constraint-Based Dataflow
Is that it?
   Processor with reconfigurable arithmetic data path - Patent 6247036
Beware of dog!!

That Google search didn't turn up anything vaguely like what we're
talking about. What keyword to search for??

> I assume Xah is referring to pattern matching over algebraic datatypes.

I don't even know what that means!

> Note that Mathematica doesn't implement tail calls and (probably)
> doesn't even implement mutual tail recursion.

Did you miss the note that it doesn't support recursion at all?

> >> call-with-continuation
> > I'm not familiar with that jargon, but it sounds like something to
> > do with co-routines. I'll do a Google search ...
> Callcc can be useful and is missing in all of the languages I use
> (and Common Lisp).

Google search for:  Callcc

<http://en.wikipedia.org/wiki/Continuation>
The remarks about continuation-style programming possibly being
useful for the Web (to avoid needing to explicitly write algorithms
as state machines to achieve inverted control) is interesting.

<http://en.wikipedia.org/wiki/Call-with-current-continuation>
   Taking a function f as its only argument, call/cc reifies the current
   continuation (i.e. control context or control state of the program) as
   an object and applies f to it. ...
Let's see if I understand that: The only kind of function that can
be passed to call/cc is one that takes exactly one parameter, and
that one parameter is the passed state-of-computation object, right?
So if you want to pass a function with actual parameters other than
the state-of-computation, you need to curry all those other
parameters first, then pass the resultant zero-parameter (except
state-of-computation parameter) function as f, right?

So Scheme (the programming environment we're talking about) doesn't
keep the program-state as a first-class citizen all the time it
runs, rather call/cc does an introspection of the (efficient)
program state to *construct* the first-class-object representation
of it to pass? But how does that representation of a program state
ever get converted back to an actual (efficient) program state to
allow continued running from there? Is it essentially an emulation
of threads, where sometime later somebody says "hey, let's pick up
that process where it left off" and converts the object back to the
actual machine state? If so, why not just use threads, which are
well supported in several modern programming languages?

> In the context of general-purpose programming, Mathematica's
> standard library is grossly deficient. For example, you cannot
> create efficient implementations of any tree-based data structures
> (RB, AVL, finger trees etc.).

Ah, this is why I chose your article for a response!

What primitives are missing from Mathematica that cause this
deficiency? Is it that ordered pairs (like CONS cells, or Java's
arrays of class Object *1*) are entirely missing? Or that you *can*
make ordered pairs if you want, but the storage and processing
overhead per pair is much larger than in Lisp, perhaps as bad as if
you had to use a Java instance of class Vector for *each* ordered
pair, and go through the entire object-oriented dispatch every time
you did the equivalent of CAR or CDR or SETF thereof?

*1* Something like this perhaps:
  public static Object[] CONS(Object car, Object cdr) {
    Object[] newCell = new Object[2];
    newCell[0] = car;
    newCell[1] = cdr;
    return newCell;
  }
  public static Object CAR(Object[] cell) { return cell[0]; }
  public static Object CDR(Object[] cell) { return cell[1]; }
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13qtkic3kki1l4e@corp.supernews.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
>> Ironically, Mathematica already does that (and it is a good idea, yes).
>> :-)
> 
> Double irony that they don't even support recursion. It must take
> some presence of mind to make sure you never do recursion when
> backtracking dataflow around a loop during lazy evaluation!
> 
> Do you know what this is *called* there, so that I can do a Google
> search to find a description of their methodology to compare with
> my own ideas? Trying Google anyway:  Mathematica interval arithmetic
> dataflow
>    Did you mean: Mathematica interval arithmetic data flow
>    CONDOR: Constraint-Based Dataflow
> Is that it?
>    Processor with reconfigurable arithmetic data path - Patent 6247036
> Beware of dog!!
> 
> That Google search didn't turn up anything vaguely like what we're
> talking about. What keyword to search for??

I've no idea what they call it, sorry.

>> I assume Xah is referring to pattern matching over algebraic datatypes.
> 
> I don't even know what that means!

Algebraic datatypes are found in MLs (like OCaml and F#) and Haskell. They
are composed of sum and product types:

  type t =
    | A of int * string
    | B of bool * float * char
    | C

The choice A|B|C is a sum type: a value is any one of those constructors.
The arguments "bool * float * char" are a product type: a value must
contain all of those fields.

Mathematica has a single type: the symbolic expression. This takes a form
like:

  type expr =
    | Number of number
    | Symbol of string
    | String of string
    | List of sequence
    | Function of string * sequence
  and sequence = expr array

Mathematica lacks static type checking so its pattern matcher is a lot like
the cl-unification library for Lisp.

>> Note that Mathematica doesn't implement tail calls and (probably)
>> doesn't even implement mutual tail recursion.
> 
> Did you miss the note that it doesn't support recursion at all?

Mathematica does support recursion but its Compile function (that compiles
to an interpreted bytecode) does not support recursive anonymous functions.

Actually, I can't think of any other languages that support recursive
anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
Haskell do AFAIK.

>> Callcc can be useful and is missing in all of the languages I use
>> (and Common Lisp).
> 
> Google search for:  Callcc
> 
> <http://en.wikipedia.org/wiki/Continuation>
> The remarks about continuation-style programming possibly being
> useful for the Web (to avoid needing to explicitly write algorithms
> as state machines to achieve inverted control) is interesting.
> 
> <http://en.wikipedia.org/wiki/Call-with-current-continuation>
>    Taking a function f as its only argument, call/cc reifies the current
>    continuation (i.e. control context or control state of the program) as
>    an object and applies f to it. ...
> Let's see if I understand that: The only kind of function that can
> be passed to call/cc is one that takes exactly one parameter, and
> that one parameter is the passed state-of-computation object, right?
> So if you want to pass a function with actual parameters other than
> the state-of-computation, you need to curry all those other
> parameters first, then pass the resultant zero-parameter (except
> state-of-computation parameter) function as f, right?
> 
> So Scheme (the programming environment we're talking about) doesn't
> keep the program-state as a first-class citizen all the time it
> runs, rather call/cc does an introspection of the (efficient)
> program state to *construct* the first-class-object representation
> of it to pass? But how does that representation of a program state
> ever get converted back to an actual (efficient) program state to
> allow continued running from there? Is it essentially an emulation
> of threads, where sometime later somebody says "hey, let's pick up
> that process where it left off" and converts the object back to the
> actual machine state? If so, why not just use threads, which are
> well supported in several modern programming languages?

Callcc is a lot more like forking than threads. Some Scheme implementations
support callcc and there is an extension for OCaml's bytecode interpreter
that does (by copying the bytecode stack).

>> In the context of general-purpose programming, Mathematica's
>> standard library is grossly deficient. For example, you cannot
>> create efficient implementations of any tree-based data structures
>> (RB, AVL, finger trees etc.).
> 
> Ah, this is why I chose your article for a response!
> 
> What primitives are missing from Mathematica that cause this
> deficiency? Is it that ordered pairs (like CONS cells, or Java's
> arrays of class Object *1*) are entirely missing?

Arrays are Mathematica's primitive data type and it calls them
(confusingly) "List"s.

> Or that you *can* make ordered pairs if you want,

Yes, this is an ordered pair in Mathematica:

  {1, 2}

> but the storage and processing 
> overhead per pair is much larger than in Lisp, perhaps as bad as if
> you had to use a Java instance of class Vector for *each* ordered
> pair, and go through the entire object-oriented dispatch every time
> you did the equivalent of CAR or CDR or SETF thereof?

Storage isn't a problem (Mathematica is actually quite good in this respect)
but processing is unbelievably slow in Mathematica. The reason is that it
is not your usual interpreted language, where you have an evaluator of the
form:

  val eval : state -> expr -> state * value

i.e. it interprets expressions into values.

In Mathematica, the whole language is built around a giant term rewriter:

  val rewrite : rules -> expr -> rules * expr

i.e. it rewrites expressions to create more expressions and there are never
any values.

When you write:

  Function[{n}, n+2][3]

Mathematica literally goes through the reduction steps in evaluating your
program:

  Function[{n}, n+2][3]
  With[{n=3}, n+2]
  3+2
  5

Mathematica goes to great lengths to evaluate and reduce in optimal order
but, at the end of the day, all this buys you is the performance of a
term-level interpreter (the slowest kind of language interpreter you can
get!).
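The cost model is easy to see in miniature. Below is a toy term rewriter (a sketch only, not Mathematica's actual algorithm; the term encoding and the two arithmetic rules are made up for illustration): every reduction step walks and rebuilds the expression tree, so even trivial arithmetic pays full tree-traversal overhead on each step.

```javascript
// Toy term rewriter. Terms are numbers or nested arrays like
// ["Plus", a, b] / ["Times", a, b]. One rewrite step either reduces a
// node whose arguments are already numbers, or rewrites one subterm.
function rewriteOnce(expr) {
  if (!Array.isArray(expr)) return null;           // atoms are normal forms
  var head = expr[0], a = expr[1], b = expr[2];
  if (typeof a === "number" && typeof b === "number") {
    if (head === "Plus")  return a + b;            // delta rules
    if (head === "Times") return a * b;
  }
  for (var i = 1; i < expr.length; i++) {          // try subterms, left to right
    var r = rewriteOnce(expr[i]);
    if (r !== null) {
      var copy = expr.slice();                     // rebuild the spine each step
      copy[i] = r;
      return copy;
    }
  }
  return null;                                     // no rule applies: normal form
}

function normalize(expr) {
  // Repeat single steps to a fixpoint: one full traversal per step.
  for (var r = rewriteOnce(expr); r !== null; r = rewriteOnce(expr)) {
    expr = r;
  }
  return expr;
}

// Plus[Times[2,3], Plus[1,4]] -> Plus[6, Plus[1,4]] -> Plus[6,5] -> 11
console.log(normalize(["Plus", ["Times", 2, 3], ["Plus", 1, 4]]));
```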

Provided you stick to built-in functions like Fourier, your program will run
almost as quickly as C. But as soon as there is no built-in function
implementing the core of your program for you, you're screwed and
performance goes out the window and everything runs 1,000x slower.
Mathematica's Compile function can be a saviour here, increasing
performance to that of an interpreted bytecode. But it is not without its
problems, lack of support for recursion being the most important one IMHO.

> *1* Something like this perhaps:
>   public static Object[] CONS(Object car, Object cdr) {
>     Object[] newCell = new Object[2];
>     newCell[0] = car;
>     newCell[1] = cdr;
>     return newCell;
>   }
>   public static Object CAR(Object cell) { return cell[0]; }
>   public static Object CDR(Object cell) { return cell[1]; }

Yes. That could be written:

  Cons[car_, cdr_] := {car, cdr}

But Mathematica's equivalent of cons cells are arrays of objects, so they
can store an arbitrary number of elements contiguously.

Speaking of which, why doesn't Lisp do that?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: David Formosa (aka ? the Platypus)
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <slrnfqtojh.tjo.dformosa@localhost.localdomain>
On Sun, 10 Feb 2008 10:26:42 +0000, Jon Harrop <······@jdh30.plus.com>
wrote: 

[...]

> Mathematica does support recursion but its Compile function (that compiles
> to an interpreted bytecode) does not support recursive anonymous functions.
>
> Actually, I can't think of any other languages that support recursive
> anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
> Haskell do AFAIK.

You can do anonymous recursion in Lisp and Scheme via letrec.  Any
language where you can implement the Y-combinator you can get
anonymous recursion.
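To make that concrete, here is the combinator trick in JavaScript (chosen because it appears later in this thread). JavaScript is strictly evaluated, so the sketch below uses the eta-expanded variant (the Z-combinator) rather than the plain Y-combinator; the recursive knot is tied from outside, and the factorial body never refers to itself by any top-level name.

```javascript
// Z-combinator: fixed-point operator for a strict language.
var Z = function (f) {
  return (function (x) { return f(function (v) { return x(x)(v); }); })
         (function (x) { return f(function (v) { return x(x)(v); }); });
};

// Factorial written without the function naming itself; `rec` is just a
// parameter that Z binds to the function's own fixed point.
var fact = Z(function (rec) {
  return function (n) { return n < 2 ? 1 : n * rec(n - 1); };
});

console.log(fact(5));   // 120
```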
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87prv5c4re.fsf@zeekat.nl>
"David Formosa (aka ? the Platypus)" <········@usyd.edu.au> writes:

> On Sun, 10 Feb 2008 10:26:42 +0000, Jon Harrop <······@jdh30.plus.com>
> wrote: 
>> Actually, I can't think of any other languages that support recursive
>> anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
>> Haskell do AFAIK.
>
> You can do anonymous recursion in Lisp and Scheme via letrec.  Any
> language where you can implement the Y-combinator you can get
> anonymous recursion.

FTR:

One language I know that directly supports anonymous recursion (without
Y-combinator) is JavaScript:

var fib = function(i) {
  if (i == 0 || i == 1) return 1;
  return (arguments.callee.call(null,i-1) + arguments.callee.call(null,i-2));
};

-- 
Joost Diepenmaat | blog: http://joost.zeekat.nl/ | work: http://zeekat.nl/
From: Ken Tilton
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <47af6a0f$0$15189$607ed4bc@cv.net>
Joost Diepenmaat wrote:
> "David Formosa (aka ? the Platypus)" <········@usyd.edu.au> writes:
> 
> 
>>On Sun, 10 Feb 2008 10:26:42 +0000, Jon Harrop <······@jdh30.plus.com>
>>wrote: 
>>
>>>Actually, I can't think of any other languages that support recursive
>>>anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
>>>Haskell do AFAIK.
>>
>>You can do anonymous recursion in Lisp and Scheme via letrec.  Any
>>language where you can implement the Y-combinator you can get
>>anonymous recursion.
> 
> 
> FTR:
> 
> One language I know that directly supports anonymous recursion (without
> Y-combinator) is JavaScript:
> 
> var fib = function(i) {
>   if (i == 0 || i == 1) return 1;
>   return (arguments.callee.call(null,i-1) + arguments.callee.call(null,i-2));
> };
> 

Maybe I should do Cells/JS. Flapjax does FRP, but really badly, with 
"lifting".

The subtext here is that a port of my Algebra software to JS lets me 
distribute via the Web /and/ push the ship date past my shuffle-off 
date. Sweet!

kenny

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/

"In the morning, hear the Way;
  in the evening, die content!"
                     -- Confucius
From: David Formosa (aka ? the Platypus)
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <slrnfqunc9.tjo.dformosa@localhost.localdomain>
["Followup-To:" header set to comp.lang.lisp.]
On Sun, 10 Feb 2008 14:36:37 +0100, Joost Diepenmaat <·····@zeekat.nl> wrote:
[...]
> One language I know that directly supports anonymous recursion (without
> Y-combinator) is JavaScript:
>
> var fib = function(i) {
>   if (i == 0 || i == 1) return 1;
>   return (arguments.callee.call(null,i-1) + arguments.callee.call(null,i-2));
>};

Likewise, Perl supports it.

my $fib;
$fib = sub {
  my $i = shift;
  return 1 if ($i <= 1);
  return $fib->($i-2) + $fib->($i-1);
};
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <1hjvq3ph09fo5vgcuqqceh974vdm6cjn2h@4ax.com>
On Sun, 10 Feb 2008 11:23:32 GMT, "David Formosa (aka ? the Platypus)"
<········@usyd.edu.au> wrote:

>On Sun, 10 Feb 2008 10:26:42 +0000, Jon Harrop <······@jdh30.plus.com>
>wrote: 
>
>[...]
>
>> Mathematica does support recursion but its Compile function (that compiles
>> to an interpreted bytecode) does not support recursive anonymous functions.
>>
>> Actually, I can't think of any other languages that support recursive
>> anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
>> Haskell do AFAIK.
>
>You can do anonymous recursion in Lisp and Scheme via letrec.  Any
>language where you can implement the Y-combinator you can get
>anonymous recursion.

Lisp doesn't have letrec - you use labels instead.  And technically
the functions are not anonymous because the forms require you to bind
or name them.  A function has to be named or symbol-bound to be
recursive - otherwise it has no way to refer to itself.  Ditto in most other
languages that support the equivalent of lambda.

I don't know Mathematica so I'm not sure what Jon really meant by
"recursive anonymous".  I can imagine having special syntax to
recursively call the current function without specifying it by name
... but it would be senseless fluff and I've never seen it done.

George
--
for email reply remove "/" from address
From: Rob Warnock
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <F4mdnWlBPvorkC3anZ2dnUVZ_ryqnZ2d@speakeasy.net>
George Neuner  <·········@/comcast.net> wrote:
+---------------
| "David Formosa (aka ? the Platypus)" <········@usyd.edu.au> wrote:
| >Jon Harrop <······@jdh30.plus.com> wrote: 
| >> Actually, I can't think of any other languages that support recursive
| >> anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
| >> Haskell do AFAIK.
| >
| >You can do anonymous recursion in Lisp and Scheme via letrec.  Any
| >language where you can implement the Y-combinator you can get
| >anonymous recursion.
| 
| Lisp doesn't have letrec - you use labels instead.  And technically
| the functions are not anonymous because the forms require you to bind
| or name them.  A function has to be named or symbol bound to be
| recursive - else no way to refer to itself.
+---------------

Well, true, but you can keep the internal name from escaping by
returning the function value, making it effectively "anonymous":

    > (funcall
       (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)
       5)

    120
    > 

Or, unfolded to show that the function really *is* "anonymous":

    > (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)

    #<Interpreted Function (LABELS SELF) {48987EE1}>
    > (fboundp 'self)

    NIL
    > (funcall ** 5)

    120
    > (funcall *** 10)

    3628800
    >

Perhaps David Formosa was referring to this sort of thing...?


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <57s1r3166e52nb1cq6i40re82m6se4tef6@4ax.com>
On Mon, 11 Feb 2008 02:56:22 -0600, ····@rpw3.org (Rob Warnock) wrote:

>George Neuner  <·········@/comcast.net> wrote:
>+---------------
>| "David Formosa (aka ? the Platypus)" <········@usyd.edu.au> wrote:
>| >Jon Harrop <······@jdh30.plus.com> wrote: 
>| >> Actually, I can't think of any other languages that support recursive
>| >> anonymous functions anyway: none of Lisp, Scheme, SML, OCaml, F# and
>| >> Haskell do AFAIK.
>| >
>| >You can do anonymous recursion in Lisp and Scheme via letrec.  Any
>| >language where you can implement the Y-combinator you can get
>| >anonymous recursion.
>| 
>| Lisp doesn't have letrec - you use labels instead.  And technically
>| the functions are not anonymous because the forms require you to bind
>| or name them.  A function has to be named or symbol bound to be
>| recursive - else no way to refer to itself.
>+---------------
>
>Well, true, but you can keep the internal name from escaping by
>returning the function value, making it effectively "anonymous":
>
>    > (funcall
>       (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)
>       5)
>
>    120
>    > 

Yes, I suppose that is anonymous in the way Jon intended.

It's a little more verbose in Scheme:

 ((lambda (x)
   (let self ((x x)) 
     (if (< x 2) 1 (* x (self (- x 1))))))
  5)
 120

but I think the Scheme makes it a little clearer that there is a named
self referential function in an anonymous wrapper.  

George
--
for email reply remove "/" from address
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r20kq5dp95dc2@corp.supernews.com>
George Neuner wrote:
> On Mon, 11 Feb 2008 02:56:22 -0600, ····@rpw3.org (Rob Warnock) wrote:
>>    > (funcall
>>       (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)
>>       5)
>>
>>    120
>>    > 
> 
> Yes, I suppose that is anonymous in the way Jon intended.

No, I do not consider a function explicitly called "self" to be
anonymous. :-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: John Thingstad
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <op.t6d7iynwut4oq5@pandora.alfanett.no>
På Tue, 12 Feb 2008 03:17:33 +0100, skrev Jon Harrop  
<······@jdh30.plus.com>:

> George Neuner wrote:
>> On Mon, 11 Feb 2008 02:56:22 -0600, ····@rpw3.org (Rob Warnock) wrote:
>>>    > (funcall
>>>       (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)
>>>       5)
>>>
>>>    120
>>>    >
>>
>> Yes, I suppose that is anonymous in the way Jon intended.
>
> No, I do not consider a function explicitly called "self" to be
> anonymous. :-)
>

But as was demonstrated, the name self is only known to itself.
Labels has lexical scope so the name isn't exported.

--------------
John Thingstad
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r3isu2gjprm5d@corp.supernews.com>
John Thingstad wrote:
> But as was demonstrated, the name self is only known to itself.
> Labels has lexical scope so the name isn't exported.

But the name is imported, so the onus is on the author to ensure that the
name "self" is not otherwise used in the body of the function. That's why
this isn't an anonymous function but is, instead, a named function with the
name "self".

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: George Neuner
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <n5u3r3dhrj3b6s44ap06i0et2v5nacvm4u@4ax.com>
On Tue, 12 Feb 2008 16:35:17 +0000, Jon Harrop <······@jdh30.plus.com>
wrote:

>John Thingstad wrote:
>> But as was demonstrated, the name self is only known to itself.
>> Labels has lexical scope so the name isn't exported.
>
>But the name is imported, so the onus is on the author to ensure that the
>name "self" is not otherwise used in the body of the function. That's why
>this isn't an anonymous function but is, instead, a named function with the
>name "self".

Now I don't understand.  You said previously 

:An interesting quirk [of Mathematica] is that the function itself
:is called #0.  So you can do recursion without having to bind to
:a name, e.g. factorial:
:
:  If[#1==0, 1, #1 #0[#1-1]]&

So "#0" is a lexically scoped alias for the name of the function and
the programmer has to ensure that it is not otherwise used in the
body.  To me that is not anonymity.

The only difference I see vs Rob's examples and my own ... and perhaps
this is what you mean ... is that the symbol #0 is automagically bound
to the current function.  All that means to me is that there is
another reserved word I can't use as a name.

George
--
for email reply remove "/" from address
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r3vsfd3bn9lca@corp.supernews.com>
George Neuner wrote:
> The only difference I see vs Rob's examples and my own ... and perhaps
> this is what you mean ... is that the symbol #0 is automagically bound
> to the current function.  All that means to me is that there is
> another reserved word I can't use as a name.

Exactly.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Pascal J. Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <7cabm5p5ge.fsf@pbourguignon.anevia.com>
Jon Harrop <······@jdh30.plus.com> writes:

> John Thingstad wrote:
>> But as was demonstrated, the name self is only known to itself.
>> Labels has lexical scope so the name isn't exported.
>
> But the name is imported, so the onus is on the author to ensure that the
> name "self" is not otherwise used in the body of the function. That's why
> this isn't an anonymous function but is, instead, a named function with the
> name "self".

What about this:

(defmacro anonymous (arguments &body declarations-and-body)
  (let ((noname (gensym)))
    `(labels ((,noname ,arguments
                (flet ((recursive-call (&rest arguments) (apply (function ,noname) arguments)))
                  (locally
                      ,@declarations-and-body))))
       (function ,noname))))

(funcall (anonymous (x)
                    (if (< x 1)
                        1
                        (* x (recursive-call (1- x))))) 10)
--> 3628800


-- 
__Pascal Bourguignon__
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r60sadp44td8f@corp.supernews.com>
Pascal J. Bourguignon wrote:
> What about this:
> 
> (defmacro anonymous (arguments &body declarations-and-body)
>   (let ((noname (gensym)))
>     `(labels ((,noname ,arguments
>                 (flet ((recursive-call (&rest arguments) (apply (function
>                 ,noname) arguments)))
>                   (locally
>                       ,@declarations-and-body))))
>        (function ,noname))))
> 
> (funcall (anonymous (x)
>                     (if (< x 1)
>                         1
>                         (* x (recursive-call (1- x))))) 10)
> --> 3628800

Yep, that's fine by me.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Rob Warnock
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <ho2dnQY1mbW0GSzanZ2dnUVZ_uKpnZ2d@speakeasy.net>
George Neuner  <·········@/comcast.net> wrote:
+---------------
| ····@rpw3.org (Rob Warnock) wrote:
| >Well, true, but you can keep the internal name from escaping by
| >returning the function value, making it effectively "anonymous":
| >
| >    > (funcall
| >       (labels ((self (x) (if (< x 2) 1 (* x (self (1- x)))))) #'self)
| >       5)
| >
| >    120
| >    > 
| 
| Yes, I suppose that is anonymous in the way John intended.
| 
| It's a little more verbose in Scheme:
| 
|  ((lambda (x)
|    (let self ((x x)) 
|      (if (< x 2) 1 (* x (self (- x 1))))))
|   5)
|  120
| 
| but I think the Scheme makes it a little clearer that there is a named
| self referential function in an anonymous wrapper.  
+---------------

That's using the "named LET", though you also can do it in Scheme
almost exactly the same way I did in CL, by using LETREC instead
of LABELS and leaving off the FUNCALL:

    > ((letrec ((self (lambda (x) (if (< x 2) 1 (* x (self (- x 1)))))))
	 self)
       5)
    120
    > 


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13r09qcprbh7md9@corp.supernews.com>
George Neuner wrote:
> I don't know Mathematica so I'm not sure what Jon really meant by
> "recursive anonymous".  I can imagine having special syntax to
> recursively call the current function without specifying it by name
> ... but it would be senseless fluff and I've never seen it done.

Mathematica has a notation for functions:

  #1 + 3 &

denoting something like:

  (fun #1 -> #1 + 3)

where the arguments are labelled #1 .. #n. An interesting quirk is that the
function itself is called #0. So you can do recursion without having to
bind to a name, e.g. factorial:

  If[#1==0, 1, #1 #0[#1-1]]&

Obviously binding to "self" or using external recursion like a y combinator
is not enlightening and you can do that in any language.

It is this #0 that Mathematica's Compile function barfs on.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Barb Knox
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <see-3C3FD5.09522611022008@lust.ihug.co.nz>
In article <···············@corp.supernews.com>,
 Jon Harrop <······@jdh30.plus.com> wrote:
[SNIP]

> Mathematica's equivalent of cons cells are arrays of objects, so they
> can store an arbitrary number of elements contiguously.
> 
> Speaking of which, why doesn't Lisp do that?

Lisps implemented with "cdr-coding" do in fact store lists contiguously 
in most cases, and even allow pointing into the middle of a contiguous 
list.  This is all managed behind the scenes by cons and the GC.
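A toy model of the same idea (an assumed layout, invented here for illustration; real cdr-coded Lisps pack the code into tag bits rather than a whole field):

```javascript
// Toy cdr-coding model. Cells live in one flat array; a cell's cdr is
// either the sentinel NEXT ("my cdr is the cell right after me in
// memory") or an explicit cell index / NIL. A contiguously allocated
// list therefore still behaves like a chain of cons cells, and a
// pointer into the middle of the list is just a larger index.
var NEXT = -1, NIL = -2;
var heap = [];                       // each slot: { car: value, cdr: code }

function consList(values) {          // allocate a whole list contiguously
  var start = heap.length;
  for (var i = 0; i < values.length; i++) {
    var last = (i === values.length - 1);
    heap.push({ car: values[i], cdr: last ? NIL : NEXT });
  }
  return values.length ? start : NIL;
}
function car(p) { return heap[p].car; }
function cdr(p) {
  var c = heap[p].cdr;
  return c === NEXT ? p + 1 : c;     // contiguous case: just step forward
}
function toArray(p) {
  var out = [];
  for (; p !== NIL; p = cdr(p)) out.push(car(p));
  return out;
}

var xs = consList([1, 2, 3]);
console.log(toArray(xs));        // [ 1, 2, 3 ]
console.log(toArray(cdr(xs)));   // [ 2, 3 ] -- a pointer into the middle
```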

-- 
---------------------------
|  BBB                b    \     Barbara at LivingHistory stop co stop uk
|  B  B   aa     rrr  b     |
|  BBB   a  a   r     bbb   |    Quidquid latine dictum sit,
|  B  B  a  a   r     b  b  |    altum viditur.
|  BBB    aa a  r     bbb   |   
-----------------------------
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb15-002@yahoo.com>
> From: Jon Harrop <······@jdh30.plus.com>
> Algebraic datatypes are found in MLs (like OCaml and F#) and
> Haskell. They are composed of sum and product types:
>   type t =
>     | A of int * string
>     | B of bool * float * char
>     | C
> The choice A|B|C is a sum type: a value is any one of those constructors.
> The arguments "bool * float * char" are a product type: a value must
> contain all of those fields.

Ah, the original Boolean trick of overloading addition and
multiplication symbols (plus and asterisk) to actually mean union
and intersection instead.
Common Lisp has the same thing, except without overloading.
Referring to CLtL (Steele), section 4.4 (page 44) "Type Specifiers
That Combine".
 (and type1 type2 ...) = intersection
 (or type1 type2 ...) = union
(Too lazy to find the same material in the HyperSpec today.)
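For comparison, the sum side of Jon's example can be mimicked in a dynamically typed setting with tagged records (a sketch in JavaScript, which appears elsewhere in this thread; the constructor names mirror Jon's type t). The difference static checking makes is precisely that the default branch below only fires at run time:

```javascript
// Dynamic sketch of: type t = A of int * string | B of bool * float * char | C
// The tag field plays the role of the constructor.
function A(i, s)    { return { tag: "A", i: i, s: s }; }
function B(b, f, c) { return { tag: "B", b: b, f: f, c: c }; }
var C = { tag: "C" };            // nullary constructor: a single shared value

function describe(t) {
  switch (t.tag) {               // "pattern match" by dispatching on the tag
    case "A": return "A(" + t.i + ", " + t.s + ")";
    case "B": return "B(" + t.b + ", " + t.f + ", " + t.c + ")";
    case "C": return "C";
    default:  throw new Error("not a t: " + t.tag);  // caught only at run time
  }
}

console.log(describe(A(1, "one")));  // A(1, one)
console.log(describe(C));            // C
```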

> Mathematica lacks static type checking

So there's no way to assert a fact about some data value somewhere
in your code and have the compiler warn you if that fact can't be
guaranteed based on what it knew earlier in the same code?

> Actually, I can't think of any other languages that support
> recursive anonymous functions anyway: none of Lisp, Scheme, SML,
> OCaml, F# and Haskell do AFAIK.

OK, I'm taking that as a challenge!! Is there any way in Common
Lisp that a recursive anonymous function can be defined. I'm pretty
sure it's possible to use an uninterned symbol as a name of a
function, which in effect makes it anonymous. I'll use that as my
fallback if I can't do it *right* where there's no symbol
whatsoever used as a local "name" of a function. If Lisp were
truly interpreted, with the CONS structure kept for all time,
then it'd be trivial: Write an anonymous function that calls another function,
  (let ((tmp (function (lambda (x) ... (bar ...) ...)))
then RPLACA the name of that other function with the actual lambda
expression from the main definition. Here's a crude SETQ version,
writing the lambda expression directly:
(setq anon-pre-copy-tree
  '(lambda (tree)
     (if (atom tree) tree
       (cons (bar (car tree)) (bar (cdr tree))))))
(setq *print-circle* t)
(setq anon-recurse-copy-tree
  (nsubst anon-pre-copy-tree 'bar anon-pre-copy-tree))
#1=(LAMBDA (TREE)
     (IF (ATOM TREE) TREE (CONS (#1# (CAR TREE)) (#1# (CDR TREE)))))
(setq foo (list 1 2 3 (list 4 5) 6))
(eval (list anon-recurse-copy-tree 'foo))
Illegal instruction (core dumped)
2304 -rw-------   1 rem  user  2342912 Feb 12 09:36 lisp.core
That was in CMU Common Lisp 18b
If anybody is curious, I moved it to here temporarily:
<http://www.rawbw.com/~rem/NewPub/lisp.core>

Is there a version of Common Lisp where that would work??

Hmm, maybe I should write my own version of EVAL that works only
the old-fashioned way (by dynamic/special bindings, no lexical
stuff, and no JIT compilation, just recurse directly within EVAL
while traversing the tree of code to be executed) to avoid the bug
in CMUCL's version of EVAL?

> Callcc is a lot more like forking than threads.

I've been under the impression that forking was a special case of
multiple threads, where a single thread spawns two continuation
threads, taking opposite paths from some branch point, and the
original thread then waits for both to complete before combining
their results and proceeding as a single thread.

> Arrays are Mathematica's primitive data type and it calls them
> (confusingly) "List"s.

Well they *are* lists i.e. vectors, they just aren't linked lists,
they're contiguous lists, which have some advantages and some
disadvantages compared with linked lists. Can elements of
Mathematica's contiguous lists be contiguous lists (arrays)
themselves?

> But Mathematica's equivalent of cons cells are arrays of objects,
> so they can store an arbitrary number of elements contiguously.
> Speaking of which, why doesn't Lisp do that?

No, you're wrong. Mathematica doesn't have an exact equivalent of
CONS cells, i.e. ordered pairs of a distinct type different from
any multi-pointer object. Lisp has the equivalent of Mathematica's
"lists", namely one-dimensional general arrays. Lisp has *both*
kinds of objects, one-dimensional general arrays which can hold any
number of pointers, and CONS cells which are specialized to have
exactly two pointers and to be more efficient than a two-element
general array. CONS cells are *not* a sub-type of general arrays,
whereas a list of two elements in Mathematica, as you cite, *is* a
sub-type of multi-element lists.

Now it would be possible in Lisp to use two-element general arrays
as if they were CONS cells: Have a new version of READ that generated them
where CONS cells would normally be generated, a new version of
PRINT that printed as if they were CONS cells, and a new version of
EVAL that traversed them just as the regular EVAL traversed regular
CONS cells. That would be a cute hack, even if of no practical
value.
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13rej2d5122n521@corp.supernews.com>
Robert Maas, see http://tinyurl.com/uh3t wrote:
> From: Jon Harrop <······@jdh30.plus.com>
>> Mathematica lacks static type checking
> 
> So there's no way to assert a fact about some data value somewhere
> in your code and have the compiler warn you if that fact can't be
> guaranteed based on what it knew earlier in the same code?

You only have run-time assertions in Mathematica.

>> Callcc is a lot more like forking than threads.
> 
> I've been under the impression that forking was a special case of
> multiple threads, where a single thread spawns two continuation
> threads, taking opposite paths from some branch point, and the
> original thread then waits for both to complete before combining
> their results and proceeding as a single thread.

The difference is that with forked threads the OS can lazily copy the shared
heap as it is mutated. So if you create a big immutable data structure and
then fork two threads that use it, you get sharing of the data structure
for free. If you use vanilla threads then you must either work with
explicit sharing or do a deep copy of the data structure.

>> Arrays are Mathematica's primitive data type and it calls them
>> (confusingly) "List"s.
> 
> Well they *are* lists i.e. vectors, they just aren't linked lists,
> they're contiguous lists, which have some advantages and some
> disadvantages compared with linked lists. Can elements of
> Mathematica's contiguous lists be contiguous lists (arrays)
> themselves?

Yes, that's exactly how it works.

>> But Mathematica's equivalent of cons cells are arrays of objects,
>> so they can store an arbitrary number of elements contiguously.
>> Speaking of which, why doesn't Lisp do that?
> 
> No, you're wrong. Mathematica doesn't have an exact equivalent of
> CONS cells, i.e. ordered pairs of a distinct type different from
> any multi-pointer object. Lisp has the equivalent of Mathematica's
> "lists", namely one-dimensional general arrays. Lisp has *both*
> kinds of objects, one-dimensional general arrays which can hold any
> number of pointers, and CONS cells which are specialized to have
> exactly two pointers and to be more efficient than a two-element
> general array. CONS cells are *not* a sub-type of general arrays,
> whereas a list of two elements in Mathematica, as you cite, *is* a
> sub-type of multi-element lists.

Ok, yes.

> Now it would be possible in Lisp to use two-element general arrays
> as if they were CONS cells: Have a new version of READ that generated them
> where CONS cells would normally be generated, a new version of
> PRINT that printed as if they were CONS cells, and a new version of
> EVAL that traversed them just as the regular EVAL traversed regular
> CONS cells. That would be a cute hack, even if of no practical
> value.

Assuming that is possible, why didn't Scheme do it as part of "cutting down"
Lisp?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Joachim Durchholz
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <1203198531.7648.7.camel@kurier>
Am Freitag, den 15.02.2008, 23:17 -0800 schrieb Robert Maas, see
http://tinyurl.com/uh3t:
> Is there any way in Common
> Lisp that a recursive anonymous function can be defined?

Actually I'm wondering what's the point of doing so. To make a function
recursive, it somehow has to refer to itself, which is easiest to
achieve using a name... that works and has no problems, at least not for the
programmer.
Of course, for people who want to nail down the exact semantics of a
recursive function, it may be advantageous to have recursion without
naming - but that's just to simplify proofs.

Or did I overlook something and having a way to define an anonymous
recursive function would be useful to a programmer?
I'm also wondering: would it be useful to have anonymous mutually
recursive functions?

Regards,
Jo
From: Joost Diepenmaat
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <871w7c4k9p.fsf@zeekat.nl>
Joachim Durchholz <··@durchholz.org> writes:

> Am Freitag, den 15.02.2008, 23:17 -0800 schrieb Robert Maas, see
> http://tinyurl.com/uh3t:
>> Is there any way in Common
>> Lisp that a recursive anonymous function can be defined?
>
> Actually I'm wondering what's the point of doing so. To make a function
> recursive, it somehow has to refer to itself, which is easiest to
> achieve using a name... works and has no problems, at least not for the
> programmer.
> Of course, for people who want to nail down the exact semantics of a
> recursive function, it may be advantageous to have recursion without
> naming - but that's just to simplify proofs.
>
> Or did I overlook something and having a way to define an anonymous
> recursive function would be useful to a programmer?

IMHO it's mainly that giving names to functions *just* so that they can
refer to themselves is a /bit/ ugly. And in many languages you can't
refer to a name in the declaration (so you need something special, like
LABELS instead of (F)LET to do it, or declare the variable first and
then assign the function to it). Of course, if the function is
complicated giving it a name can be clarifying.

-- 
Joost Diepenmaat | blog: http://joost.zeekat.nl/ | work: http://zeekat.nl/
From: Jon Harrop
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <13rf2b39gpvfsa3@corp.supernews.com>
Joost Diepenmaat wrote:
> Joachim Durchholz <··@durchholz.org> writes:
>> Or did I overlook something and having a way to define an anonymous
>> recursive function would be useful to a programmer?
> 
> IMHO it's mainly that giving names to functions *just* so that they can
> refer to themselves is a /bit/ ugly.

and verbose. Exactly.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/products/?u
From: Pascal Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <87hcg8e88y.fsf@thalassa.informatimago.com>
Joachim Durchholz <··@durchholz.org> writes:

> Am Freitag, den 15.02.2008, 23:17 -0800 schrieb Robert Maas, see
> http://tinyurl.com/uh3t:
>> Is there any way in Common
>> Lisp that a recursive anonymous function can be defined?
>
> Actually I'm wondering what's the point of doing so. To make a function
> recursive, it somehow has to refer to itself, which is easiest to
> achieve using a name... works and has no problems, at least not for the
> programmer.
> Of course, for people who want to nail down the exact semantics of a
> recursive function, it may be advantageous to have recursion without
> naming - but that's just to simplify proofs.
>
> Or did I overlook something and having a way to define an anonymous
> recursive function would be useful to a programmer?
> I'm also wondering: would it be useful to have anonymous mutually
> recursive functions?

I don't know if it would be useful, but it certainly is possible and
trivial:

(defmacro bi-anonymouses ((arguments-1 &body declarations-and-body-1)
                          (arguments-2 &body declarations-and-body-2))
  "Defines a pair of anonymous recursive functions.
These functions can call each other recursively by calling
the local function RECURSIVE-CALL-THE-OTHER.
The first function is returned."
  (let ((noname-1 (gensym))
        (noname-2 (gensym)))
    `(labels ((,noname-1 ,arguments-1
                (flet ((recursive-call-the-other (&rest arguments)
                         (apply (function ,noname-2) arguments)))
                  (locally
                      ,@declarations-and-body-1)))
              (,noname-2 ,arguments-2
                (flet ((recursive-call-the-other (&rest arguments)
                         (apply (function ,noname-1) arguments)))
                  (locally
                      ,@declarations-and-body-2))))
       (function ,noname-1))))


(mapcar (bi-anonymouses ((x) (if (zerop x)
                                  t
                                  (recursive-call-the-other (1- x))))
                         ((x) (if (zerop x)
                                  nil
                                  (recursive-call-the-other (1- x)))))
        '(1 2 3 4))
--> (NIL T NIL T)



I don't see the point either.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

NOTE: The most fundamental particles in this product are held
together by a "gluing" force about which little is currently known
and whose adhesive power can therefore not be permanently
guaranteed.
From: Pascal Bourguignon
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <873as3m8v2.fsf@thalassa.informatimago.com>
·······@yahoo.com (Robert Maas, see http://tinyurl.com/uh3t) writes:
> [...]
>> tail recursion
>
> This is not as useful as it's hyped up to be. Sure if you write a
> factorial recursively to emulate iteration, a compiler can convert
> it back to the equivalent iterative algorithm to avoid stack
> overflow. Whooptie-doo. True recursion isn't tail recursion in the
> first place so you don't need this case handled specially. For
> example, the right way to compute factorial recursively is:
> 
> (defun prod (m n)
>   (if (< (+ m 1) n)
>       (let ((mid (floor (+ m n) 2)))
>         (* (prod m mid) (prod mid n)))
>       m))
> (defun fact (n) (prod 1 (+ 1 n)))
>
> The whole point of recursion is divide-and-conquer, especially if
> you have multiple processors which can run in parallel on sub-tasks.
> Always spawning one sub-task to do a trivial case and the other
> sub-task to do all the work except that one trivial case, whereby
> you actually compute the trivial case directly and tail-recurse on
> the all-the-rest, is outright stupid.

Very good!

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

In a World without Walls and Fences, 
who needs Windows and Gates?
From: Xah Lee
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <f8bbddc4-fe80-4c01-bf16-de303542601b@d70g2000hsb.googlegroups.com>
Robert Maas wrote:
« [on lisp's cons for list] So it's accessible to any random
programmer, rather than being accessible only to the wizards deep
inside the vendor. Is that a good or a bad thing? What are the
alternatives to the Lisp way, assuming you're going to have a linked-
list data type, with methods or functions that do ordinary things with
them such as traversing and mapping and reading and printing, in the
first place.  »

Robert, your previous post is lengthy, addressing many points.

Let's keep this message focused on just the cons of lisp. I am sorry
if i didn't reply to or address many parts and points of your
thoughtful message. Anyway, what i wrote in my previous message was
all in the context of lisp's cons, or intended to support my view
of lisp's cons.

It is surprisingly hard for me to explain to lispers how lisp's cons
is a problem. I have tried, and have written several essays that are
thousands of words long, especially recently. I don't know if you have
read them already. Some of the following are tangential, but related
to the general idea of high-level abstraction.

• Lisp's List Problem
 http://xahlee.org/emacs/lisp_list_problem.html

• Jargons And High Level Languages
 http://xahlee.org/emacs/jargons_high_level_lang.html

• Is Lisp's Objects Concept Necessary?
 http://xahlee.org/emacs/lisps_objects.html

Though, i don't doubt that i would have no problem at all conveying
the ideas of lisp's cons problem to, say, Perl, Python, PHP,
Javascript, or Java programmers.

----------------------------------------

Instead of mostly repeating myself, i was wondering if you'd give me
your opinion on some questions in the following. I think understanding
how you think would clarify our discussion. Possibly, we don't have a
disagreement to begin with.

My question is this: In Perl, PHP, Python, Javascript, these so-called
scripting languages, there's no cons as in lisp. What do you think of
that?

Do you think the lack of cons limits these languages' abilities or
domain in some way?

(note here, Perl, PHP, Python, Javascript, together with C, C++, Java,
all of them lacking cons, probably make up more than 90% of the
programming market, roughly any way you count it (by existing code
base, by number of programmers, by the money made from software
written in these).)

Thanks for the lengthy and well thought message.

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb20-001@yahoo.com>
> From: Xah Lee <····@xahlee.org>
> Let's keep this message focused on just the cons of lisp. ...
> It is surprisingly hard for me to explain how lisp cons is a
> problem to lispers.

I don't believe it *is* a problem. It's hard to explain what ain't so.

> Lisp's List Problem
> http://xahlee.org/emacs/lisp_list_problem.html

> lisp's ways of list is historical and drags in the implementation details.

No. It makes those details *available* for any application
programmer wishing to take advantage of that info. Anyone who
wishes may simply ignore those details.

In fact the *real* implementation details, i.e. how a single cell
of a linked list is constructed at the machine level, is *not*
specified, and application programmers don't need nor use that
secret implementation-dependent info.

> in today's high-level languages, one do not deal with memory
> allocation, linked list, types, etc ...

False. Some algorithms are appropriate for linked lists but not for
arrays, such as any that perform lots of pushing/popping at the
start of the list, while other algorithms are appropriate for
arrays but not for linked lists, such as any that do lots of random
access within the list. If the application programmer has no way
(in some particular programming language) to choose between the two
forms of a list, and can't even *know* which is provided by
default, then coding *any* of such algorithms is not worth even
trying, and that programming language sucks IMO.
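A minimal Common Lisp sketch of the distinction (my illustration, not
part of the original post): front insertion is constant-time on a
linked list, while indexed access is constant-time only on a vector:

```lisp
;; Front insertion on a linked list is O(1):
(defvar *stack* '())
(push 1 *stack*)                 ; *stack* is now (1)
(push 2 *stack*)                 ; *stack* is now (2 1)
(elt *stack* 1)                  ; => 1, but ELT must walk the list: O(n)

;; Indexed access on a vector is O(1):
(defvar *vec* (make-array 4 :initial-contents '(10 20 30 40)))
(aref *vec* 2)                   ; => 30, constant time
```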

Java provides java.util.AbstractSequentialList (like a linked list)
specialized to java.util.LinkedList (in fact a doubly-linked list),
and java.util.ArrayList (like an array). Do you consider Java to be
one of today's high-level languages?

> For example, in Python, a list is like this [a,b,c,...] and if one
> wants arbitrarily nested list (such as a matrix, tree, associative
> list, hash, dictionary, key'd-list, vector, array, map, sequence
> ...however one wants to call them), the programer simply write
> [a,b,c,...] where each of the a,b,c can be another list of the form
> [...]. Similarly in Javascript.

How is that any different (except nitpicking details of syntax) in
Lisp? In lisp you use parens instead of brackets, and you don't
need to reach for the comma key, you just bump the space bar which
is easier/faster.
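For instance (a sketch of my own, not taken from Xah's page), the
Python nested list [1, [2, 3], [4, [5]]] comes out in Lisp as:

```lisp
;; Same nesting as Python's [1, [2, 3], [4, [5]]]; only the
;; brackets and commas differ.
(defparameter *nested* '(1 (2 3) (4 (5))))
(second *nested*)                ; => (2 3)
```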

> a programer cannot program in lisp in a real world situation without
> having a good understanding of the cons.

In the other languages, a programmer can't write any code at all
that deals with lists, because the programmer has no idea whether
they are linked lists or arrays, hence can't code for either
assumption.

In Lisp, a programmer can deal with linked lists for appropriate
applications, and with arrays for other appropriate applications.
In Java, a programmer can likewise deal with java.util.LinkedList
or java.util.ArrayList. In the languages you cite, a programmer is
stuck with no choice available and no way to know what's given
hence no way to have the info needed to know how to write the
application.

> the reason for high-level lang being what they are is from all these
> little details.

Wrong. The reason for a high-level language is to offer the
programmer a choice of various common data types (arrays, linked
lists, hash tables, etc.), but hide the details of tag bits and
word size and other details of pointers within these "collections"
(Java jargon).

> a vector or array is simply a list with unchanging length

No. It's that plus the fact that it's laid out in consecutive RAM
so that it can take advantage of Random Accessing by index to
achieve guaranteed constant time index->value.
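A short sketch (my example) of what that buys you in Common Lisp:

```lisp
;; A vector occupies consecutive storage, so AREF is guaranteed
;; constant-time no matter which index is asked for.
(defparameter *v* (make-array 5 :initial-element 0))
(setf (aref *v* 3) 99)
(aref *v* 3)                     ; => 99
(length *v*)                     ; => 5; fixed unless :ADJUSTABLE is used
```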

> A hash is simply a list of element where each element is a list
> of 2 items, and further with the property that one have a
> function that quickly determines whether a key exist etc. (i.e.
> so-called constant time access).

Why does it have to be a list at all?? Why can't it *only* provide
the usually-constant-time key->value function, and not provide
sequential access at all?
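That is exactly how Common Lisp's own hash tables behave; a minimal
sketch (mine):

```lisp
;; MAKE-HASH-TABLE exposes only the key->value mapping: no list
;; structure, no ordering, no sequential access.
(defparameter *h* (make-hash-table :test #'equal))
(setf (gethash "k1" *h*) "v1")
(gethash "k1" *h*)               ; => "v1", T
(gethash "missing" *h*)          ; => NIL, NIL (second value flags absence)
```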

> A set is just a list that does not have identical elements.

It must somehow enforce that, unless only constant sets (defined at
compile time and *never* changed at runtime) are supported.
In math, all sets are constants. You can't add elements to a set
and have the *original* set now contain the additional elements.
What you said may be a valid definition of a mathematical set, but
it's not very useful for programming.
You've also begged the question whether the word "list" in this
context means an array or a linked-list.
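In Common Lisp the enforcement is done at insertion time; a minimal
sketch (my example) using ADJOIN on a linked list:

```lisp
;; ADJOIN returns the original list unchanged when the element is
;; already a member, so duplicates never enter the "set".
(defparameter *set* '(1 2 3))
(setf *set* (adjoin 2 *set*))    ; unchanged: (1 2 3)
(setf *set* (adjoin 4 *set*))    ; => (4 1 2 3)
```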

> A matrix is just a list with element where all elements are lists
> of the same length.

Same length as what? Answer: Same length as each other, but *not*
necessarily same length as the top-level list.

> A tree is simply a arbitrarily nested list... and so on.

OK. But same question as for sets, do you mean array or linked-list?

> But, one may ask, so if one wants a hash-table in Mathematica (or
> other high-level lang) that just have a list, how can one do it? For
> example, its a list of pairs like this {{k1,v1},{k2,v2},...} but how
> can i know that it'll have constant access time when i use the
> language's function to gets the elements of a list? Does it mean the
> programer has to implement it himself? The answer is that the compiler
> should automatically figure out and decide this transparently.

There's no way the compiler can figure out the runtime statistics
of traversing a collection, specifically figuring out whether
access is dominated by sequential access or random access, whether
updates are dominated by ins/del at start, ins/del at end, or
ins/del at random places in the middle. That's the job of the
programmer to declare the intent, such as by choosing a different
data type as appropriate for a given application.

> In case the compiler is theoretically not able to do so, then the
> language could have some construct such as:
> myhash={{k1,v1},{k2,v2},...};
> AssignListProperty(myhash,hash-like);

You've contradicted yourself. If the compiler provides explicit
support for both array and linked list, and if both utilities are
available at runtime, and the compiler does the right thing of
matching the explicit compiletime request to the corresponding
runtime utility, then both datatypes *are* supported, just like in
Lisp, and the programmer must say which he/she wants, just as in
Lisp. So you say both datatypes shouldn't be supported, and the
programmer shouldn't have to say which is wanted, then you say the
programmer will say which is wanted. Do you see the contradiction?

> Also, as a consequence of the cons thing, lisp creates many
> various ways to code a tree of a given shape. (i.e. the
> distinction of "proper list")

That's a good thing (in lisp). In your favorite Mathematica, the
only kind of tree you can get is nested lists, which you can get in
Lisp just the same. Unfortunately nested lists don't provide any
way to automatically rebalance the tree as new elements are added
to it. If the programmer adds that capability to the basic form of
nested lists, to make an AVL tree or red/black tree etc. etc., then
you have just as many ways to implement a tree in either language,
and the programmer (either the application programmer, or some
third-party package vendor) must explicitly code for that
particular kind of self-balancing tree, so the difference you
complain about isn't a difference at all.

> the whole source code of any source code is one single, deeply nested list.

Is that supposed to be English??
Before you re-post URLs pointing to stuff you wrote long ago which
you've archived in your personal Web site, why don't you proofread?

By the way, the parse tree of *any* high-level-language program is
a deeply nested list/tree. (The raw syntax of any such program is
just a sequence of characters, which requires a parser to turn into
anything meaningful, i.e. the parse tree.)

> Mathematica ... provide many facilities to manipulate such a
> form. I can, for example, get all the nodes at level n.

How is that of any general use?? I can't think of even once I've
ever wanted a nested list flattened down to the third level, for
example.

> Map a function to level n.

Ditto. Totally useless mapping utility.

> Map a function to just leafs (which is considered level -1 in
> Mathematica).

How do you define a "leaf", if the whole structure is just a bunch
of nested lists, and any *element* can be another list if wanted?
How is the mapping utility supposed to know the difference between
sub-lists which represent more nesting and sub-lists which are
supposed to be leafs themselves??

For example, suppose an application programmer has implemented an
AVL tree, where each node in the tree looks like one of these:
(FORK <depth> <leftSubtree> <rightSubtree>)
(LEAF <value>)
How is the built-in utility supposed to know that when it sees a
FORK node it's supposed to explore through only the third and
fourth sub-lists, while when it sees a LEAF node it's supposed to
*stop* there but use just the second sub-list, not use the whole
node itself? Answer: The built-in utility can't possibly know the
programmer's intent what leaves are supposed to be, hence the
utility is worthless.
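To make that concrete, here is a sketch (my code, using the FORK/LEAF
layout above) of the traversal only the application programmer could
write, since only he knows which positions hold subtrees:

```lisp
;; Only this function knows that a FORK carries its subtrees in
;; positions 3 and 4, and a LEAF its value in position 2.
(defun map-leaves (fn node)
  (ecase (first node)
    (leaf (list 'leaf (funcall fn (second node))))
    (fork (list 'fork (second node)
                (map-leaves fn (third node))
                (map-leaves fn (fourth node))))))

(map-leaves #'1+ '(fork 1 (leaf 10) (fork 0 (leaf 20) (leaf 30))))
;; => (FORK 1 (LEAF 11) (FORK 0 (LEAF 21) (LEAF 31)))
```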

> Dispite being a expert in trees, the lisp's cons business is
> truely a pain to deal with.

You need to proofread your English! You say there that lisp's cons
business is an expert in trees. Is that what you really meant to
say? Did you intend to say that *you* are an expert in trees?
That's not what you actually said. Please proofread and correct
your Web pages before re-posting links to them.

> To work with anything slightly nested, is a pain in Perl.

Yeah, Perl sucks when it comes to nested lists. Lisp doesn't suck.
You weaken your argument by pretending Perl and Lisp are the same,
and trying to use Perl's weakness as if a Lisp weakness.
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: the necessity of Lisp's Objects?
Date: 
Message-ID: <rem-2008feb20-002@yahoo.com>
> From: Xah Lee <····@xahlee.org>
(second part of reply:)
> Jargons And High Level Languages
> http://xahlee.org/emacs/jargons_high_level_lang.html

> ... a ideal high level language. ...
"a" should read "an". Please correct your Web page.

> my wishes are:
> The language will be absolutely high-level, meaning in particular:
   o The language's documentation, will not need to have mentioning any
   of the following words: pointer, reference, memory allocation, stacks,
   hash, cons cells, linked list, circular list.

Although "pointer" and "reference" with the specific meaning used
in C++ should be avoided, I see no reason to deny the application
programmer knowledge of the difference between linked records and
embedded/sequential records. When writing an algorithm to deal with
a very large amount of data that won't all fit in RAM at the same
time, forcing the programmer to deal with page faults and/or
explicitly-coded disk access to load data as-needed, it's essential
that the programmer have control over making sure that groups of
records are in contiguous memory (so that at most two page faults
are needed) rather than strung out over many pages (requiring as
many page faults as there are pages occupied by the different
blocks of data in the linked list).

It's considered implicit that when an object is instantiated, some
memory is allocated, and when it's either disposed (C++ style) or
garbage collected (Lisp/Java style) that memory is freed for other
subsequent use. It's not acceptable that programmers haven't the
slightest concept of how memory allocation is used by their
application, hence there's nothing the programmer can do to
alleviate memory leaks/bloat and other problems that might cause
extremely slow operation or an outright crash of the application.

Stacks are an essential part of many algorithms. Denying an
application programmer the use of built-in stack features forces
the programmer to emulate stacks via his own arrays, which is a
step backward toward coding in assembly language. I find that to be
unacceptable.

Consider for example the task of writing a recursive descent parser
without the ability to manipulate any stack whatsoever.
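In Common Lisp the stack comes built in; a minimal sketch (mine) of
the kind of bookkeeping such a parser needs:

```lisp
;; PUSH/POP give the stack directly; a recursive-descent parser
;; might keep its pending open delimiters this way.
(defvar *opens* '())
(push #\( *opens*)
(push #\[ *opens*)
(pop *opens*)                    ; => #\[, leaving only #\( on the stack
```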

Hashes are one of several ways to implement an associative array
(lookup table, mapping key to value). Sorted arrays (using binary
search to find a key), and balanced binary trees (using traversal
down left-or-right sub-tree to find a key), are two other ways.
Each technique has its own advantages and disadvantages. IMO any
application programmer should be allowed to choose the appropriate
technique for his particular application, which requires "hash"
etc. to be mentioned somewhere in the documentation.

Instead of "cons cells", would you accept the term "linked-list
cells", or perhaps "standard pairs", or "binary cells", or some
other term which avoids the Lisp jargon yet still indicates the
existence of such a cell with exactly two pointers to other objects??

Why not mention linked list? That is an essential ADT for many
applications. Do you plan to deny every application programmer that
built-in feature, or confuse the programmer by calling it something
else?

Circular lists are not common enough that they absolutely need to
be included in the documentation for the language. But occasionally
they are useful. So do you propose to make it clear what primitives
are available that might be used for building circular lists, or
deny any ability whatsoever to create any circular list?

   o The language will not have concept of binary bits, bit operator,
   bytes, etc. (see the optimization section below) (However, the
   language can (and should) support computing with arbitrary number
   basis, and when the basis is 2, the compiler should of course
   automatically map it to bits on a chip as its method to achieve speed)

What do you mean by "number basis"? Are you having trouble with
English, and you really meant to say "number base" i.e. "radix"?

I believe you're confusing two concepts:
- Number base used to represent arbitrary integers, whereby each
   position to the left is weighted base times the old weight.
- Arrays of enumeration objects, whereby each element in the array
   can hold just a fixed finite number of different objects but the
   elements all have the same weight.
For example, '49' represents forty-nine (seven times seven)
under the first concept (if base is ten), but merely represents the
vector (list) [Yellow, White] under the second concept (if the
enumeration is {Black|Brown|Red|Orange|Yellow|Green|Blue|Purple|Gray|White}).
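The first concept is the ordinary radix that Common Lisp already
supports; a quick sketch (my example):

```lisp
;; "Number base" in the radix sense: the same digit string denotes
;; different integers under different bases.
(parse-integer "49" :radix 10)   ; => 49
(parse-integer "111" :radix 2)   ; => 7
(format nil "~2R" 49)            ; => "110001", forty-nine in binary
```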

   o The language's computational model should be simple, high-level,
   mathematics based as possible, as opposed to being a model of some
   abstract machine, implementation, or computational model ...

In mathematics, everything is constant, only our meta-knowledge
*about* mathematical fact changes over time as we
discover/learn/prove additional facts. Such a mathematical model is
totally inappropriate for most data/processing tasks, where things
actually change over time, such as the number of refrigerators in
the warehouse decreases whenever one is purchased and increases
whenever a delivery from the wholesaler happens. It is completely
unacceptable to model this as a mathematical function of time,
where everything that is ever going to happen in the future is
already encompassed in a pre-established **FATE** function that we
can incrementally learn about but do nothing to affect.

   The language's syntax and semantics should be as much as consistent
   and regular as possible.

Agreed. Lisp is better than most languages. The basic structure of
the language is:
<list> ::= <atom>
         | <openParens> <list>* <closeParens>
About the only use of special characters is for commonly used types
of literal constants (such as strings/numbers/characters) and shorthand
to avoid repetitive typing (such as #' to denote wrapping FUNCTION
around something).

> For example lisp's syntax would be considered not acceptable
> here. (it has "' # ;" and various ad hoc irregularities)

' is shorthand for wrapping QUOTE around something.
Nobody is forcing you to use that!! It's a *convenience* for people
who would rather type fewer characters than (QUOTE (RED GREEN BLUE)).

# is a multi-purpose way to generate character constants and shorthand
for wrapping FUNCTION around something etc.
For example, nobody is stopping you from saying:
  (defconstant charComma (code-char 44))
and then using charComma instead of #\, throughout your program.
(I avoided the $ around the name of the global because you don't
 like special characters?)
Except for read-time or load-time execution (#. and #,
respectively), is there any other use of # that you feel is
absolutely needed the way Common Lisp is currently defined?
Surely feature conditionals (#+) can be replaced by simply
distributing different source files for each target environment.

> As another example, Common Lisp's multi-meaning names would be
> considered ah hoc irregular semantics here. (i.e. a symbol can be a
> variable as well as a function).

IMO this is an absurd complaint. An ordinary (not keyword) symbol
in Common Lisp has more than just two purposes:
- Can be bound to a function.
- Can be bound to a value.
- Can have a property list.
- Always has a print name, by which it can be read and printed.
Do you honestly feel that each of those four purposes should
require a different type of object?? If not, why should value and
function be split apart, whereby the same symbol can't have both??
Or do you propose even further that each symbol be allowed to have
exactly one of those first three purposes, never two?
Or do you propose that there are different kinds of symbols, with
each having a distinct syntax, for example if the first character
of the symbol's name is F then it can be used only as the name of a
function, and if the first character of the symbol's name is V then
it can be used only as something with a value, and if the first
character of the symbol's name is P then it can have only a
property list?
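For reference, a minimal sketch (my example) showing one symbol
serving several of those purposes at once:

```lisp
;; One symbol, several slots: value, function, property list, and
;; print name all coexist on the same name.
(defvar item 42)                 ; value slot
(defun item (x) (* x 2))         ; function slot
(setf (get 'item :color) 'red)   ; property list

(symbol-value 'item)             ; => 42
(funcall #'item 10)              ; => 20
(get 'item :color)               ; => RED
(symbol-name 'item)              ; => "ITEM"
```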

   o The language's variables must not have types. Its values can have
   types.

So you'd forbid all compiler declarations that optimize code that
deals with variables that never have any but a single type of
value? So you'd force all Lisp code to run as slow as necessary to
be capable of handling all possible data types even where only a
single data type will ever be used?
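The declarations in question look like this (a sketch of my own; the
actual speedup depends on the implementation):

```lisp
;; Declaring the variables FIXNUM lets a native-code compiler
;; emit unboxed integer arithmetic for the loop.
(defun sum-to (n)
  (declare (type fixnum n)
           (optimize (speed 3)))
  (let ((acc 0))
    (declare (type fixnum acc))
    (dotimes (i n acc)
      (incf acc i))))

(sum-to 10)                      ; => 45
```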

   o The numbers in the language can have the following types: integer,
   rational, real, complex number. (and possibly extension from these,
   such as algebraic number, etc.)

That's not possible. It requires an infinite amount of data to
specify a single arbitrary real number, and it's impossible for
anything in the physical world to store even one such value.
You really need to re-think your essay.

Will you accept as an alternative a number representation which
gives an interval (upper and lower bound) on a real number, thereby
specifying an uncountably infinite set of possible reals within
that interval? Each endpoint of that interval could then be a
rational?