From: Alexander Baranovsky
Subject: LISPPA
Date: 
Message-ID: <27d7ccd8.0404111529.4f4fbf63@posting.google.com>
LISPPA (List Processing based on the Polymorphic Arrays) technology
provides a way to process dynamic data structures (lists, trees and
more) without using pointers. LISPPA uses polymorphic arrays as the
basis of its data representation.

LISPPA considerably extends the applicability of imperative
programming languages to symbolic computations. So now you can use
the Pascal, Basic or C languages in the "Lisp data domain".

LISPPA has been implemented in the paxScript scripting engine:

http://www.paxscript.com

Programs written in the paxScript languages paxPascal and paxBasic
illustrate the use of LISPPA in mechanical theorem proving and
symbolic differentiation.

To learn more about LISPPA technology, please visit

http://www.virtlabs.com.ua/paxscript/lisppa.htm

Thank you.

Alexander Baranovsky,
VIRT Laboratory
··@cable.netlux.org
www.paxscript.com

From: William Bland
Subject: Re: LISPPA
Date: 
Message-ID: <pan.2004.04.11.23.54.34.360495@abstractnonsense.com>
On Sun, 11 Apr 2004 16:29:52 -0700, Alexander Baranovsky wrote:

> To learn more about LISPPA technology, please visit
> 
> http://www.virtlabs.com.ua/paxscript/lisppa.htm

Perhaps I'm missing something.  I did read the web page, and I just
kept thinking "why?"...  So, why?

I'm sorry, I really don't like knocking work that people have put
a lot of effort into - I just can't see the point.

Best wishes,
		Bill.
-- 
Dr. William Bland                          www.abstractnonsense.com
Computer Programmer                           awksedgrep (Yahoo IM)
Any sufficiently advanced Emacs user is indistinguishable from magic
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404120144.5b762bbb@posting.google.com>
> Perhaps I'm missing something.  I did read the web page, and I just
> kept thinking "why?"...  So, why?
> 
> I'm sorry, I really don't like knocking work that people have put
> a lot of effort into - I just can't see the point.

I just wanted to fit well-known imperative programming languages for
symbolic computations, and LISPPA is a solution. It uses a simple
concept (the polymorphic array) as the basis of its data
representation, and the LISPPA algorithms for working with dynamic
data structures (lists, trees and more) are very laconic and uniform.
At the same time, LISPPA keeps the main advantages of imperative
programming over the functional approach: effectiveness (random array
access, strict typing, etc.).

Using LISPPA, you might use the C language directly (writing laconic
and effective programs without any pointers or memory
allocation/deallocation mechanisms!) instead of translating Common
Lisp to C, etc.

However, you have to visit the LISPPA site to see it. :-)

Regards,
Alexander
From: Rahul Jain
Subject: Re: LISPPA
Date: 
Message-ID: <87zn9a2jmf.fsf@nyct.net>
··@cable.netlux.org (Alexander Baranovsky) writes:

> I just wanted to fit well-known imperative programming languages for
> symbolic computations, and LISPPA is a solution. It uses a simple
> concept (the polymorphic array)

Is that supposed to be a fancy term for the Lisp heap?

> as the basis of its data representation, and the LISPPA algorithms
> for working with dynamic data structures (lists, trees and more) are
> very laconic and uniform. At the same time, LISPPA keeps the main
> advantages of imperative programming over the functional approach:
> effectiveness (random array access, strict typing, etc.).

Oh no. Have I been fooled all these years into using a language that has
both of those features and calling it "Lisp" when it's really
"not-a-Lisp"?

> Using LISPPA, you might use the C language directly (writing laconic
> and effective programs without any pointers or memory
> allocation/deallocation mechanisms!) instead of translating Common
> Lisp to C, etc.

Oh. In that case, I don't even need to use the C language in the first
place. :)

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404182142.194cd9a5@posting.google.com>
Words, words, words...

Have you ever wondered why Microsoft or Borland never created an
implementation of Lisp? Why Windows is written in C++, not in Common
Lisp? Why imperative languages cover a much bigger part of the
software market than Lisp clones?

I know only one answer: the imperative approach and languages such as
Pascal, Basic, C or Java are more attractive to the majority of
programmers.

You will answer me: "Success is not a good argument. There are a lot
of data domains where the imperative approach is not applicable." My
answer is: LISPPA.

A.

From: Rahul Jain
Subject: Re: LISPPA
Date: 
Message-ID: <87hdv0f6w0.fsf@nyct.net>
··@cable.netlux.org (Alexander Baranovsky) writes:

> Words, words, words...
>
> Have you ever wondered why Microsoft or Borland never created an
> implementation of Lisp? Why Windows is written in C++, not in Common
> Lisp? Why imperative languages cover a much bigger part of the
> software market than Lisp clones?

Because the set of imperative languages includes Lisps. Basic set theory.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405020328.39a7cff@posting.google.com>
> Because the set of imperative languages includes Lisps. Basic set theory.

Fuzzy set theory.

A.

From: Rahul Jain
Subject: Re: LISPPA
Date: 
Message-ID: <8765beergz.fsf@nyct.net>
··@cable.netlux.org (Alexander Baranovsky) writes:

>> Because the set of imperative languages includes Lisps. Basic set theory.
>
> Fuzzy set theory.

OK, you surely are a lost cause. Have fun in fantasy^Wnightmare-land.

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405030642.19e3302b@posting.google.com>
> >> Because the set of imperative languages includes Lisps. Basic set theory.
> >
> > Fuzzy set theory.
> 
> OK, you surely are a lost cause. Have fun in fantasy^Wnightmare-land.

Alas. You have tried to sum up the discussion, but you have shown
nothing except a lack of humor and of good breeding.

My remark was about the word "includes". The Lisps cannot be a match
for such imperative languages as C, Basic or Pascal in view of
effectivity of programming, compile-time type checking, clean
Algol-like syntax and readability, huge number of libraries, big
number of IDEs, huge number of applications, huge amount of
investments and huge number of users.

Does it mean I consider Lisp a bad or useless language? No, of
course not. For example, I like the idea of deriving a language from a
minimal set of base concepts, I like the idea of convergence between
the imperative, logical and functional platforms, and finally I would
like to see a "new PL/1" some day :-) But why do Lisp programmers
think that the imperative platform and such languages as C, Basic or
Pascal are an unsuitable base for such convergence? Indeed, there are
no big problems in fitting the mentioned imperative languages for
symbolic computations. On the other hand, any declarative model of
computations can easily be implemented as a set of classes written in
the language. A lot of programmers could be interested in adding AI
functionality to existing applications as an extra module in C, Basic
or Pascal, and new effective solutions will be developed in the
framework of component-based architecture...

A.

From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040503151247.GC25328@mapcar.org>
On Mon, May 03, 2004 at 07:42:24AM -0700, Alexander Baranovsky wrote:
> Does it mean I consider Lisp a bad or useless language? No, of
> course not. For example, I like the idea of deriving a language from a
> minimal set of base concepts, I like the idea of convergence between
> the imperative, logical and functional platforms, and finally I would
> like to see a "new PL/1" some day :-)

You consistently demonstrate a complete lack of understanding of what
Lisp is, and what Lisp is about.  This paragraph sums it up.  That is
why Lisp programmers do not take you seriously.

Maybe if you were to attempt to learn Lisp, the *real* Lisp and not your
imaginary Lisp, people would listen when you critique it.

> But why do Lisp programmers think that the imperative platform and
> such languages as C, Basic or Pascal are an unsuitable base for such
> convergence? Indeed, there are no big problems in fitting the
> mentioned imperative languages for symbolic computations.

Because none of these languages even has first-class functions and
closures.  That, in my opinion, is far more useful and much harder to
replace than lists.  All the libraries in the world don't make up for
the fact that you are effectively[1] hamstrung when using these
languages.


[1] That is using 'effective,' as a word, correctly!

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87smehmbii.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

> My remark was about the word "includes". The Lisps cannot be
> a match for such imperative languages as C, Basic or Pascal in view of
> effectivity of programming,

What do you mean by "effectivity"?

>                             compile-time type checking,

Compile-time type checking is nice. But, at least when
comparing Lisp with languages like C, Pascal and Basic,
it just isn't worth the pain. The small gain in safety
and convenience is much, much more than offset by the
verbosity, the inconvenience whenever anything needs
to be changed, and the near-impossibility of interactive
use. With the time dynamic typing saves me, I can put
more care into checking the safety of my code in other
ways. The bugs that my C compiler catches for me are
the trivial, shallow ones that I'd have caught myself
anyway. The ones it doesn't catch are the subtle ones
that are easier to find when I'm not wasting neurons
on the hassle of static typing.

Now, if you were offering something like ML or Haskell
where you hardly ever need to declare types explicitly
because the inferencer is so clever, you'd have a point.
But you aren't.

>                                                         clean
> Algol-like syntax and readability,

The syntax of Lisp is much cleaner than that of any of
your preferred languages. It is, indeed, not Algol-like,
and apparently that's a problem for you. That's fine;
it's a problem for many people. But please understand
that *your* difficulty with the syntax of Lisp is not
*our* problem. "Readability" is almost entirely a
consequence of familiarity. "Lisp is less readable"
can be paraphrased as "I haven't used Lisp so much".
Boring.

>                                     huge number of libraries,

Yes, C wins there; maybe Basic too, if you mean Microsoft
Visual Basic. I'm not so sure about Pascal, but maybe you're
right.

>                                                               big
> number of IDEs,

Why would that be an advantage? One is enough. Perhaps zero
is enough.

>                 huge number of applications,

The number of applications written in a particular language
may (debatably) be some sort of *measure* of its quality,
but it isn't itself an advantage or a disadvantage of the
language.

>                                              huge amount of
> investments

Ah yes, money-oriented programming. The new paradigm.

>             and huge number of users.

If you feel unsafe when you're doing something that millions
of other people aren't also doing, then by all means go and
use Visual Basic.

> Does it mean I consider Lisp a bad or useless language? No, of
> course not. For example, I like the idea of deriving a language from
> a minimal set of base concepts,

Common Lisp is no more "derived from a minimal set of base
concepts" than Pascal or C.

>                             But why do Lisp programmers think that
> the imperative platform and such languages as C, Basic or Pascal are
> an unsuitable base for such convergence? Indeed, there are no big
> problems in fitting the mentioned imperative languages for symbolic
> computations.

It's got nothing much to do with "symbolic computations";
the large majority of stuff done in Common Lisp doesn't
take the form of "symbolic computations". It happens that
Lisp is particularly good at such things, but there are
lots of other things it's particularly good at.

Nor is our preference for Lisp entirely a matter of
wanting "convergence" between imperative, functional
and logic programming.

C, Basic and Pascal are inferior for "such convergence"
because

  - without lexical closures they are unusable for
    functional programming;

  - making a sharp distinction between expressions
    and statements, and forbidding lots of useful
    stuff to occur in the former, they are painful
    for functional programming;

  - lacking automatic memory management, they are
    difficult to use effectively in any style that
    isn't explicit about the sequence of operations,
    which makes functional programming (and many
    other things) very unpleasant.
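
To make the first of those points concrete, here is a minimal Common
Lisp sketch (MAKE-ADDER is a name invented for this example):

,----
| ;; MAKE-ADDER returns a function that captures N lexically.
| (defun make-adder (n)
|   (lambda (x) (+ x n)))
| 
| ;; The returned function remembers its own N:
| (funcall (make-adder 5) 10)  ; => 15
`----

Nothing comparable can be written directly in C, Pascal or Basic.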

I'm not particularly interested in "logic programming",
neat though it is, so I shan't comment on the deficiencies
of your preferred languages as substrates for logic
programming beyond saying that the reason why Lisp is
usable for logic programming is its capacity for syntactic
abstraction, which is another conspicuous lack in the
languages you prefer.

They have other deficiencies compared to Common Lisp,
which aren't a matter of being unsuitable for "convergence"
or unfit for "symbolic computation".

Please try to understand the following: There are more
reasons for using Common Lisp than you think there are.
(There are doubtless more reasons for using C, or Fortran,
or Haskell, or 6502 assembly language, than you think,
too.) You have somehow got the idea that (1) Lisp is all
about "symbolic computation" for AI, and (2) the only
thing that makes Lisp good for "symbolic computation"
or for AI is its support for heterogeneous lists. Neither
of those things is true. So your repeated statements that
"LISPPA" makes C, Pascal and Basic suitable replacements
for Lisp by enabling them to do "symbolic computations"
completely miss the point, and would completely miss
the point even if it were true that Pascal+LISPPA is just
as good for symbolic computation as Lisp. Which it isn't.

I dare say that adding heterogeneous arrays to Pascal is
an improvement, just as you say it is. That's nice. Well
done. But please leave Lisp out of it, because what you've
done doesn't have anything to do with Lisp other than
the name.

>               On the other hand, any declarative model of computations
> can easily be implemented as a set of classes written in the language.

(So you aren't talking about C or Pascal, then, but about some
extensions of them with classes like C++ or Delphi.)

Any "declarative model of computations" can be easily
implemented by a set of assembly-language routines and
suitable conventions for calling them. It'll hurt, though.
Likewise for "declarative programming" in any language
that's much like C, Basic or Pascal.

> A lot of programmers could be interested in adding AI functionality
> to existing applications as an extra module in C, Basic or Pascal,
> and new effective solutions will be developed in the framework of
> component-based architecture...

"Component-based architecture"? Man, that's so 20th-century.
It's all web applications and XML nowadays. You're on the
wrong bandwagon.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405040229.26ca5e7@posting.google.com>
>> My remark was about the word "includes". The Lisps cannot be a match
>> for such imperative languages as C, Basic or Pascal in view of
>> effectivity of programming,

> What do you mean by "effectivity"?

The speed of execution and the minimization of memory consumption.

>>                             compile-time type checking,

>Compile-time type checking is nice. But, at least when
>comparing Lisp with languages like C, Pascal and Basic,
>it just isn't worth the pain. The small gain in safety
>and convenience is much, much more than offset by the
>verbosity, the inconvenience whenever anything needs
>to be changed, and the near-impossibility of interactive
>use. With the time dynamic typing saves me, I can put
>more care into checking the safety of my code in other
>ways. The bugs that my C compiler catches for me are
>the trivial, shallow ones that I'd have caught myself
>anyway. 

Any Pascal programmer will testify that compile-time error checking
is a huge advantage. Not only type checking but also the explicit
declaration of variables allows you to avoid many problems related to
scope, lifetime, etc. These "trivial" bugs are dangerous.

I can accept that Lisp tries to compensate for the lack of
compile-time error checking by means of other tools, but please do
not protest against the evidence: if the majority of bugs can be
fixed before execution, it is an advantage.

>                                                         clean
> Algol-like syntax and readability,

>The syntax of Lisp is much cleaner than that of any of
>your preferred languages. It is, indeed, not Algol-like,
>and apparently that's a problem for you. That's fine;
>it's a problem for many people. But please understand
>that *your* difficulty with the syntax of Lisp is not
>*our* problem. "Readability" is almost entirely a
>consequence of familiarity. "Lisp is less readable"
>can be paraphrased as "I haven't used Lisp so much".
>Boring.

:-)

Open any book devoted to algorithms and data structures, or better:
open the book "AI: Structures and Strategies for Complex Problem
Solving" by George Luger. You will find the algorithms written in
pseudo-Pascal, not in Lisp. Open the book "Symbolic Logic and
Mechanical Theorem Proving" by Chang and Lee and compare the source
code presented in Appendix A with the translation of that code
presented at my site...

I understand you. If somebody has been writing programs in assembler
for 25 years, he will tell you that *your* difficulty with the syntax
of assembler is not *his* problem. :-) No, I am not comparing Lisp
and assembler. I am talking about the tradition in the representation
of algorithms.

>> big number of IDEs,

>Why would that be an advantage? One is enough. Perhaps zero
>is enough.

New protest against evidence :-) Please understand, my position is
absolutely non-aggressive, unlike that of some Lisp programmers in
this thread. I do not mean you; I respect your professional approach
to the discussion. But why do you sometimes try to discuss obvious
things? :-)
If you have visual programming tools, a nice editor and an advanced
debugger, why is that not important?

>>  huge amount of investments

> Ah yes, money-oriented programming. The new paradigm.

:-)

This is one side of the reality, and it reflects the general
recognition of the mentioned languages.

>  and huge number of users.

>If you feel unsafe when you're doing something that millions
>of other people aren't also doing, then by all means go and
>use Visual Basic.

No, no. My intention is the contrary. I am trying to extend the
mentioned imperative languages with new tools which will increase
their applicability to a wide set of interesting and
science-intensive problems.

>> Does it mean I consider Lisp a bad or useless language? No, of
>> course not. For example, I like the idea of deriving a language from
>> a minimal set of base concepts,

> Common Lisp is no more "derived from a minimal set of base
> concepts" than Pascal or C.

My statement "I like the idea" was related to Lisp.

>Nor is our preference for Lisp entirely a matter of
>wanting "convergence" between imperative, functional
>and logic programming.

>C, Basic and Pascal are inferior for "such convergence"
>because
>
>  - without lexical closures they are unusable for
>    functional programming;

>  - making a sharp distinction between expressions
>    and statements, and forbidding lots of useful
>    stuff to occur in the former, they are painful
>    for functional programming;

This is true. The functional model is just a "point at infinity" for a
good imperative language. Alas, we will not be able to reduce
programming as a whole to the functional paradigm. But, for example,
if the assignment operator is "evil", the language should allow you to
restrict the use of assignments. This leads to the idea of representing
complex transformations and algorithms in a concise form which hides
the internal mechanisms of memory allocation, pointers and a big
number of "auxiliary assignments". Other ideas such as recursion,
higher-order functions, etc. can be considered properties of a "good"
imperative language.

>  - lacking automatic memory management, they are
>    difficult to use effectively in any style that
>    isn't explicit about the sequence of operations,
>    which makes functional programming (and many
>    other things) very unpleasant.

Many imperative languages support it: Java, C# and others. However, in
my opinion, a "good" language should support both automatic memory
management and the possibility of freeing memory explicitly (for
example, in long loops). Any garbage collector model has bottlenecks,
and a professional programmer should know how the garbage collector
works to write effective and safe programs.

>I'm not particularly interested in "logic programming",
>neat though it is, so I shan't comment on the deficiencies
>of your preferred languages as substrates for logic
>programming beyond saying that the reason why Lisp is
>usable for logic programming is its capacity for syntactic
>abstraction, which is another conspicuous lack in the
>languages you prefer.

I guess you mean the representation of terms. But an array is no less
suitable for that than a list. Compare

(f (g x y) a)

and

[f, [g, x, y], a]

I do not see big differences. Moreover, the random access that an
array provides can increase performance.
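
For what it is worth, both representations can be written down in
Common Lisp itself, which has vectors as well as lists; a minimal
sketch (the variable names here are mine):

,----
| ;; The same term, once as a list and once as a vector:
| (defparameter *term-list* '(f (g x y) a))
| (defparameter *term-vec*  #(f #(g x y) a))
| 
| (second *term-list*)  ; => (G X Y)   -- reached by walking conses
| (aref *term-vec* 1)   ; => #(G X Y)  -- reached by random access
`----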

>They have other deficiencies compared to Common Lisp,
>which aren't a matter of being unsuitable for "convergence"
>or unfit for "symbolic computation".

Maybe "yes", maybe "no". But please take a look at the theorem
proving program presented at the LISPPA site before drawing any
conclusions. You will see that programming the "kernel" of logic
programming is absolutely painless.

Btw, I wrote my first interpreter (for the VIRT language) 7 years ago.
It was badly written and the interpreter was slow. However, it
provided better performance than XLISP for theorem proving. Ok, XLISP
is not Common Lisp and VIRT is not Delphi. Even my present paxScript
is 8-10 times faster than VIRT. So your conclusions look premature.

>Please try to understand the following: There are more
>reasons for using Common Lisp than you think there are.
>(There are doubtless more reasons for using C, or Fortran,
>or Haskell, or 6502 assembly language, than you think,
>too.) You have somehow got the idea that (1) Lisp is all
>about "symbolic computation" for AI, and (2) the only
>thing that makes Lisp good for "symbolic computation"
>or for AI is its support for heterogeneous lists. Neither
>of those things is true. So your repeated statements that
>"LISPPA" makes C, Pascal and Basic suitable replacements
>for Lisp by enabling them to do "symbolic computations"
>completely miss the point, and would completely miss
>the point even if it were true that Pascal+LISPPA is just
>as good for symbolic computation as Lisp. Which it isn't.

Please understand me too :-) Attacking Lisp was never my goal
(however, that is not the same as considering Lisp a "sacred cow").
Lisp and the functional and logical languages are useful sources of
ideas from the point of view of an imperative programmer. If they are
more suitable for a class of problems at the present time, I try to
understand why and to find a "good replacement" in an imperative
language such as Pascal; I try to find other tools to solve the same
problems with the same success or even more effectively. My audience
is C, Pascal, Basic and Java programmers par excellence; LISPPA has
been designed for these languages. But personally, it is interesting
for me to know the reaction of Lisp programmers as well.

>I dare say that adding heterogeneous arrays to Pascal is
>an improvement, just as you say it is. That's nice. Well
>done. But please leave Lisp out of it, because what you've
>done doesn't have anything to do with Lisp other than
>the name.

:-)

I'm sorry about the name. Maybe it is just this point that irritates
Lisp programmers so much, and "processing of dynamic data structures
based on polymorphic arrays" would have been better and less painful
:-) But I thought that the "array vs. list" consideration could be
interesting for a wide area of symbolic computations and AI
applications. Indeed, the term

(f (g x y) p(a)) 

can be considered both as a list of lists and as an array of arrays.
This duality is reflected in the "LISPPA" name.

>> On the other hand, any declarative model of computations can easily
>> be implemented as a set of classes written in the language.

>(So you aren't talking about C or Pascal, then, but about some
>extensions of them with classes like C++ or Delphi.)
>
>Any "declarative model of computations" can be easily
>implemented by a set of assembly-language routines and
>suitable conventions for calling them. It'll hurt, though.
>Likewise for "declarative programming" in any language
>that's much like C, Basic or Pascal.

We are discussing tools for a painless, native implementation.

>> A lot of programmers could be interested in adding AI functionality
>> to existing applications as an extra module in C, Basic or Pascal,
>> and new effective solutions will be developed in the framework of
>> component-based architecture...

>"Component-based architecture"? Man, that's so 20th-century.
>It's all web applications and XML nowadays. You're on the
>wrong bandwagon.

I never heard that component-based architecture had been abolished.
:-)

A.


From: Russell McManus
Subject: Re: LISPPA
Date: 
Message-ID: <87oep41iel.fsf@thelonious.dyndns.org>
··@cable.netlux.org (Alexander Baranovsky) writes:

> Open any book devoted to algorithms and data structures, or better:
> open the book "AI: Structures and Strategies for Complex Problem
> Solving" by George Luger. You will find the algorithms written in
> pseudo-Pascal, not in Lisp. Open the book "Symbolic Logic and
> Mechanical Theorem Proving" by Chang and Lee and compare the source
> code presented in Appendix A with the translation of that code
> presented at my site...

I just opened one called "Paradigms of Artificial Intelligence
Programming", by Peter Norvig.  In this book, he shows how to build a
logic programming compiler in a few hundred lines of Lisp.  This is
the application area under discussion, if I remember correctly.

I also happen to have a copy of "Structure and Interpretation of
Computer Programs".  In chapter 2 of this book, the authors show how
to build a symbolic math package that can handle polynomial
operations, all in a couple hundred lines of Scheme.  I could go on.
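
For a flavour of the kind of code those books contain, here is a
minimal symbolic-differentiation sketch in Common Lisp (a toy of my
own for illustration, not code from either book):

,----
| ;; d/dVAR of a term built from numbers, symbols, + and *.
| (defun deriv (expr var)
|   (cond ((numberp expr) 0)
|         ((symbolp expr) (if (eq expr var) 1 0))
|         ((eq (first expr) '+)
|          (list '+ (deriv (second expr) var)
|                   (deriv (third expr) var)))
|         ((eq (first expr) '*)
|          (list '+ (list '* (deriv (second expr) var) (third expr))
|                   (list '* (second expr) (deriv (third expr) var))))))
| 
| (deriv '(* x x) 'x)  ; => (+ (* 1 X) (* X 1))
`----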

Bzzzt. Try again, thanks for playing.

-russ
From: Tayssir John Gabbour
Subject: Re: LISPPA
Date: 
Message-ID: <866764be.0405040822.22815c81@posting.google.com>
··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<···························@posting.google.com>...
> I can accept that Lisp tries to compensate for the lack of compile-time
> error checking by means of other tools, but please do not protest
> against the evidence: if the majority of bugs can be fixed before
> execution, it is an advantage.

Incidentally, you might want to look at the test-coverage tools here,
which in a few pages instrument code to assess what parts of it were
exercised by a test suite.
http://www.merl.com/papers/TR91-04/

You weren't looking for test cases, but it's just an example of how
programmers can build automatic correctness tools without having to
leave the Lisp environment. (That is, no external tool is needed.)

In fact, given Lisp's known power in machine reasoning and logic,
especially analyzing code with data-processing techniques, it is sad
that people hold correctness-checking over Lisp's head. One should ask
the designers of other languages why they keep users from creating
their own correctness tools.


Also, check out compiler macros for generating high-performance code
based on statically available information. Lisp is a "Telescoping
Language": the programmer can tell the compiler how to optimize
special cases of the code she writes. For example, any compiler can
optimize (* 3600 a 365) into (* 1314000 a) so that less arithmetic is
done at runtime, because the compiler writer taught it arithmetic
reduction. However, can it do the same for all of your functions?
Lisp allows programmers to tell the compiler how to optimize special
cases.
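
A minimal sketch of that idea (SECONDS-IN is a function invented for
this example):

,----
| (defun seconds-in (days) (* 3600 24 days))
| 
| ;; Tell the compiler: when DAYS is a compile-time constant,
| ;; fold the whole multiplication into a literal number.
| (define-compiler-macro seconds-in (&whole form days)
|   (if (constantp days)
|       (* 3600 24 (eval days))
|       form))
| 
| ;; (seconds-in 365) now compiles to the constant 31536000.
`----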
From: Dave Pearson
Subject: Re: LISPPA
Date: 
Message-ID: <slrnc9evhl.9t4.davep.news@hagbard.davep.org>
* Alexander Baranovsky <··@cable.netlux.org>:

> Any Pascal programmer will testify that compile-time error checking is
> a huge advantage. Not only type checking but also the explicit
> declaration of variables allows you to avoid many problems related to
> scope, lifetime, etc. These "trivial" bugs are dangerous.

My "day job" is writing Pascal code (I write a lot of stuff in Borland
Delphi), I also write Common Lisp code for recreation and sometimes to solve
little local problems in my development environment. As such I think I
qualify as "any Pascal programmer". I'm reading the above and wondering what
it is, exactly, when you've got Pascal and Common Lisp in mind, that you're
thinking about.

I can't say I've personally noticed that the Pascal compiler I use
really aids in helping me remove possible problems from my code; in
my experience, the real problems that take up my time are problems
that get past the compiler (as your correspondent said in his
previous post). Moreover, the Common Lisp compiler I use most of the
time (SBCL) is terribly pedantic and throws up lots of information
about my code, much like my Pascal compiler can and does.

On top of this, from what I've seen and experienced, many of the killer bugs
in applications written in Pascal don't stem from incorrect types but stem
from things like failure to free allocated memory or attempting to use an
object that has been freed, that sort of thing.

> I can accept that Lisp tries to compensate for the lack of compile-time
> error checking by means of other tools, but please do not protest
> against the evidence: if the majority of bugs can be fixed before
> execution, it is an advantage.

This is one of the key differences between developing with Pascal and
developing with Common Lisp: in my experience, with Common Lisp, you don't
have to do a compile and then execute on the whole body of code to test the
bit of code you're working on; you tend to test it there and then, as you're
writing it. As your correspondent pointed out this can and does seem to lead
to a different approach to development which is very much about getting the
code right as you develop it, an approach that is about worrying about and
dealing with non-trivial issues.

From what I've seen above and elsewhere you do seem to give the impression
that you're commenting on Common Lisp development without really having
experienced it. If you had you'd know that you can have compile-time
type-checking should you desire it. How else do you explain this:

,----
| CL-USER[1]> (compile-file "foo")
| ; compiling file "/home/davep/temp/foo.lisp" (written 04 MAY 2004 12:06:07 PM):
| ; recognizing DEFUN add
| ; compiling top level form: 
| ; compiling defun add: 
| ; compiling top level form: 
| ; compiling top level form: 
| 
| ; file: /home/davep/temp/foo.lisp
| ; in: add 23
| ;     (ADD 23 "forty two")
| ; 
| ; note: deleting unreachable code
| ; 
| ; caught warning:
| ;   Asserted type integer conflicts with derived type
| ;   (values (simple-base-string 9) &optional).
| 
| ; compilation unit finished
| ;   caught 1 WARNING condition
| ;   printed 1 note
| 
| ; /home/davep/temp/foo.fasl written
| ; compilation finished in 0:00:00
| 
| #P"/home/davep/temp/foo.fasl"
| t
| t
`----

when I compile this:

,----[ foo.lisp ]
| (defun add (n1 n2)
|   (declare (type integer n1)
|            (type integer n2))
|   (+ n1 n2))
| 
| (add 23 "forty two")
`----

To this Pascal programmer it doesn't seem terribly different. What
difference should this "Pascal programmer" be seeing when he hits the
compile key combination in his Common Lisp IDE?

-- 
Dave Pearson
http://www.davep.org/lisp/
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87d65jlpmg.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

>>> My remark was about the word "includes". The Lisps cannot be a
>>> match for such imperative languages as C, Basic or Pascal in view
>>> of effectivity of programming,
> 
>> What do you mean by "effectivity"?
> 
> The speed of execution and the minimization of memory consumption.

Most things you can do in C, Pascal or Basic you can also
do in Common Lisp with code that runs about as fast. Some
varieties of low-level bit diddling are still easier to
do efficiently in C. (Some are easier to do efficiently
in assembler.) The slowness of Lisp is mostly an urban
legend.
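
As a rough illustration of "about as fast" (SUM-TO is a made-up
example; the declarations are standard Common Lisp, and the sketch
assumes the sum fits in a fixnum):

,----
| ;; With declarations, a good CL compiler emits unboxed fixnum
| ;; arithmetic comparable to the equivalent C loop.
| (defun sum-to (n)
|   (declare (type fixnum n) (optimize (speed 3)))
|   (let ((acc 0))
|     (declare (type fixnum acc))
|     (dotimes (i n acc)
|       (setf acc (the fixnum (+ acc i))))))
`----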

It's true that Lisp programs are usually more memory-hungry
than programs written in C or Pascal. As against that, they
usually have considerably fewer memory-related bugs. Memories
are so big these days that memory use only matters much for
large programs, and it's not clear that *large* Lisp programs
are much more memory-hungry than *large* programs in weaker
languages.

>> Compile-time type checking is nice. But, at least when
>> comparing Lisp with languages like C, Pascal and Basic,
>> it just isn't worth the pain. The small gain in safety
>> and convenience is much, much more than offset by the
>> verbosity, the inconvenience whenever anything needs
>> to be changed, and the near-impossibility of interactive
>> use. With the time dynamic typing saves me, I can put
>> more care into checking the safety of my code in other
>> ways. The bugs that my C compiler catches for me are
>> the trivial, shallow ones that I'd have caught myself
>> anyway. 
> 
> Any Pascal programmer will testify that compile-time error
> checking is a huge advantage. Not only type checking but also the
> explicit declaration of variables allows you to avoid many problems
> related to scope, lifetime, etc. These "trivial" bugs are dangerous.

"Any Pascal programmer"? The huge majority of Pascal programmers
are not in a position to have an educated opinion on whether
static type checking and explicit type declarations make for
better programs than dynamic type checking and type inference,
because the huge majority of Pascal programmers have never used
any language with dynamic type checking and type inference for
anything other than trivial tasks. (I suspect most of them
have *never* used one at all.)

Making variable scopes and lifetimes clear has nothing to do
with making their types explicit. Common Lisp does scoping
just as cleanly as any of the languages you prefer, even
when you're not declaring any types at all. Why on earth
should it be necessary to declare the type of a variable
in order to establish its scope?

> I can accept that Lisp tries to compensate for the lack of compile-time
> error checking by means of other tools, but please do not protest
> against the evidence: if the majority of bugs can be fixed before
> execution, it is an advantage.

You have given no evidence. As for your "majority of bugs":
my experience is as follows. When I am writing in C or C++,
I commit a substantial number of trivial errors that the
compiler catches. Some of them have to do with the types
of variables. I make these errors, the compiler tells me off,
I fix them. When I am writing in Common Lisp, I do not make
nearly so many of these trivial errors.

One reason for this is that many of the errors are *consequences*
of the type system. Many others are consequences of other
misfeatures of C or C++. It is not a merit in a language that
it creates ways of going wrong and then warns you when you
fall into them.

Another reason, I think, is that when writing in C or C++
there is so much irrelevant nonsense that you have to be
thinking about that your brain has less attention available
for thinking about other things, like all the bugs you're
putting into your code.

>>                                                         clean
>> Algol-like syntax and readability,
> 
>> The syntax of Lisp is much cleaner than that of any of
>> your preferred languages. It is, indeed, not Algol-like,
>> and apparently that's a problem for you. That's fine;
>> it's a problem for many people. But please understand
>> that *your* difficulty with the syntax of Lisp is not
>> *our* problem. "Readability" is almost entirely a
>> consequence of familiarity. "Lisp is less readable"
>> can be paraphrased as "I haven't used Lisp so much".
>> Boring.
> 
> :-)
> 
> Open any book devoted to algorithms and data structures, or better:
> open the book "AI: Structures and Strategies for Complex Problem
> Solving" by George Luger. You will find the algorithms written in
> pseudo-Pascal, not in Lisp. Open the book "Symbolic Logic and
> Mechanical Theorem Proving" by Chang and Lee and compare the source
> code presented in Appendix A with the translation of that code
> presented at my site...

As someone has already pointed out, there are books on
AI and books on algorithms-and-data-structures-and-all-that
that use Lisp syntax. The most famous of all "algorithms
and data structures and all that" books would be Knuth's
three volumes, which use ... an assembly language of his
own invention.

(I have three AI books ready to hand. One uses Lisp notation
for its algorithms, one uses pseudo-Pascal, and one doesn't
have any formally stated algorithms at all.)

> I understand you. If somebody has been writing programs in assembler
> for 25 years, he will tell you that *your* difficulty with the syntax
> of assembler is not *his* problem. :-)

Right. You could ask: How long did it take before you
could read this stuff fluently? and compare the answers
for different languages. That's still sensitive to what
languages people learn first. I predict that the difference
between Lisp and Pascal will be small, and that both will
be far ahead of most assembly languages.

>>> big number of IDEs,
> 
>> Why would that be an advantage? One is enough. Perhaps zero
>> is enough.
> 
> New protest against evidence :-)

What does that mean?

>                                  Please understand, my position is
> absolutely non-aggressive, unlike that of some Lisp programmers in
> this thread. I do not mean you; I respect your professional approach
> to the discussion. But why do you sometimes try to discuss obvious
> things? :-)

I would prefer to ask: Why do you sometimes think things are
obvious when they are obviously false? :-)

> If you have visual programming tools, a nice editor and an advanced
> debugger, why is that not important?

The question I asked was: why is it an advantage to have many
different IDEs? Actually, I'll concede that maybe having a
couple of decent ones is good, because people's tastes vary.
(Though the advantages of consistency might outweigh that,
if there were just one but a very good one.)

"Perhaps zero is enough" wasn't an entirely serious statement,
but there *are* plenty of people who find fancy graphical IDEs
less productive than, say, Emacs and the usual collection of
Unixy stuff. If by "visual programming" you mean the sort of
thing VB does for you, that's very valuable when you're building
GUIs but much less useful for everything else one does when
programming. Yes, a good debugger is very useful. It doesn't
need to be graphical. Yes, a good editor is very useful. It
doesn't need to be part of an IDE.

I've seen people be very productive indeed in Emacs. I've
seen people be very productive in Visual Studio, too.

By the way, Lisp does have IDEs. All the commercial CL
implementations have them. There's also an excellent
(though unfortunately not yet entirely mature) Emacs-based
Lisp development environment, with the charming name
of SLIME.

>>   and huge number of users.
> 
>> If you feel unsafe when you're doing something that millions
>> of other people aren't also doing, then by all means go and
>> use Visual Basic.
> 
> No, no. My intention is the contrary. I am trying to extend the
> mentioned imperative languages with new tools which will increase
> their applicability to a wide set of interesting and science-intensive
> problems.

Fine. I shan't try to stop you :-).

>>> Does it mean I consider Lisp a bad or useless language? No, of
>>> course not. For example, I like the idea of deriving a language
>>> from a minimal set of base concepts,
>
>> Common Lisp is no more "derived from a minimal set of base
>> concepts" than Pascal or C.
> 
> My statement "I like the idea" was related to Lisp.

I don't understand; sorry.

>> C, Basic and Pascal are inferior for "such convergence"
>> because
>> 
>>   - without lexical closures they are unusable for
>>     functional programming;
>>  
>>   - making a sharp distinction between expressions
>>     and statements, and forbidding lots of useful
>>     stuff to occur in the former, they are painful
>>     for functional programming;
> 
> This is true. The functional model is just a "point at infinity" for a
> good imperative language. Alas, we will not be able to reduce
> programming as a whole to the functional paradigm.

You say "alas"; I say "thank goodness". I want to have
functional programming *available*. I don't want to be
forced to use it all the time. Sometimes a loop is just
a loop.

>                                                    But, for example,
> if the assignment operator is "evil", the language should allow you
> to restrict the use of assignments.

The assignment operator is not evil. Unless you mean
specifically C++'s assignment operator, which conflates
assignment and copying in a brain-damaging way. That's
evil, all right.

>                                 This leads to the idea of representing
> complex transformations and algorithms in a concise form which hides
> the internal mechanisms of memory allocation, pointers and a big
> number of "auxiliary assignments". Other ideas such as recursion,
> higher-order functions, etc. can be considered properties of a "good"
> imperative language.

All of which are poorly supported by C, Pascal and Basic.
(Except for recursion, which is passably supported by all
modern languages.)

> >  - lacking automatic memory management, they are
> >    difficult to use effectively in any style that
> >    isn't explicit about the sequence of operations,
> >    which makes functional programming (and many
> >    other things) very unpleasant.
> 
> Many imperative languages support it: Java, C# and others.

Yes. But you referred specifically to C, Pascal and Basic.
Those languages do not have automatic memory management.
(Actually, maybe Visual Basic does.)

>                                                           However, in
> my opinion, a "good" language should support both automatic memory
> management and the possibility of freeing memory explicitly (for
> example, in long loops). Any garbage collector model has bottlenecks,
> and a professional programmer should know how the garbage collector
> works to write effective and safe programs.

It is very rare for the GC to be a bottleneck. But I do quite
like languages like Modula-3 where automatic and manual memory
management can coexist happily.

>> I'm not particularly interested in "logic programming",
>> neat though it is, so I shan't comment on the deficiencies
>> of your preferred languages as substrates for logic
>> programming beyond saying that the reason why Lisp is
>> usable for logic programming is its capacity for syntactic
>> abstraction, which is another conspicuous lack in the
>> languages you prefer.
> 
> I guess you mean the representation of terms. But an array is no less
> suitable for that than a list. Compare
> 
> (f (g x y) a)
> 
> and
> 
> [f, [g, x, y], a]
> 
> I do not see big differences. Moreover, the random access that an
> array provides can increase performance.

No, I do not just mean the representation of terms. I mean
that when you have a language that's basically designed for
one sort of programming and you try to use it for another,
some things tend to be expressed in ugly ways. For instance,
if you try to do object-oriented programming in plain old C
you have to build your own method tables and write explicit
code to use them. This ends up being verbose and hard to read,
in comparison with languages that have syntactic support for OO
such as Objective-C and C++. (And that's just about the only
time you'll hear me saying that C++ is easier to read than
something else.)

Similarly, if you try to do logic programming in, say, Pascal
then things that Prolog has special notation for -- adding
facts to the database, making queries -- needs to be done
in a more explicit way, which again will be verbose and hard
to read.

The same would be just as true in Common Lisp, except that
Common Lisp has powerful features for syntactic abstraction --
macros and a configurable reader -- so that you can write
logic programs in Common Lisp that are nearly as clear and
concise as they would be in Prolog. You can find some nice
examples in Norvig's excellent book "Paradigms of artificial
intelligence programming".
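
To give a flavour, here is a toy sketch of such syntactic abstraction
(nothing like Norvig's full Prolog implementation; *DB* and <- are
names invented here):

,----
| (defvar *db* '())
| 
| ;; <- adds a fact to the database with no quoting noise at the
| ;; call site, Prolog-style.
| (defmacro <- (&rest fact)
|   `(push ',fact *db*))
| 
| ;; (<- likes kim robin)
| ;; (<- likes sandy lee)
| ;; *db* => ((LIKES SANDY LEE) (LIKES KIM ROBIN))
`----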

As for the performance of arrays versus linked lists -- you
*do* know that Common Lisp has arrays too, right?

>> They have other deficiencies compared to Common Lisp,
>> which aren't a matter of being unsuitable for "convergence"
>> or unfit for "symbolic computation".
> 
> Maybe "yes", maybe "no". But please take a look at the theorem
> proving program presented at the LISPPA site before drawing any
> conclusions. You will see that programming the "kernel" of logic
> programming is absolutely painless.

It doesn't look "absolutely painless" to me, in comparison
with doing such things in Lisp.

> Btw, I wrote my first interpreter (for the VIRT language) 7 years ago.
> It was badly written and the interpreter was slow. However, it
> provided better performance than XLISP for theorem proving. Ok, XLISP
> is not Common Lisp and VIRT is not Delphi. Even my present paxScript
> is 8-10 times faster than VIRT. So your conclusions look premature.

I have no idea what conclusions you think look premature
because 7 years ago you wrote something that gave you
better performance for theorem proving than something
else.

>> Please try to understand the following: There are more
>> reasons for using Common Lisp than you think there are.
>> (There are doubtless more reasons for using C, or Fortran,
>> or Haskell, or 6502 assembly language, than you think,
>> too.) You have somehow got the idea that (1) Lisp is all
>> about "symbolic computation" for AI, and (2) the only
>> thing that makes Lisp good for "symbolic computation"
>> or for AI is its support for heterogeneous lists. Neither
>> of those things is true. So your repeated statements that
>> "LISPPA" makes C, Pascal and Basic suitable replacements
>> for Lisp by enabling them to do "symbolic computations"
>> completely miss the point, and would completely miss
>> the point even if it were true that Pascal+LISPPA is just
>> as good for symbolic computation as Lisp. Which it isn't.
> 
> Please understand me too :-) Attacking Lisp was never my goal
> (though that is not the same as treating Lisp as a "sacred cow").

No, I realise. But ... you came into comp.lang.lisp, saying
"look at my wonderful new thing, which gives all the power
of Lisp to C, Pascal and Basic". And, actually, that *is*
an attack on Lisp, because it's saying that Lisp doesn't
offer anything valuable beyond Pascal + heterogeneous arrays.
Which is not only false but laughably false.

> Lisp and the functional and logic languages are useful sources of ideas
> from an imperative programmer's point of view. Where they are currently
> more suitable for a class of problems, I am trying to understand why and
> to find a "good replacement" in an imperative language such as Pascal --
> other tools that solve the same problems with the same success or even
> more effectively. My audience is above all C, Pascal, Basic and Java
> programmers; LISPPA has been designed for these languages. But
> personally it is interesting for me to know the reaction of Lisp
> programmers as well.

It looks to me as if you have conflicting requirements.
On the one hand, you want something that's completely
familiar to Pascal programmers, something that's just
Pascal with a few little bags on the side. On the other
hand, you want something that's just as good as Lisp
for solving hard problems. I'm sorry to be the bringer
of bad news, but I don't think you can have both at once.

>> I dare say that adding heterogeneous arrays to Pascal is
>> an improvement, just as you say it is. That's nice. Well
>> done. But please leave Lisp out of it, because what you've
>> done doesn't have anything to do with Lisp other than
>> the name.
> 
> :-)
> 
> I'm sorry about the name. Maybe it is just this point that irritates
> Lisp programmers so much, and "processing of dynamic data structures
> based on polymorphic arrays" would be better and less painful :-)

I'm not irritated by the name. I'm irritated only by the fact
that you seem to consider that this shallow resemblance to
Lisp makes your languages just as effective as Lisp for the
things Lisp does well. Believe me, it doesn't.

> But I thought that the "array vs. list" question could be interesting
> for a wide area of symbolic computations and AI applications. Indeed,
> the term
> 
> (f (g x y) p(a)) 
> 
> can be considered both as a list of lists and as an array of arrays.
> This duality is reflected in the "LISPPA" name.

No one is disputing that you can use heterogeneous arrays
when writing programs that do symbolic computation. Why
would they?

>>> On the other hand, any declarative model of
>>> computations can be easily implemented as a set of classes
>>> written in the language.
...
>> Any "declarative model of computations" can be easily
>> implemented by a set of assembly-language routines and
>> suitable conventions for calling them. It'll hurt, though.
>> Likewise for "declarative programming" in any language
>> that's much like C, Basic or Pascal.
> 
> We are discussing tools for a painless, native implementation. 

"Painless" is like "readable": it's relative to what
you're used to. I expect that in the early days of
assembly language programming, *that* was advertised
as a way to write complicated programs painlessly. In
fact, it did remove a lot of the pain, but then new
kinds of pain were discovered. So then along comes
FORTRAN, which lets you write your scientific software
painlessly, using real mathematical notation. Um,
except that it turns out that there are other sources
of pain too even when you write in FORTRAN. So it goes:
every advance in programming technology seems like it
might mean an end to pain, but in fact it just means
that you get to discover some new (and hopefully less
awful) kinds of pain.

I do not accept that "any declarative model of computations
can be easily implemented as a set of classes written in
the language". It may be true for people who don't feel
pain at having to write similar boilerplate code over and
over again, but I am not one of those people. It may be
true for people who don't mind code that's twice the length
it should be, but I am not one of those people. It may be
true for people who don't mind having the important material
in their software obscured by syntactic irrelevancies, but
I am not one of those people.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405051024.71959d23@posting.google.com>
> >>> My remark has been related to the word "includes". The Lisps
> >>> cannot be a match for such imperative languages as C, Basic or
> >>> Pascal in view of effectivity of programming,
>  
> >> What do you mean by "effectivity"?
> > 
> > The speed of execution and the minimization of expenditure on
> > memory resources.
> 
> Most things you can do in C, Pascal or Basic you can also
> do in Common Lisp with code that runs about as fast. Some
> varieties of low-level bit diddling are still easier to
> do efficiently in C. (Some are easier to do efficiently
> in assembler.) The slowness of Lisp is mostly an urban
> legend.

I suppose every legend is based on a true story :-). C and Pascal have
no legends regarding that, so a reference to some benchmarks would be
nice.

> 
> It's true that Lisp programs are usually more memory-hungry
> than programs written in C or Pascal. As against that, they
> usually have considerably fewer memory-related bugs. Memories
> are so big these days that memory use only matters much for
> large programs, and it's not clear that *large* Lisp programs
> are much more memory-hungry than *large* programs in weaker
> languages.

There are a few tools which allow one to decrease the footprint of a
program in "weak languages": iteration instead of recursion and "local"
memory deallocation (at least). "Local" memory deallocation can
include:

- manual deallocation
- deallocation tied to a variable's lifetime
- "semantic" deallocation (for example, reduced assignments in
LISPPA)

I do not see why the use of the last two leads to less clear
programs.

> 
> >> Compile-time type checking is nice. But, at least when
> >> comparing Lisp with languages like C, Pascal and Basic,
> >> it just isn't worth the pain. The small gain in safety
> >> and convenience is much, much more than offset by the
> >> verbosity, the inconvenience whenever anything needs
> >> to be changed, and the near-impossibility of interactive
> >> use. With the time dynamic typing saves me, I can put
> >> more care into checking the safety of my code in other
> >> ways. The bugs that my C compiler catches for me are
> >> the trivial, shallow ones that I'd have caught myself
> >> anyway. 
> > 
> > Any Pascal programmer will testify that compile-time error checking
> > is a huge advantage. Not only type checking but also explicit
> > declaration of variables allows you to avoid many problems related to
> > scope, lifetime, etc. These "trivial" bugs are dangerous.
> 
> "Any Pascal programmer"? The huge majority of Pascal programmers
> are not in a position to have an educated opinion on whether
> static type checking and explicit type declarations make for
> better programs than dynamic type checking and type inference,
> because the huge majority of Pascal programmers have never used
> any language with dynamic type checking and type inference for
> anything other than trivial tasks. (I suspect most of them
> have *never* used one at all.)

That seems too categorical to be true. I can assure you: Pascal has
dynamic type checking -- take a look at variant types. As for type
inference, any Pascal compiler does it at compile time. And, it
seems we already discussed it, please do not think about "average"
Pascal programmers very bad. Sorry that I need to repeat it.

> 
> Making variable scopes and lifetimes clear has nothing to do
> with making their types explicit. 

I never said that.

> Common Lisp does scoping
> just as cleanly as any of the languages you prefer, even
> when you're not declaring any types at all. Why on earth
> should it be necessary to declare the type of a variable
> in order to establish its scope?

??? See above: (I never said that).

> 
> > I can accept that Lisp tries to compensate for the lack of compile-time
> > error checking by means of other tools, but please do not protest
> > against the evidence: if the majority of bugs can be fixed before
> > execution, it is an advantage.
> 
> You have given no evidence. As for your "majority of bugs":
> my experience is as follows. When I am writing in C or C++,
> I commit a substantial number of trivial errors that the
> compiler catches. Some of them have to do with the types
> of variables. I make these errors, the compiler tells me off,
> I fix them. When I am writing in Common Lisp, I do not make
> nearly so many of these trivial errors.

:-) 

> 
> One reason for this is that many of the errors are *consequences*
> of the type system. Many others are consequences of other
> misfeatures of C or C++. It is not a merit in a language that
> it creates ways of going wrong and then warns you when you
> fall into them.
> 
> Another reason, I think, is that when writing in C or C++
> there is so much irrelevant nonsense that you have to be
> thinking about that your brain has less attention available
> for thinking about other things, like all the bugs you're
> putting into your code.

Honestly, I do not have much experience with C/C++. My favorite language
is Pascal (Delphi). I have been programming in Pascal since 1990; my
previous language was FORTRAN IV. So my first impression was the
power of compile-time error checking. If I have a clear understanding of
what I have to do, the rest is simple: the compiler finds my
errors like Word finds my misprints. No "irrelevant nonsense". (Yes, my
"Word" does not help me to write "poems in functional style", but it
is not an obstacle to writing other "poems" :-)

> 
> >>                                                         clean
> >> Algol-like syntax and readability,
>  
> >> The syntax of Lisp is much cleaner than that of any of
> >> your preferred languages. It is, indeed, not Algol-like,
> >> and apparently that's a problem for you. That's fine;
> >> it's a problem for many people. But please understand
> >> that *your* difficulty with the syntax of Lisp is not
> >> *our* problem. "Readability" is almost entirely a
> >> consequence of familiarity. "Lisp is less readable"
> >> can be paraphrased as "I haven't used Lisp so much".
> >> Boring.
>  
> > :-)
> > 
> > Open any book devoted to algorithmes and data structures, or better:
> > open book "AI.Structures and Strategies for Complex Problem Solving"
> > by George Luger. You will find algorithmes written in pseudo-pascal,
> > not in Lisp. Open book "Symbolic logic and mechanical theorem proving"
> > by Chang and Lee and compare source code presented in Appendix A with
> > the translation of the code presented at my site...
> 
> As someone has already pointed out, there are books on
> AI and books on algorithms-and-data-structures-and-all-that
> that use Lisp syntax. The most famous of all "algorithms
> and data structures and all that" books would be Knuth's
> three volumes, which use ... an assembly language of his
> own invention.

Knuth's trilogy is cool, but I had in mind another classic book: "The
Design and Analysis of Computer Algorithms" (Aho, Hopcroft, Ullman).
It uses pseudo-Pascal. Besides, I like "Combinatorial Algorithms:
Theory and Practice" (Reingold, Nievergelt and Deo), "Combinatorics
for Programmers" (Lipski), "Principles of Software Engineering and
Design" (Zelkowitz, Shaw, Gannon). All these books use pseudo-Pascal.
I did not mention "Algorithms and Data Structures" (Wirth) as it is a
"party" book. :-)

> >>> big number of IDEs,
>  
> >> Why would that be an advantage? One is enough. Perhaps zero
> >> is enough.
> > 
> > New protest against evidence :-)
> 
> What does that mean?

It means protesting against bread and butter :-)

> 
> > Please understand, my position is absolutely non-aggressive, unlike
> > that of some Lisp programmers in this thread. I do not mean you; I
> > respect your professional approach to the discussion. But why do you
> > sometimes try to discuss obvious things? :-)
> 
> I would prefer to ask: Why do you sometimes think things are
> obvious when they are obviously false? :-)

Please do not say "they are"; please say "in my opinion, they are".
Categorical statements make a discussion less interesting (in my opinion
:-)

> 
> > If you have visual programming tools, a nice editor, an advanced
> > debugger: why is that not important?
> 
> The question I asked was: why is it an advantage to have many
> different IDEs? Actually, I'll concede that maybe having a
> couple of decent ones is good, because people's tastes vary.
> (Though the advantages of consistency might outweigh that,
> if there were just one but a very good one.)
> 
> "Perhaps zero is enough" wasn't an entirely serious statement,
> but there *are* plenty of people who find fancy graphical IDEs
> less productive than, say, Emacs and the usual collection of
> Unixy stuff. If by "visual programming" you mean the sort of
> thing VB does for you, that's very valuable when you're building
> GUIs but much less useful for everything else one does when
> programming. 

I'm constantly trying to find out what is the "everything else one"?
Ok, I can guess: it is a sort of non-visual programming. :-) Moreover,
perhaps it is a sort of very difficult programming which compels me to
forget about a nice editor, an advanced debugger, third-party libraries.
But what I cannot understand is: how can such a difficult problem exist
as a "thing in itself", without any need to integrate it with a DBMS,
the internet, etc.? A good IDE allows you to solve these
"input/deployment" problems easily.

> Yes, a good debugger is very useful. It doesn't
> need to be graphical. Yes, a good editor is very useful. It
> doesn't need to be part of an IDE.

:-)

> 
> >>   and huge number of users.
>  
> >> If you feel unsafe when you're doing something that millions
> >> of other people aren't also doing, then by all means go and
> >> use Visual Basic.
> > 
> > No, no. My intention is the contrary. I'm trying to extend the
> > mentioned imperative languages with new tools which will increase
> > their applicability to a wide set of interesting and
> > science-intensive problems.
> 
> Fine. I shan't try to stop you :-).

Thank you :-)

> 
> >>> Does it mean I consider Lisp a bad or useless language? No, of
> >>> course not. For example, I like the idea of deriving a language
> >>> from a minimal set of base concepts,
>  
> >> Common Lisp is no more "derived from a minimal set of base
> >> concepts" than Pascal or C.
> > 
> > My statement "I like the idea" referred to Lisp.
> 
> I don't understand; sorry.

CAR, CDR, CONS, EQ, ATOM.

> 
> >> C, Basic and Pascal are inferior for "such convergence"
> >> because
> >> 
> >>   - without lexical closures they are unusable for
> >>     functional programming;
> >>  
> >>   - making a sharp distinction between expressions
> >>     and statements, and forbidding lots of useful
> >>     stuff to occur in the former, they are painful
> >>     for functional programming;
> > 
> > This is true. The functional model is just a "point at infinity" for a
> > good imperative language. Alas, we will not be able to reduce
> > programming as a whole to the functional paradigm.
> 
> You say "alas"; I say "thank goodness". I want to have
> functional programming *available*. I don't want to be
> forced to use it all the time. Sometimes a loop is just
> a loop.

I absolutely agree here. The functional approach should be used
"locally" and intelligently. The "alas" was someone else's "alas", not mine.

> 
> > But, for example, if the assignment operator is "evil", the language
> > should allow you to restrict the use of assignments.
> 
> The assignment operator is not evil. Unless you mean
> specifically C++'s assignment operator, which conflates
> assignment and copying in a brain-damaging way. That's
> evil, all right.

Too many auxiliary assignments is "evil" (in my opinion). :-) The
problem is a bit similar to the "goto" problem. The "goto" still exists,
but the battle against "goto" was not useless: we got structured
programming as a result. The battle against auxiliary assignments leads
to a "functional approach in the small". In particular, LISPPA
illustrates this tendency.

> 
> > It leads to the idea of representing complex transformations and
> > algorithms in a concise form which hides the internal mechanisms of
> > memory allocation, pointers and a large number of "auxiliary
> > assignments". Other ideas such as recursion, higher-order functions,
> > etc. can be considered properties of a "good" imperative language.
> 
> All of which are poorly supported by C, Pascal and Basic.
> (Except for recursion, which is passably supported by all
> modern languages.)

Higher-order functions are available in Pascal via procedural types.

> >> I'm not particularly interested in "logic programming",
> >> neat though it is, so I shan't comment on the deficiencies
> >> of your preferred languages as substrates for logic
> >> programming beyond saying that the reason why Lisp is
> >> usable for logic programming is its capacity for syntactic
> >> abstraction, which is another conspicuous lack in the
> >> languages you prefer.
> > 
> > I guess you mean the representation of terms. But an array is no less
> > suitable for that than a list. Compare
> > 
> > (f (g x y) a)
> > 
> > and
> > 
> > [f, [g, x, y], a]
> > 
> > I do not see a big difference. Moreover, the random access that an
> > array provides can improve performance.
> 
> No, I do not just mean the representation of terms. I mean
> that when you have a language that's basically designed for
> one sort of programming and you try to use it for another,
> some things tend to be expressed in ugly ways. For instance,
> if you try to do object-oriented programming in plain old C
> you have to build your own method tables and write explicit
> code to use them. This ends up being verbose and hard to read,
> in comparison with languages that have syntactic support for OO
> such as Objective-C and C++. (And that's just about the only
> time you'll hear me saying that C++ is easier to read than
> something else.)
> 
> Similarly, if you try to do logic programming in, say, Pascal
> then things that Prolog has special notation for -- adding
> facts to the database, making queries -- need to be done
> in a more explicit way, which again will be verbose and hard
> to read.

We already discussed the representation of facts, terms, etc. in Lisp
and in Pascal + LISPPA. There are no big differences (please see above).
A query can be expressed as a term, so again no problem. Moreover,
Pascal can provide a more effective representation of terms than
Lisp. Once again, please see the theorem-proving program in paxPascal
at the LISPPA site.

I mean representing a term as "term" + "parameters".

Let's consider the term T

 (f(x, y), g(x))

You see, the variable x appears twice. To replace x with z (this is
part of a theorem-proving algorithm), you have to walk the whole tree
in Lisp.

In paxPascal you can represent the term T as

(f(p1, p2), g(p1)), where p1 is an alias of x and p2 is an alias of y,
so (x, y) represents the parameters of T.

To rename the variables of T, you have only to rename the parameters.
So you have a linear algorithm instead of an exponential one. This
representation also simplifies and speeds up the unification algorithm
-- the heart of any logic programming system.

> 
> The same would be just as true in Common Lisp, except that
> Common Lisp has powerful features for syntactic abstraction --
> macros and a configurable reader -- so that you can write
> logic programs in Common Lisp that are nearly as clear and
> concise as they would be in Prolog. You can find some nice
> examples in Norvig's excellent book "Paradigms of artificial
> intelligence programming".

Sorry, I do not have this book. It would be really interesting for me
to read it.

> 
> As for the performance of arrays versus linked lists -- you
> *do* know that Common Lisp has arrays too, right?

Yes, I know. But Common Lisp uses linked lists for the data
representation in the mentioned problem. As for the arrays for linear
algebra and related problems, I would prefer to use Pascal, not Common
Lisp.

> 
> >> They have other deficiencies compared to Common Lisp,
> >> which aren't a matter of being unsuitable for "convergence"
> >> or unfit for "symbolic computation".
> > 
> > Maybe "yes", maybe "no". But please take a look at the theorem-proving
> > program presented at the LISPPA site before drawing any conclusion. You
> > will see that programming the "kernel" of logic programming is
> > absolutely painless.
> 
> It doesn't look "absolutely painless" to me, in comparison
> with doing such things in Lisp.

Thanks for the "to me" :-)

> 
> > Btw, I wrote my first interpreter (for the VIRT language) 7 years ago.
> > It was badly written and slow. However, it provided better performance
> > than XLISP for theorem proving. OK, XLISP is not Common Lisp and VIRT
> > is not Delphi. Even my present paxScript is 8-10 times faster than
> > VIRT. So your conclusions look premature.
> 
> I have no idea what conclusions you think look premature
> because 7 years ago you wrote something that gave you
> better performance for theorem proving than something
> else.

We are discussing the applicability of Pascal+LISPPA for the simulation
of logic programming. You state it is not applicable. I brought up
VIRT, an imperative language with Modula-2 syntax, as the first attempt
in this direction, and mentioned that its first results in theorem
proving were not very bad.

> 
> >> Please try to understand the following: There are more
> >> reasons for using Common Lisp than you think there are.
> >> (There are doubtless more reasons for using C, or Fortran,
> >> or Haskell, or 6502 assembly language, than you think,
> >> too.) You have somehow got the idea that (1) Lisp is all
> >> about "symbolic computation" for AI, and (2) the only
> >> thing that makes Lisp good for "symbolic computation"
> >> or for AI is its support for heterogeneous lists. Neither
> >> of those things is true. So your repeated statements that
> >> "LISPPA" makes C, Pascal and Basic suitable replacements
> >> for Lisp by enabling them to do "symbolic computations"
> >> completely miss the point, and would completely miss
> >> the point even if it were true that Pascal+LISPPA is just
> >> as good for symbolic computation as Lisp. Which it isn't.
> > 
> > Please understand me too :-) Attacking Lisp was never my goal
> > (though that is not the same as treating Lisp as a "sacred cow").
> 
> No, I realise. But ... you came into comp.lang.lisp, saying
> "look at my wonderful new thing, which gives all the power
> of Lisp to C, Pascal and Basic". And, actually, that *is*
> an attack on Lisp, because it's saying that Lisp doesn't
> offer anything valuable beyond Pascal + heterogeneous arrays.

Oh, you have "discovered" me. :-) I really have thought about
competition. But that is not the same as saying "forget Lisp; your 40
years of history are one big mistake". I repeat: I'm trying to reproduce
useful features of Lisp and other languages in my experiment. Why do you
see only "attack"? I want to learn and understand, not fight.

> Which is not only false but laughably false.

Please add "in my opinion". :-)

> 
> > Lisp and the functional and logic languages are useful sources of ideas
> > from an imperative programmer's point of view. Where they are currently
> > more suitable for a class of problems, I am trying to understand why and
> > to find a "good replacement" in an imperative language such as Pascal --
> > other tools that solve the same problems with the same success or even
> > more effectively. My audience is above all C, Pascal, Basic and Java
> > programmers; LISPPA has been designed for these languages. But
> > personally it is interesting for me to know the reaction of Lisp
> > programmers as well.
> 
> It looks to me as if you have conflicting requirements.
> On the one hand, you want something that's completely
> familiar to Pascal programmers, something that's just
> Pascal with a few little bags on the side. On the other
> hand, you want something that's just as good as Lisp
> for solving hard problems. I'm sorry to be the bringer
> of bad news, but I don't think you can have both at once.

Let's see. Why not? 

> 
> >> I dare say that adding heterogeneous arrays to Pascal is
> >> an improvement, just as you say it is. That's nice. Well
> >> done. But please leave Lisp out of it, because what you've
> >> done doesn't have anything to do with Lisp other than
> >> the name.
>  
> > :-)
> > 
> > I'm sorry about the name. Maybe it is just this point that irritates
> > Lisp programmers so much, and "processing of dynamic data structures
> > based on polymorphic arrays" would be better and less painful :-)
> 
> I'm not irritated by the name. I'm irritated only by the fact
> that you seem to consider that this shallow resemblance to
> Lisp makes your languages just as effective as Lisp for the
> things Lisp does well. Believe me, it doesn't.

I am not considering all possible problems which "Lisp does well". But
I do state that for many problems Pascal+LISPPA (Basic+LISPPA, etc.) can
be used with the same success or even better. These problems include:
knowledge representation, search algorithms, symbolic computations, and
interpreters for logic programming.

> 
> > But I thought that the "array vs. list" question could be interesting
> > for a wide area of symbolic computations and AI applications. Indeed,
> > the term
> > 
> > (f (g x y) p(a)) 
> > 
> > can be considered both as a list of lists and as an array of arrays.
> > This duality is reflected in the "LISPPA" name.
> 
> No one is disputing that you can use heterogeneous arrays
> when writing programs that do symbolic computation. Why
> would they?
> 

Sorry, I did not understand your question.

> >>> On the other hand, any declarative model of
> >>> computations can be easily implemented as a set of classes
> >>> written in the language.
>  ...
> >> Any "declarative model of computations" can be easily
> >> implemented by a set of assembly-language routines and
> >> suitable conventions for calling them. It'll hurt, though.
> >> Likewise for "declarative programming" in any language
> >> that's much like C, Basic or Pascal.
> > 
> > We are discussing tools for a painless, native implementation. 
> 
> "Painless" is like "readable": it's relative to what
> you're used to. I expect that in the early days of
> assembly language programming, *that* was advertised
> as a way to write complicated programs painlessly. In
> fact, it did remove a lot of the pain, but then new
> kinds of pain were discovered. So then along comes
> FORTRAN, which lets you write your scientific software
> painlessly, using real mathematical notation. Um,
> except that it turns out that there are other sources
> of pain too even when you write in FORTRAN. So it goes:
> every advance in programming technology seems like it
> might mean an end to pain, but in fact it just means
> that you get to discover some new (and hopefully less
> awful) kinds of pain.
> 
> I do not accept that "any declarative model of computations
> can be easily implemented as a set of classes written in
> the language". It may be true for people who don't feel
> pain at having to write similar boilerplate code over and
> over again, but I am not one of those people. 

Ok.

> It may be
> true for people who don't mind code that's twice the length
> it should be, but I am not one of those people. 

Ok.

> It may be
> true for people who don't mind having the important material
> in their software obscured by syntactic irrelevancies, but
> I am not one of those people.

Ok.

Recruitment is not my goal at comp.lang.lisp. 

Besides, you are not quite right in your review. 

In my opinion.

:-)

A.
From: Cameron MacKinnon
Subject: Re: LISPPA
Date: 
Message-ID: <QdCdnSjYeJJBpATdRVn-gw@golden.net>
Alexander Baranovsky wrote:
> Recruitment is not my goal at comp.lang.lisp. 

OK, I'll bite. What IS your goal here? It's as if you walked into a 
Ferrari owners' group advertising FerrariPA, a set of fiberglass body 
panels that turns any Fiero or Yugo into something superficially 
resembling a Ferrari. And you're getting about the same reaction. Surprised?

-- 
Cameron MacKinnon
Toronto, Canada
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405070550.5df99bc8@posting.google.com>
Cameron MacKinnon <··········@clearspot.net> wrote in message news:<······················@golden.net>...
> Alexander Baranovsky wrote:
> > Recruitment is not my goal at comp.lang.lisp. 
> 
> OK, I'll bite. What IS your goal here? It's as if you walked into a 
> Ferrari owners' group advertising FerrariPA, a set of fiberglass body 
> panels that turns any Fiero or Yugo into something superficially 
> resembling a Ferrari. And you're getting about the same reaction. Surprised?

I never supposed that comp.lang.lisp is just a "fan-club". And it is
not. I like a few messages in this thread.

A.
From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040507141158.GQ25328@mapcar.org>
On Fri, May 07, 2004 at 06:50:38AM -0700, Alexander Baranovsky wrote:
> Cameron MacKinnon <··········@clearspot.net> wrote in message news:<······················@golden.net>...
> > Alexander Baranovsky wrote:
> > > Recruitment is not my goal at comp.lang.lisp. 
> > 
> > OK, I'll bite. What IS your goal here? It's as if you walked into a 
> > Ferrari owners' group advertising FerrariPA, a set of fiberglass body 
> > panels that turns any Fiero or Yugo into something superficially 
> > resembling a Ferrari. And you're getting about the same reaction. Surprised?
> 
> I never supposed that comp.lang.lisp is just a "fan-club". And it is
> not. I like a few messages in this thread.

You missed his point.  He is comparing your LISPPA to a shitty car like
a Yugo with a fake Ferrari appearance.  If you tried to claim that this
Yugo is better than a Ferrari, you would get laughed at by any group of
people, fan-club or not.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405071345.4077a86d@posting.google.com>
Matthew Danish <·······@andrew.cmu.edu> wrote in message news:<······················@mapcar.org>...
> On Fri, May 07, 2004 at 06:50:38AM -0700, Alexander Baranovsky wrote:
> > Cameron MacKinnon <··········@clearspot.net> wrote in message news:<······················@golden.net>...
> > > Alexander Baranovsky wrote:
> > > > Recruitment is not my goal at comp.lang.lisp. 
> > > 
> > > OK, I'll bite. What IS your goal here? It's as if you walked into a 
> > > Ferrari owners' group advertising FerrariPA, a set of fiberglass body 
> > > panels that turns any Fiero or Yugo into something superficially 
> > > resembling a Ferrari. And you're getting about the same reaction. Surprised?
> > 
> > I never supposed that comp.lang.lisp is just a "fan-club". And it is
> > not. I like a few messages in this thread.
> 
> You missed his point.  He is comparing your LISPPA to a shitty car like
> a Yugo with a fake Ferrari appearance.  If you tried to claim that this
> Yugo is better than a Ferrari, you would get laughed at by any group of
> people, fan-club or not.

Thanks for your comments. As this is not your first comment of this
kind in the thread, I would like to become acquainted with you.

Let me introduce myself. I'm the author of the VIRT programming
language, an Object Pascal interpreter (PasScript), a JavaScript
interpreter (DXJavaScript), and the paxScript scripting engine.
Well-known software companies are among my customers.

But who are you? I do not know.

A.
From: Tayssir John Gabbour
Subject: Re: LISPPA
Date: 
Message-ID: <866764be.0405051909.d54c668@posting.google.com>
··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<····························@posting.google.com>...
> Knuth's trilogy is cool, but I had in mind another classic book: "The
> Design and Analysis of Computer Algorithms" (Aho, Hopcroft, Ullman).
> It uses pseudo-Pascal.

I LOVE how you mention that! "Formal declarations of data types are
avoided as much as possible. The data type of a variable and its scope
should be evident either from its name or from its context."
-- The Design and Analysis of Computer Algorithms, p.34

And it's "Pidgin Algol."

10 POINTS.
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405070542.73469965@posting.google.com>
> And it's "Pidgin Algol."

Yes, you are right. But we are discussing the prevalence of Algol-like
syntax over Lisp-like syntax in the presentation of algorithms, so it
does not much matter which Algol descendant is used.

> "Formal declarations of data types are
> avoided as much as possible. The data type of a variable and its scope
> should be evident either from its name or from its context."
> -- The Design and Analysis of Computer Algorithms, p.34

Just a remark to make it more precise: the authors do not give
recommendations for real programming languages in the cited statement.
They are describing features of Pidgin Algol as the tool for
representing algorithms in their book.

As for the "real world", we already had I, J, K, L, M, N in FORTRAN to
denote integer variables without having to declare them explicitly.
Nobody wants to go back :-)

A.
From: Tayssir John Gabbour
Subject: Re: LISPPA
Date: 
Message-ID: <866764be.0405071024.55f91c11@posting.google.com>
··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<····························@posting.google.com>...
> > "Formal declarations of data types are
> > avoided as much as possible. The data type of a variable and its scope
> > should be evident either from its name or from its context."
> > -- The Design and Analysis of Computer Algorithms, p.34
> 
> Just a remark to make it more precise: the authors do not give
> recommendations for real programming languages in the cited statement.
> They are describing features of Pidgin Algol as the tool for
> representing algorithms in their book.

Yes, they had to mutate a language, because static variable typing
obscures algorithms.


> As for the "real world", we already had I, J, K, L, M, N in FORTRAN to
> denote integer variables without having to declare them explicitly.
> Nobody wants to go back :-)

Neither Lisp nor the book uses names like that, except for the
occasional array index, which is idiomatic in all languages.


(I hope you do not interpret my terseness as rudeness. Just that,
perhaps wrongly, I sense you are more interested in debating, despite
what you have claimed to Gareth. Perhaps you've been put on the
defensive in this newsgroup, I do not know; sorry if this is the case.
But you are training me to debate with you, and drop all possible
points of fuzziness which are open to creative interpretation.)
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405071414.21c0cffa@posting.google.com>
···········@yahoo.com (Tayssir John Gabbour) wrote in message news:<····························@posting.google.com>...
> ··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<····························@posting.google.com>...
> > > "Formal declarations of data types are
> > > avoided as much as possible. The data type of a variable and its scope
> > > should be evident either from its name or from its context."
> > > -- The Design and Analysis of Computer Algorithms, p.34
> > 
> > Just a remark to make it more precise: the authors do not give
> > recommendations for real programming languages in the cited statement.
> > They are describing features of Pidgin Algol as the tool for
> > representing algorithms in their book.
> 
> Yes, they had to mutate a language, because static variable typing
> obscures algorithms.
> 
> 
> > As for the "real world", we already had I, J, K, L, M, N in FORTRAN to
> > denote integer variables without having to declare them explicitly.
> > Nobody wants to go back :-)
> 
> Neither Lisp nor the book uses names like that, except for the
> occasional array index, which is idiomatic in all languages.
> 
> 
> (I hope you do not interpret my terseness as rudeness. Just that,
> perhaps wrongly, I sense you are more interested in debating, despite
> what you have claimed to Gareth. Perhaps you've been put on the
> defensive in this newsgroup, I do not know; sorry if this is the case.
> But you are training me to debate with you, and drop all possible
> points of fuzziness which are open to creative interpretation.)

Please do not think that. I respect your opinion (I mean your previous
messages in this thread too).

A.
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <8765bajq59.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

>> Most things you can do in C, Pascal or Basic you can also
>> do in Common Lisp with code that runs about as fast. Some
>> varieties of low-level bit diddling are still easier to
>> do efficiently in C. (Some are easier to do efficiently
>> in assembler.) The slowness of Lisp is mostly an urban
>> legend.
> 
> I suppose every legend is based on a true story :-).

What an extraordinary supposition.

> C and Pascal have no legends regarding that, so a reference to some
> benchmarks would be nice.

I have no idea what you mean by "C and Pascal have no legends
regarding that". There was a recent discussion in comp.lang.lisp
concerning a benchmark called something like "almabench". It
was a floating-point-intensive program that calculated some
astronomical thing or other. The original version was in
Fortran or C or some such language. Someone translated it
to Common Lisp, and various people in comp.lang.lisp played
with it a bit, and after a modest amount of tuning we had
it running *faster* than the C version compiled with gcc.
None of the tuning was illegitimate -- it didn't involve
things like simplifying the formulas or avoiding repeated
computations. Google Groups has all the details. Google
is your friend.

I don't know of much serious benchmarking work addressing
the question of how fast you can make Lisp programs that
do the same as C or Pascal or Basic ones, if you really
need to make them fast. My usual experience is that Lisp
programs run plenty fast enough without needing to take
any trouble to make them run faster. (This is also my
experience with C and C++, but not with Perl or Python.)
It's quite unusual for it to matter enough that I'd even
notice if there were, say, a factor of 3 one way or the
other.
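
For what it's worth, the kind of tuning I mean is mostly a
matter of telling the compiler the types and asking for speed.
A small sketch -- not code from the almabench thread, just an
illustration:

(defun sum-squares (v)
  (declare (optimize (speed 3))
           (type (simple-array double-float (*)) v))
  (let ((s 0d0))
    (declare (type double-float s))
    ;; With the declarations above, this compiles to a tight
    ;; unboxed loop in the better implementations.
    (dotimes (i (length v) s)
      (incf s (* (aref v i) (aref v i))))))

Given that much information, a decent CL compiler produces
much the same inner loop as the obvious C or Pascal version.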

>> It's true that Lisp programs are usually more memory-hungry
>> than programs written in C or Pascal. As against that, they
>> usually have considerably fewer memory-related bugs. Memories
>> are so big these days that memory use only matters much for
>> large programs, and it's not clear that *large* Lisp programs
>> are much more memory-hungry than *large* programs in weaker
>> languages.
> 
> There are a few tools which allow one to decrease the footprint of a
> program in "weak languages": iteration instead of recursion and "local"
> memory deallocation (at least). "Local" memory deallocation can
> include:
> 
> - manual deallocation
> - deallocation tied to a variable's lifetime
> - "semantic" deallocation (for example, reduced assignments in
> LISPPA)
> 
> I do not see why the use of the last two leads to less clear
> programs.

I don't recall saying that it does. I think the ability
to make an object get deallocated as soon as it goes
out of scope is a useful one, though I'm not sure it's
useful enough that I'd rather have it than Lisp's
guarantee that dangling pointers can't happen.

Manual deallocation certainly does make things less clear.
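
Incidentally, Common Lisp does let you *promise* the compiler
that an object won't outlive its scope, and the implementation
may then stack-allocate it. A minimal example (the declaration
is a hint, and implementations are free to ignore it):

(defun sum-of-ones (n)
  (let ((tmp (make-list n :initial-element 1)))
    (declare (dynamic-extent tmp))
    ;; TMP may be allocated on the stack and is reclaimed
    ;; as soon as SUM-OF-ONES returns.
    (reduce #'+ tmp)))

If you break the promise and let TMP escape, the consequences
are on your own head -- exactly as with Pascal pointers.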

>>>> Compile-time type checking is nice. But, at least when
>>>> comparing Lisp with languages like C, Pascal and Basic,
>>>> it just isn't worth the pain. The small gain in safety
>>>> and convenience is much, much more than offset by the
>>>> verbosity, the inconvenience whenever anything needs
>>>> to be changed, and the near-impossibility of interactive
>>>> use. With the time dynamic typing saves me, I can put
>>>> more care into checking the safety of my code in other
>>>> ways. The bugs that my C compiler catches for me are
>>>> the trivial, shallow ones that I'd have caught myself
>>>> anyway. 
>>> 
>>> Any Pascal programmer will testify that compile-time error checking
>>> is a huge advantage. Not only type checking but also explicit
>>> declaration of variables allows you to avoid many problems related to
>>> scope, lifetime, etc. These "trivial" bugs are dangerous.
>> 
>> "Any Pascal programmer"? The huge majority of Pascal programmers
>> are not in a position to have an educated opinion on whether
>> static type checking and explicit type declarations make for
>> better programs than dynamic type checking and type inference,
>> because the huge majority of Pascal programmers have never used
>> any language with dynamic type checking and type inference for
>> anything other than trivial tasks. (I suspect most of them
>> have *never* used one at all.)
> 
> That seems too categorical to be true. I can assure you: Pascal has
> dynamic type checking -- take a look at variant types.

I'm not familiar with the provision for variant types in
Pascal. Or, more precisely, in some versions of Pascal;
I've definitely used versions without any such provision.
If it's the case that (1) a substantial fraction of Pascal
programmers are using versions of Pascal that support
variant types, and (2) those versions of Pascal support
variant types in a way that's transparent enough to count
as "the language having dynamic type checking", then indeed
I have erred and I retract my conjecture that most Pascal
programmers have never used any language with dynamic
type checking.

> As for type inference, any Pascal compiler does it at compile time.

If you believe that, then you do not understand what I mean
by "type inference". Type inference is where the compiler
works out the types of variables, the signatures of functions,
and so on, *without* you having to tell it explicitly. Many
Common Lisp implementations do a certain amount of type
inference, so as to optimize better. There are also statically
typed languages in which scarcely any explicit type declarations
are needed, because they have powerful type inference systems.
Haskell and ML are good examples.
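
For instance, given a single declaration on the arguments
(the function name is mine, purely for illustration):

(defun hypotenuse (x y)
  (declare (type double-float x y))
  ;; The compiler can infer that the products and the sum
  ;; are double-floats; nobody declared the intermediates.
  (sqrt (+ (* x x) (* y y))))

a compiler that does type inference works out the rest by
itself. In Haskell or ML even that one declaration could go,
and the complete signature would still be derived for you.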

>                                                           And, it
> seems we already discussed it, please do not think about "average"
> Pascal programmers very bad. Sorry that I need to repeat it.

I'm not sure I understand the sentence that ends "very bad".
If I *have* understood it right, then (1) I wasn't thinking
any such thing and (2) you have no right to tell me what I
may and may not think anyway.

> > Making variable scopes and lifetimes clear has nothing to do
> > with making their types explicit. 
> 
> I never said that.

Really? Then I conclude that when you said

  | Not only type checking but also explicit declaration of variables
  | allows you to avoid many problems related to scope, lifetime, etc.

you were using "explicit declaration of variables" not to mean
"explicit declaration of variables with types" but simply
"not letting variables spring into being with no declaration
whatever" or something like that. In which case I haven't
the slightest idea what that has to do with anything, since
"explicit declaration of variables" in that sense is not
a difference between C/Pascal/Basic and Lisp, and since
the context was that I was saying about how explicit type
declarations for variables aren't, on balance, worth what
they cost.

>> One reason for this is that many of the errors are *consequences*
>> of the type system. Many others are consequences of other
>> misfeatures of C or C++. It is not a merit in a language that
>> it creates ways of going wrong and then warns you when you
>> fall into them.
>> 
>> Another reason, I think, is that when writing in C or C++
>> there is so much irrelevant nonsense that you have to be
>> thinking about that your brain has less attention available
>> for thinking about other things, like all the bugs you're
>> putting into your code.
> 
> Honestly, I do not have much experience with C/C++. My favorite language
> is Pascal (Delphi). I have been programming in Pascal since 1990; my
> previous language was FORTRAN IV. So my first impression was the
> power of compile-time error checking. If I have a clear understanding of
> what I have to do, the rest is simple: the compiler finds my
> errors like Word finds my misprints. No "irrelevant nonsense". (Yes, my
> "Word" does not help me to write "poems in functional style", but it
> is not an obstacle to writing other "poems" :-)

Since you mention it, I find that Word's checking of spelling
and grammar is nothing but a nuisance. I am much more proficient
in English spelling and grammar than Word is. The only use I've
found for it is that it's a little easier to spot double spaces
when they're underlined in green. But then, double spaces shouldn't
result in double spacing anyway, if you see what I mean. But I
digress.

If all your bugs are ones that the compiler can find, then
you must either be immensely cleverer than I am or work on
easier problems than I do. (Didn't I find a bug in one of
your sample bits of LISPPA code? One that the compiler
couldn't have caught? Maybe I imagined that.)

I can't comment on how much irrelevant nonsense there is in
modern Pascal programming because I haven't written any Pascal
since about 1990. That would have been Borland's Turbo Pascal.
My experience of such older Pascal is that it was even worse
than C and C++ in the "irrelevant nonsense" department, but
it's long enough ago that I don't remember the details.

>>> Open any book devoted to algorithms and data structures, or better:
>>> open the book "AI: Structures and Strategies for Complex Problem
>>> Solving" by George Luger. You will find the algorithms written in
>>> pseudo-Pascal, not in Lisp. Open "Symbolic Logic and Mechanical
>>> Theorem Proving" by Chang and Lee and compare the source code presented
>>> in Appendix A with the translation of that code presented at my site...
>> 
>> As someone has already pointed out, there are books on
>> AI and books on algorithms-and-data-structures-and-all-that
>> that use Lisp syntax. The most famous of all "algorithms
>> and data structures and all that" books would be Knuth's
>> three volumes, which use ... an assembly language of his
>> own invention.
> 
> Knuth's trilogy is cool, but I had in mind another classic book: "The
> Design and Analysis of Computer Algorithms" (Aho, Hopcroft, Ullman).
> It uses pseudo-Pascal. Besides, I like "Combinatorial Algorithms:
> Theory and Practice" (Reingold, Nievergelt and Deo), "Combinatorics
> for Programmers" (Lipski), "Principles of Software Engineering and
> Design" (Zelkowitz, Shaw, Gannon). All these books use pseudo-Pascal.
> I did not mention "Algorithms and Data Structures" (Wirth) as it is a
> "party" book. :-)

Sure. There are plenty of books that use pseudocode in a
slightly Pascal-like style. (I suspect that its origins
actually derive from Algol and don't come via Pascal in
most cases. Not that this matters.)

> > >>> big number of IDEs,
> >  
> > >> Why would that be an advantage? One is enough. Perhaps zero
> > >> is enough.
> > > 
> > > New protest against evidence :-)
> > 
> > What does that mean?
> 
> It means protesting against bread and butter :-)

I am none the wiser.

>>> Please understand, my position is absolutely non-aggressive, unlike
>>> that of some Lisp programmers in this thread. I do not mean you; I
>>> respect your professional approach to the discussion. But why do you
>>> sometimes try to discuss obvious things? :-)
>> 
>> I would prefer to ask: Why do you sometimes think things are
>> obvious when they are obviously false? :-)
> 
> Please do not say "they are"; please say "in my opinion, they are".
> Categorical statements make a discussion less interesting (in my opinion
> :-)

I do beg your pardon. Not having seen the slightest sign of
"in my opinion" attached to any of the highly debatable (or,
in some cases, outright refutable) statements you've made,
I failed to realise that this was important for you.

>>> If you have visual programming tools, a nice editor, an advanced
>>> debugger: why is that not important?
...
>> "Perhaps zero is enough" wasn't an entirely serious statement,
>> but there *are* plenty of people who find fancy graphical IDEs
>> less productive than, say, Emacs and the usual collection of
>> Unixy stuff. If by "visual programming" you mean the sort of
>> thing VB does for you, that's very valuable when you're building
>> GUIs but much less useful for everything else one does when
>> programming. 
> 
> I'm constantly trying to find out what is the "everything else one"?

I think you have mis-parsed what I wrote. "Everything else"
includes, to take some random examples,
  - database access
  - graph-theoretic algorithms
  - multiple-precision arithmetic
  - grovelling through text files
  - network communication
  - neural networks
  - virtual machines
  - compilers
  - device drivers

> Ok, I can guess: it is a sort of non-visual programming. :-) Moreover,
> perhaps it is a sort of very difficult programming which compels me to
> forget about a nice editor, an advanced debugger, third-party libraries.

Er, no, and if you really think I said that then you need to
go back and read what I wrote, more carefully.

> But what I cannot understand is: how can such a difficult problem exist
> as a "thing in itself", without any need to integrate it with a DBMS,
> the internet, etc.? A good IDE allows you to solve these
> "input/deployment" problems easily.

Integrating your code with a DBMS or the internet doesn't
have anything much to do with having a good IDE.

>>>>> Does it mean I consider Lisp a bad or useless language? No, of
>>>>> course not. For example, I like the idea of deriving a language
>>>>> from a minimal set of base concepts,
>>>> 
>>>> Common Lisp is no more "derived from a minimal set of base
>>>> concepts" than Pascal or C.
>>> 
>>> My statement "I like the idea" referred to Lisp.
>> 
>> I don't understand; sorry.
> 
> CAR, CDR, CONS, EQ, ATOM.

Common Lisp is no more "derived from a minimal set of base
concepts" than Pascal or C. You must be confusing it with
the very tiny language McCarthy designed in 1959 for purely
academic purposes. That tiny language is indeed an ancestor
of Common Lisp (and of the other varieties of Lisp that
are in use today), but so what? Some boring unicellular
creature was my ancestor too, a billion years ago or thereabouts;
that doesn't tell you anything useful about me.

So if one of your reasons for liking "Lisp" is that you
think everything in it is derived from CAR, CDR, CONS, EQ
and ATOM, then the language you like is one with only a
superficial resemblance to the language I've been talking
about in our discussions here.

>>> But, for example, if the assignment operator is "evil", the language
>>> should allow you to restrict the use of assignments.
>> 
>> The assignment operator is not evil. Unless you mean
>> specifically C++'s assignment operator, which conflates
>> assignment and copying in a brain-damaging way. That's
>> evil, all right.
> 
> Too many auxiliary assignments is "evil" (in my opinion). :-)

"Too many" of anything is bad. That's what "too many" means.

> The problem is a bit similar to the "goto" problem. The "goto" still
> exists, but the battle against "goto" was not useless: we got
> structured programming as a result. The battle against auxiliary
> assignments leads to a "functional approach in the small". In
> particular, LISPPA illustrates this tendency.

I've not noticed any conspicuous lack of assignment
statements in your sample LISPPA programs. Maybe I
just don't know what you mean by "auxiliary".

> > > It leads to the idea of representing complex transformations and
> > > algorithms in a concise form which hides the internal mechanisms of
> > > memory allocation, pointers and a large number of "auxiliary
> > > assignments". Other ideas such as recursion, higher-order functions,
> > > etc. can be considered properties of a "good" imperative language.
> > 
> > All of which are poorly supported by C, Pascal and Basic.
> > (Except for recursion, which is passably supported by all
> > modern languages.)
> 
> Higher-order functions are available in Pascal via procedural types.

Yes, Pascal supports this better than C does. Not because
Pascal has "procedural types"; so does C, kind of. But because
Pascal permits nested procedures with access to the variables
of the enclosing procedures. So (I hope) you can get something
a little like closures. But it's much more cumbersome than
in Lisp, or in any functional language. (Note: Lisp is not
a "functional language", but a multi-paradigm language that
supports functional programming as well as other sorts.) I'm
afraid I still consider it "poor support".
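
For comparison, the Lisp version of the standard example is
just this:

(defun make-adder (n)
  ;; Return a function that remembers N.
  (lambda (x) (+ x n)))

(funcall (make-adder 3) 4) ; => 7

The returned function can outlive MAKE-ADDER itself, which is
precisely what Pascal's nested procedures cannot safely give
you: once the enclosing procedure returns, its frame is gone.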

>> No, I do not just mean the representation of terms. I mean
>> that when you have a language that's basically designed for
>> one sort of programming and you try to use it for another,
>> some things tend to be expressed in ugly ways. For instance,
>> if you try to do object-oriented programming in plain old C
>> you have to build your own method tables and write explicit
>> code to use them. This ends up being verbose and hard to read,
>> in comparison with languages that have syntactic support for OO
>> such as Objective-C and C++. (And that's just about the only
>> time you'll hear me saying that C++ is easier to read than
>> something else.)
>> 
>> Similarly, if you try to do logic programming in, say, Pascal
>> then things that Prolog has special notation for -- adding
>> facts to the database, making queries -- need to be done
>> in a more explicit way, which again will be verbose and hard
>> to read.
> 
> We already discussed the representation of facts, terms, etc. in Lisp
> and in Pascal + LISPPA. There are no big differences (please see above).
> A query can be expressed as a term, so again no problem.

I think you are confusing data structures with operations on
those data structures. In Prolog, a query is not only a
data structure; it is an instruction to the system to do
some work.

> Moreover, Pascal can provide a more effective representation of terms
> than Lisp. Once again, please see the theorem-proving program in
> paxPascal at the LISPPA site.

I see nothing whatsoever in that program that could possibly
be considered "more effective representation of terms".
 
> I mean representing a term as "term" + "parameters".
> 
> Let's consider the term T
> 
>  (f(x, y), g(x))
> 
> You see, the variable x appears twice. To replace x with z (this is
> part of a theorem-proving algorithm), you have to walk the whole tree
> in Lisp.
> 
> In paxPascal you can represent the term T as
> 
> (f(p1, p2), g(p1)), where p1 is an alias of x and p2 is an alias of y,
> so (x, y) represents the parameters of T.
> 
> To rename the variables of T, you have only to rename the parameters.
> So you have a linear algorithm instead of an exponential one. This
> representation also simplifies and speeds up the unification algorithm
> -- the heart of any logic programming system.

I cannot imagine what makes you think that you can't do
the same in Lisp. (You can.) I do not believe there is
anything in a logic programming system that takes linear
time when implemented in Pascal+LISPPA and that cannot
be cleanly implemented in Lisp so as to still take linear
time. (I don't doubt that you can write a bad implementation
in Lisp that takes exponential time. That isn't Lisp's
fault.)

You might want to find a copy of "Paradigms of Artificial
Intelligence Programming" and look at section 11.6, where
you will find an implementation of unification that looks
rather like yours except that (1) it's shorter, (2) it's
easier to read because it doesn't involve artificial
diddling with indices (pointers by another name!), and
(3) it's correct. (Try to unify [a,b,c,d] and [b,c,d,a]
with your code, where a,b,c,d are variables.)
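
For concreteness, here is a binding-list unifier in that spirit (a
sketch, not Norvig's exact code: the ?-prefix convention for variables
and the omitted occurs check are my simplifications):

(defparameter +no-bindings+ '((t . t)))

(defun variable-p (x)
  (and (symbolp x) (char= (char (symbol-name x) 0) #\?)))

(defun unify (x y &optional (bindings +no-bindings+))
  (cond ((null bindings) nil)          ; an earlier failure propagates
        ((eql x y) bindings)
        ((variable-p x) (unify-var x y bindings))
        ((variable-p y) (unify-var y x bindings))
        ((and (consp x) (consp y))
         (unify (cdr x) (cdr y) (unify (car x) (car y) bindings)))
        (t nil)))

(defun unify-var (var val bindings)
  (let ((b (assoc var bindings)))
    (cond (b (unify (cdr b) val bindings)) ; follow VAR's existing binding
          ((and (variable-p val) (assoc val bindings))
           (unify var (cdr (assoc val bindings)) bindings))
          (t (cons (cons var val) bindings)))))

;; (unify '(?a ?b ?c ?d) '(?b ?c ?d ?a)) succeeds: the final ?d/?a pair
;; dereferences through the chain ?a=?b, ?b=?c, ?c=?d back to ?d itself,
;; instead of creating a circular binding.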

>> The same would be just as true in Common Lisp, except that
>> Common Lisp has powerful features for syntactic abstraction --
>> macros and a configurable reader -- so that you can write
>> logic programs in Common Lisp that are nearly as clear and
>> concise as they would be in Prolog. You can find some nice
>> examples in Norvig's excellent book "Paradigms of artificial
>> intelligence programming".
> 
> Sorry, I have no this book. It would be really very interesting for me
> to read it.

I strongly recommend that you do so. It's a first-rate
book for anyone interested in either artificial intelligence
or Lisp. It's a pretty good book for programmers who
aren't much interested in either.

>> As for the performance of arrays versus linked lists -- you
>> *do* know that Common Lisp has arrays too, right?
> 
> Yes, I know. But Common Lisp uses linked lists for the data
> representation in the mentioned problem.

Sorry, what's "the mentioned problem" here? Your theorem-proving
thing? Maybe I just haven't read the CL spec carefully enough;
I don't remember reading anything that says "Using arrays in
theorem-proving code is a violation of the language standard
and will produce undefined results".

>                                          As for the arrays for linear
> algebra and related problems, I would prefer to use Pascal, not Common
> Lisp.

Then it isn't surprising that you find Pascal more to your
taste than Common Lisp. I hope you, in turn, are not surprised
that readers of comp.lang.lisp do not consider "it's Pascal,
not Common Lisp" a major advantage of Pascal over Common Lisp.

>>> Btw, I wrote my first interpreter (for the VIRT language) 7 years ago.
>>> It was badly written, and the interpreter was slow. However it provided
>>> better performance than XLISP for theorem proving. Ok, XLISP is
>>> not Common Lisp, VIRT is not Delphi. Even my present paxScript is
>>> 8-10 times faster than VIRT. So, your conclusions look premature.
>> 
>> I have no idea what conclusions you think look premature
>> because 7 years ago you wrote something that gave you
>> better performance for theorem proving than something
>> else.
> 
> We are discussing the applicability of Pascal+LISPPA for simulating
> logic programming. You state it is not applicable.

Where, please, did I state that it is not applicable?
I said only that if you're going to do logic programming
in a language not originally designed for logic programming,
then Lisp is a better substrate than Pascal. I fail to see
how your example offers any evidence against this.

>                                                       I mentioned
> VIRT, an imperative language with Modula-2 syntax, as a first attempt
> in this direction, and noted that its first results in theorem
> proving were not very bad.

If you think I was claiming that it is impossible to write
reasonably fast theorem-proving code in a Pascal-like language,
then I apologise for my lack of clarity, because I did not
intend to say anything of the sort.

>> No, I realise. But ... you came into comp.lang.lisp, saying
>> "look at my wonderful new thing, which gives all the power
>> of Lisp to C, Pascal and Basic". And, actually, that *is*
>> an attack on Lisp, because it's saying that Lisp doesn't
>> offer anything valuable beyond Pascal + heterogeneous arrays.
> 
> Oh, you have "discovered" me. :-) I really have thought about
> competition. But that is not the same as saying "forget Lisp; your 40
> years of history are one big mistake". I repeat: I'm trying to
> reproduce useful features of Lisp and other languages in my
> experiment. Why do you see only "attack"? I want to learn and
> understand, not fight.

I don't see *only* "attack". But I do see you saying
things that amount to attacks on Lisp, and I don't see
why I shouldn't reply to those. As to whether your stuff
offers some benefit to Pascal programmers -- you need
to talk to Pascal programmers about that, not Lisp
programmers.

> > Which is not only false but laughably false.
> 
> Please add "in my opinion". :-)

OK. It's not only false but, in my opinion, laughably false.
I concede that some people might find it tragically or
pointlessly false and not be inclined to laugh about it.

>> It looks to me as if you have conflicting requirements.
>> On the one hand, you want something that's completely
>> familiar to Pascal programmers, something that's just
>> Pascal with a few little bags on the side. On the other
>> hand, you want something that's just as good as Lisp
>> for solving hard problems. I'm sorry to be the bringer
>> of bad news, but I don't think you can have both at once.
> 
> Let's see. Why not? 

Because there are all sorts of good things about Lisp
that would be impossible for "Pascal with a few little
bags on the side". I've listed several of them already
in this thread.

>> I'm not irritated by the name. I'm irritated only by the fact
>> that you seem to consider that this shallow resemblance to
>> Lisp makes your languages just as effective as Lisp for the
>> things Lisp does well. Believe me, it doesn't.
> 
> I do not consider all possible problems which "Lisp does well". But I
> state that for many problems Pascal+LISPPA (Basic+LISPPA, etc.) can be
> used with the same success, or even better. These problems are:
> knowledge representation, search algorithms, symbolic computations,
> and logic-programming interpreters.

I don't see any evidence to support that statement. It
seems obviously wrong to me, and from what you've said
it doesn't appear that you know enough about Lisp to
evaluate Pascal+LISPPA *in comparison to Lisp* for these
applications or any others.

>> No one is disputing that you can use heterogeneous arrays
>> when writing programs that do symbolic computation. Why
>> would they?
> 
> Sorry, I did not understand your question.

You appeared to think that someone in comp.lang.lisp
is disagreeing with the statement that it's possible to
use heterogeneous arrays when writing programs that do
symbolic computation. I don't have any idea why anyone
would disagree with that statement, which seems obviously
true and obviously not very interesting. The question "Why
would they?" means: "Why would anyone disagree with that
statement?"; I believe the answer is that no one would.
But unless someone has been disagreeing with it, I don't
understand the point of what you wrote:

  |                                                                  But I
  | have thought that the consideration "array vs list" can be interesting
  | for a wide area of symbolic computations and AI applications. Indeed, the
  | term
  | 
  | (f (g x y) p(a)) 
  | 
  | can be considered as list of lists and as array of arrays. This
  | duality has been reflected in the "LISPPA" name.

which doesn't appear to me to say anything beyond "You
can use heterogeneous arrays as a data structure when
writing code for symbolic computation". Well, yes, you
can, and no one would disagree. What's meant to be either
interesting or new about that?

> Besides, you are not quite right in your review.

Do feel free to show me where I am wrong.

-- 
Gareth McCaughan
.sig under construc
From: Pascal Costanza
Subject: Re: LISPPA
Date: 
Message-ID: <c7cs6i$olo$1@newsreader2.netcologne.de>
Gareth McCaughan wrote:

> Alexander Baranovsky wrote:

>>Seems too categorical to be true. I can assure you: Pascal has dynamic
>>type checking. Take a look at the variant types.
> 
> I'm not familiar with the provision for variant types in
> Pascal.

Pascal's variant records (sic!) are more a poor man's subclassing 
facility than an addition of dynamic typing. The idea is this:

TYPE GraphicObjectKind = (CIRCLE, RECTANGLE, TRIANGLE, ...);
     GraphicObject = RECORD
        CASE kind: GraphicObjectKind OF
         CIRCLE: (x, y, diameter: INTEGER);
         RECTANGLE: (x1, y1, x2, y2: INTEGER);
         TRIANGLE: ...
         ...
      END;

The "kind" variable is part of the record structure and can be assigned 
to like any other record element. The other elements are assumed to be 
only accessible when the "kind" variable has the respective value. 
(Pascal compilers use this feature to map the different variants to the 
same memory locations and thus save some memory space.)

(Don't count on the syntax, I have probably mixed up Modula-2 and Pascal 
syntax here - this was a long time ago... ;)

Anyway, this is a really restricted machinery because you can't add any 
more cases to such variant records (i.e., no real subclasses).

And of course, you still have to declare variables of type GraphicObject 
in order to make use of that facility. No dynamic typing.
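
For contrast, here is a minimal Lisp sketch of real dynamic typing (the
function and its cases are invented for illustration):

(defun describe-thing (obj)
  ;; The value itself carries its type; no enclosing record declaration
  ;; or tag field is needed, and new cases are just new types.
  (etypecase obj
    (integer (format nil "an integer: ~d" obj))
    (string  (format nil "a string: ~s" obj))
    (cons    (format nil "a list of ~d elements" (length obj)))))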


Pascal

-- 
1st European Lisp and Scheme Workshop
June 13 - Oslo, Norway - co-located with ECOOP 2004
http://www.cs.uni-bonn.de/~costanza/lisp-ecoop/
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <877jvpi0ql.fsf@g.mccaughan.ntlworld.com>
Pascal Costanza wrote:

[Alexander Baranovsky:]
>>> Seems too categorical to be true. I can assure you: Pascal has dynamic
>>> type checking. Take a look at the variant types.

[me:]
>> I'm not familiar with the provision for variant types in
>> Pascal.

[Pascal:]
> Pascal's variant records (sic!) are more a poor man's subclassing
> facility than an addition of dynamic typing. The idea is this:

[SNIP: discriminated union]

OK. That is, indeed, far from what I would consider real
dynamic typing, and a Pascal programmer whose only exposure
to dynamic typing is through discriminated unions like that
has *not* seen the benefits of proper dynamic typing.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405071320.41f0993a@posting.google.com>
Gareth McCaughan <················@pobox.com> wrote in message news:<··············@g.mccaughan.ntlworld.com>...
> Alexander Baranovsky wrote:
> 
> >> Most things you can do in C, Pascal or Basic you can also
> >> do in Common Lisp with code that runs about as fast. Some
> >> varieties of low-level bit diddling are still easier to
> >> do efficiently in C. (Some are easier to do efficiently
> >> in assembler.) The slowness of Lisp is mostly an urban
> >> legend.
> > 
> > I suppose that each legend is based on a true story :-).
> 
> What an extraordinary supposition.
> 
> >                                                       C or Pascal have
> > no such legends. So a reference to benchmarks would be nice.
> 
> I have no idea what you mean by "C or Pascal have no legends
> regarding it". There was a recent discussion in comp.lang.lisp
> concerning a benchmark called something like "almabench". It
> was a floating-point-intensive program that calculated some
> astronomical thing or other. The original version was in
> Fortran or C or some such language. Someone translated it
> to Common Lisp, and various people in comp.lang.lisp played
> with it a bit, and after a modest amount of tuning we had
> it running *faster* than the C version compiled with gcc.
> None of the tuning was illegitimate -- it didn't involve
> things like simplifying the formulas or avoiding repeated
> computations. Google Groups has all the details. Google
> is your friend.
> 
> I don't know of much serious benchmarking work addressing
> the question of how fast you can make Lisp programs that
> do the same as C or Pascal or Basic ones, if you really
> need to make them fast. My usual experience is that Lisp
> programs run plenty fast enough without needing to take
> any trouble to make them run faster. (This is also my
> experience with C and C++, but not with Perl or Python.)
> It's quite unusual for it to matter enough that I'd even
> notice if there were, say, a factor of 3 one way or the
> other.

Thanks. I'll consider these benchmarks.

Ok. Now let's consider

(defun my-add (x y)
  (+ x y))

How does a Common Lisp compiler compile it? If you pass (2 2) as parameters,
it is integer addition. If you pass (2.2 2.3), it is float addition.
If you pass (2 2.3) then there should be a data type conversion. Does
Common Lisp use only float numbers in the implementation? Sorry for the
trivial question, but I really do not understand how the absence of
declaration of variables can lead to more effective programs?

> 
> >> It's true that Lisp programs are usually more memory-hungry
> >> than programs written in C or Pascal. As against that, they
> >> usually have considerably fewer memory-related bugs. Memories
> >> are so big these days that memory use only matters much for
> >> large programs, and it's not clear that *large* Lisp programs
> >> are much more memory-hungry than *large* programs in weaker
> >> languages.
> > 
> > There are a few tools which allow decreasing the footprint of a program
> > in "weak languages": iteration instead of recursion and "local"
> > memory deallocation (at least). The "local" memory deallocation can
> > include:
> > 
> > - manual deallocation
> > - variable lifetimes
> > - "semantic" deallocation (for example, reduced assignments in
> > LISPPA)
> > 
> > I do not see why the use of the last two leads to writing less clear
> > programs.
> 
> I don't recall saying that it does. I think the ability
> to make an object get deallocated as soon as it goes
> out of scope is a useful one, though I'm not sure it's
> useful enough that I'd rather have it than Lisp's
> guarantee that dangling pointers can't happen.
> 
> Manual deallocation certainly does make things less clear.

I agree with the last statement.

> 
> >>>> Compile-time type checking is nice. But, at least when
> >>>> comparing Lisp with languages like C, Pascal and Basic,
> >>>> it just isn't worth the pain. The small gain in safety
> >>>> and convenience is much, much more than offset by the
> >>>> verbosity, the inconvenience whenever anything needs
> >>>> to be changed, and the near-impossibility of interactive
> >>>> use. With the time dynamic typing saves me, I can put
> >>>> more care into checking the safety of my code in other
> >>>> ways. The bugs that my C compiler catches for me are
> >>>> the trivial, shallow ones that I'd have caught myself
> >>>> anyway. 
> >>> 
> >>> Any Pascal programmer will testify that compile-time error
> >>> checking is a huge advantage. Not only type checking but also the
> >>> explicit declaration of variables allows avoiding many problems
> >>> related to scope, lifetime, etc. These "trivial" bugs are dangerous.
> >> 
> >> "Any Pascal programmer"? The huge majority of Pascal programmers
> >> are not in a position to have an educated opinion on whether
> >> static type checking and explicit type declarations make for
> >> better programs than dynamic type checking and type inference,
> >> because the huge majority of Pascal programmers have never used
> >> any language with dynamic type checking and type inference for
> >> anything other than trivial tasks. (I suspect most of them
> >> have *never* used one at all.)
> > 
> > Seems too categorical to be true. I can assure you: Pascal has dynamic
> > type checking. Take a look at the variant types.
> 
> I'm not familiar with the provision for variant types in
> Pascal. Or, more precisely, in some versions of Pascal;
> I've definitely used versions without any such provision.
> If it's the case that (1) a substantial fraction of Pascal
> programmers are using versions of Pascal that support
> variant types, and (2) those versions of Pascal support
> variant types in a way that's transparent enough to count
> as "the language having dynamic type checking", then indeed
> I have erred and I retract my conjecture that most Pascal
> programmers have never used any language with dynamic
> type checking.
> 
> >                                                  As for the type
> > inference, any Pascal compiler does this at compile time.
> 
> If you believe that, then you do not understand what I mean
> by "type inference". Type inference is where the compiler
> works out the types of variables, the signatures of functions,
> and so on, *without* you having to tell it explicitly. Many
> Common Lisp implementations do a certain amount of type
> inference, so as to optimize better. There are also statically
> typed languages in which scarcely any explicit type declarations
> are needed, because they have powerful type inference systems.
> Haskell and ML are good examples.

Yes, I know. I'm using a sort of type inference at run time in
paxScript. VBScript and JScript use type inference at
compile-time. But please note that VBScript has been replaced with
VB.NET, and JScript is replaced with JScript.NET. Both languages
support strict data typing, and MSDN states that strict data typing
is preferable to type inference. I'm not going to agitate for
Microsoft solutions; however, I consider this a reason for serious
consideration. I like some JavaScript ideas; btw I guess that the
JavaScript authors were influenced by Lisp: data abstraction,
functions as data objects, the Eval function. But I would never think
of using JavaScript for serious programming.
 
> > I mean the representation of terms as "term" + "parameters".
> > 
> > Let's consider T term
> > 
> >  (f(x, y), g(x))
> > 
> > You see, the x variable appears twice. To replace x with z (it is
> > part of the theorem proving algorithm), you have to traverse the tree
> > in Lisp.
> > 
> > In paxPascal you can represent the T term as
> > 
> > (f(p1, p2), g(p1)), where p1 is an alias of x, p2 is an alias of y. So
> > (x,y) represents the parameters of T.
> > 
> > To rename the variables of T, you just have to rename the parameters.
> > So you have a linear algorithm instead of an exponential one. This
> > representation also simplifies and speeds up the unification algorithm
> > - the heart of any logic programming system.
> 
> I cannot imagine what makes you think that you can't do
> the same in Lisp. (You can.) I do not believe there is
> anything in a logic programming system that takes linear
> time when implemented in Pascal+LISPPA and that cannot
> be cleanly implemented in Lisp so as to still take linear
> time. (I don't doubt that you can write a bad implementation
> in Lisp that takes exponential time. That isn't Lisp's
> fault.)

> 
> You might want to find a copy of "Paradigms of Artificial
> Intelligence Programming" and look at section 11.6, where
> you will find an implementation of unification that looks
> rather like yours except that (1) it's shorter, (2) it's
> easier to read because it doesn't involve artificial
> diddling with indices (pointers by another name!), and
> (3) it's correct. (Try to unify [a,b,c,d] and [b,c,d,a]
> with your code, where a,b,c,d are variables.)

(1) Sorry again, I do not have the mentioned book, but I have another
book, by George Luger. It contains a big chapter regarding this subject
and the unification algorithm in Lisp. It is not any shorter. Besides,
it uses a tree traversal to substitute variables.

paxPascal:

function Unify(N: Integer; T1, T2: Variant): boolean;
var
  I: Integer;
  P1, P2: Variant;
begin
  result := true;
  for I:=N to T1.length - 1 do
  begin
    P1 := @ T1[I];
    P2 := @ T2[I];
    if IsCompound(P1) and IsCompound(P2) then result := Unify(0, P1, P2)
    else if IsVar(P1) and (not Inside(P1, P2)) then  P1^ := @ P2^
    else if IsVar(P2) and (not Inside(P2, P1)) then  P2^ := @ P1^
    else result := P1 = P2;
    if not result then
      Exit;
  end;
end;

I do not understand why it is not short. Why is it not clear enough?

Once again: take a look at the mentioned book. The author introduces
all the important algorithms in pseudo-Pascal. Ok. Then he says in the
next-to-last chapter: "Now let's consider languages for AI: Prolog
and Lisp". It is not quite clear to me: why not use real Pascal
(extended with polymorphic arrays) instead of pseudo-Pascal? Why not
say "there are also other languages for AI programming: Prolog
and Lisp"?

:-)

(2) A term is a data structure (a tree) represented as a polymorphic
array. I do not understand why array indices are:

a) "pointers".
b) "artificial diddling".

(3) Sorry, I did not understand that. If a, b, c, d are variables, you
have to rename the terms [a,b,c,d] and [b,c,d,a] before unification.
What is wrong?

It would be nice to see your linear algorithm in Common Lisp. The
algorithm in itself does not matter much to me, but I'm interested in
seeing the expressive features of the language.

A.
From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040508040819.GS25328@mapcar.org>
On Fri, May 07, 2004 at 02:20:14PM -0700, Alexander Baranovsky wrote:
> Thanks. I'll consider these benchmarks.
> 
> Ok. Now let's consider
> 
> (defun my-add (x y)
>   (+ x y))
> 
> How does a Common Lisp compiler compile it? If you pass (2 2) as parameters,
> it is integer addition. If you pass (2.2 2.3), it is float addition.
> If you pass (2 2.3) then there should be a data type conversion. Does
> Common Lisp use only float numbers in the implementation? Sorry for the
> trivial question, but I really do not understand how the absence of
> declaration of variables can lead to more effective programs?
                                            ^^^^^^^^^
		``I do not think this word means what you think it means''
		(sorry, had to say it =)

In this case, the function would have to be declared inline for the
compiler to really be able to do much with it.  In which case, it can
figure out what the types of the parameters are and optimize the +
instruction.  Either that, or you could add annotations to declare the
types.  Though with such a short function you would normally inline it,
of course.
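
For instance (a sketch; inlining via DECLAIM is one way to do it, not
the only one):

;; Allow the compiler to open-code MY-ADD at each call site, where it
;; can then use whatever type information is available there.
(declaim (inline my-add))
(defun my-add (x y) (+ x y))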

> Yes, I know. I'm using a sort of type inference at run time in
> paxScript. VBScript and JScript use type inference at
> compile-time. But please note that VBScript has been replaced with
> VB.NET, and JScript is replaced with JScript.NET. Both languages
> support strict data typing, and MSDN states that strict data typing
> is preferable to type inference.

Once again, you have no clue what you are talking about.  Let me repeat
that, to be sure that got through: YOU HAVE NO CLUE WHAT YOU ARE TALKING
ABOUT.

Okay, now that that's out of the way, let me describe why:

Type inference is a method to decide the types of variables from context
without (or with little) type annotation, STATICALLY.  Yes, that means
that for example the function in SML:

fun add1 x = x + 1

Does not need type annotations because the compiler can figure out that
you are saying, statically (at compile-time),

fun add1 (x:int) = (x + 1):int

The type system has been designed explicitly so this and parametric
polymorphism can be supported.  Now I suppose you probably have no clue
what parametric polymorphism is either, but I haven't got the time to
explain all this so I recommend that you google search for "Hindley
Milner type system" or "ML type system" and try to learn a few things.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <2g2mihF3tasaU1@uni-berlin.de>
Alexander Baranovsky wrote:
> It would be nice to see your linear algorithm in Common Lisp. The
> algorithm in itself does not matter much to me, but I'm interested in
> seeing the expressive features of the language.

Maybe you want to take a look at this: http://www.norvig.com/paip/unify.lisp
It's from Peter Norvig's book. I didn't take a thorough look at it yet
(haven't reached that chapter yet ;-), so I can't say anything about its
efficiency. It's from his Prolog interpreter. He also has written a Prolog
compiler which is, of course, faster.

Sorry, I can't give you any qualified statements about it, but I'm sure
many others here can.

regards

Thomas
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405082020.3d8964d5@posting.google.com>
Thomas Schilling <······@yahoo.de> wrote in message news:<··············@uni-berlin.de>...
> Alexander Baranovsky wrote:
> > It would be nice to see your linear algorithm in Common Lisp. The
> > algorithm in itself does not matter much to me, but I'm interested in
> > seeing the expressive features of the language.
> 
> Maybe you want to take a look at this: http://www.norvig.com/paip/unify.lisp
> It's from Peter Norvig's book. I didn't take a thorough look at it yet
> (haven't reached that chapter yet ;-), so I can't say anything about its
> efficiency. It's from his Prolog interpreter. He also has written a Prolog
> compiler which is, of course, faster.
> 
> Sorry, I can't give you any qualified statements about it, but I'm sure
> many others here can.
> 
> regards
> 
> Thomas

Thank you very much!

I also have found a nice article

http://www.cs.bham.ac.uk/research/poplog/paradigms_lectures/lecture20.html#unification

which describes the unification algorithm in Scheme. I guess that
Peter Norvig's book presents the same idea of unification.

It looks nicer than the algorithm from George Luger's book. Besides, it
is closer to the approach which I'm using.

A few comments.

1. I do not use the binding list. The bindings are expressed by means
of aliases in my approach. Ok, I use an extra structure which contains
the parameters of a term, but this approach provides a linear algorithm
for renaming variables.
2. I do not use search in the binding list.
3. I use one function Unify instead of two functions: unify and
unify-variable.
4. I do not use the mutual recursion (of unify and unify-variable) in
my algorithm, so it is much easier to get a non-recursive version
of the algorithm.
5. My algorithm is more concise and easier to understand.
 
A.
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <opr7qix7ddtrs3c0@news.CIS.DFN.DE>
Alexander Baranovsky wrote:
> http://www.cs.bham.ac.uk/research/poplog/paradigms_lectures/lecture20.html#unification
>
> which describes the unification algorithm in Scheme. I guess that
> Peter Norvig's book presents the same idea of unification.

> 1. I do not use the binding list. The bindings are expressed by means
> of aliases in my approach. Ok, I use an extra structure which contains
> the parameters of a term, but this approach provides a linear algorithm
> for renaming variables.
> 2. I do not use search in the binding list.
> 3. I use one function Unify instead of two functions: unify and
> unify-variable.
> 4. I do not use the mutual recursion (of unify and unify-variable) in
> my algorithm, so it is much easier to get a non-recursive version
> of the algorithm.
> 5. My algorithm is more concise and easier to understand.

Here's the version from Norvig's compiler:

(defstruct var name (binding unbound))
;; in pascal this is like:
;; type
;;  Var = record
;;    name: ?;
;;    binding: ^var; (* default value = unbound *)
;;  end;
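;; (UNBOUND is a distinguished marker value; Norvig's file defines it
;; as a constant, so freshly made VARs start out unbound.)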

(defun bound-p (var) (not (eq (var-binding var) unbound)))
;; (var-binding var) would be var.binding

(defmacro deref (exp)
   "Follow pointers for bound variables."
   `(progn (loop while (and (var-p ,exp) (bound-p ,exp))
              do (setf ,exp (var-binding ,exp)))
           ,exp))
;; this is a macro, although in this case you can
;; remove all the "`" and "," and consider it as a function
;; being inlined in the actual code

(defun unify! (x y)
   "Destructively unify two expressions"
   (cond ((eql (deref x) (deref y)) t)
         ((var-p x) (set-binding! x y))
         ((var-p y) (set-binding! y x))
         ((and (consp x) (consp y))
          (and (unify! (first x) (first y))
               (unify! (rest x) (rest y))))
         (t nil)))

(defun set-binding! (var value)
   "Set var's binding to value.  Always succeeds (returns t)."
   (setf (var-binding var) value)
   t)
;; (setf (var-binding var) value) means var.binding := value;

So, now concerning your remarks:
1) the same (see the macro)
2) the same
3) the same
4) the same. Also: recursion need not be bad.
5) well, that's a matter of taste (and getting used to).
    I tried to rewrite these functions in Pascal style and I really 
preferred the way they stand here ... all these "begin"s and "end"s and 
"if"+"then". It also took a while for me to get used to the parentheses, 
but now I don't really see them anymore. Instead it looks to me like 
cleaned-up pseudo-code.

But let's get back to why I (just like many here) think Lisp is superior 
to Pascal/Basic/C/... even with your LISPPA extension (in no special order):

a) Lisp has first-class functions with closures. A nice example is:

(defun is (val)
   "Returns a function that returns true if its parameter is equal to val."
   #'(lambda (x) (eq val x)))

You can then easily pass this function to a generalized search function as 
the parameter for search termination. (BTW: it's also from Norvig's book.)
This is a feature that the languages you seem to prefer don't have. Well, 
of course, you might not miss them if you've never used them. But that's 
normal. I think once you start using them you'll see the difference. 
(Ok, you can probably live without them - but that's not the point. The 
point is whether they can ease your life, and they definitely do that.)
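
For example (a hypothetical call):

(find-if (is 'x) '(a b x c))   ; => X, the first element EQ to 'x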

b) macros:

I already told you that they're great. You can really create your own 
special domain language without much effort. In particular, they save you 
a lot of pure typing work (which is error-prone). Other nice cases are 
code optimization macros (macros that optimize special code patterns) or 
reader macros (although they're a quite different kind of macro - actually 
they're hooks into the read function to allow special custom syntaxes).
Again I'll give you an example:

(with-html-output (file)
   (:html
     (:body (:bgcolor "blue")
       (:p "some example paragraph")
       (:table
         (dotimes (i 5)
           (htm (:tr (:td (print i)) (:td (print (* i i))))))))))

This will print the following into the specified file:
"<html><body bgcolor="blue"><p>Some example paragraph</p>
<table>
<tr><td>0</td><td>0</td></tr>
<tr><td>1</td><td>1</td></tr>
<tr><td>2</td><td>4</td></tr>
<tr><td>3</td><td>9</td></tr>
<tr><td>4</td><td>16</td></tr>
</table></body></html>"

So this is a nice example of a sub-language and a "code-compression" macro, 
because you could, of course, have written it out as a bunch of print or 
format statements (oops, I mean expressions, since Lisp doesn't have 
statements ;-). BTW: this macro is really quite simple in its 
implementation. Edi Weitz has written something like this, see 
http://www.weitz.de/cl-who/.

The power of Lisp macros is directly related to Lisp's syntax (code = 
data, i.e. code = syntax tree). There are macro extensions for other 
languages, but they're much less powerful and full of special cases. 
Nevertheless, writing good macros is an art of its own, although 
definitely worth learning ;-).

c) You can change a running program. This is especially useful for 
long-running programs like servers or applications in, say, bioinformatics.
    If you encounter a bug, your program gets interrupted. Now you can spot 
the error, redefine the buggy function and try to continue the program. If 
it still won't work you can continue debugging. Of course a server won't 
respond while you're debugging it, but OTOH no information is lost 
and a shutdown+restart would probably be more annoying. (There are still 
cases where a running program cannot continue running, but most chances 
for hard crashes are removed, especially pointer errors.)

d) REPL or Incremental Compilation.
    This is the Read-Eval-Print Loop. You probably already heard about it. 
It's quite comparable to d). You can modify your running program by 
replacing functions, but you can also _extend_ your (running) program. 
Well, normally your program doesn't run all the time, but in general you 
only add your new functions instead of recompiling all your functions (the 
CBRL - Compile-Build-Run-Loop ;-).
    This is one way to use the REPL; the other way is to first test your 
newly written function and then add it to your running program. This is 
the main way of programming in Lisp (I think; at least it's mine).
    Many languages and IDEs tend towards this style of development. So this 
is no longer purely a Lisp feature, but IMO Lisp still does it best. (I 
recently read an article about Jython -- Python compiled to Java -- which 
has a REPL, but the basic message of the article was that you can use it 
to easily test new functions instead of using it as your development 
style. Well, ok, at least they see half of the benefit ;-)

e..?) There are still many other features that are really nice, but I found 
these to be the most convincing. Other good features are its nice (superior?) 
object system (CLOS), powerful error handling, many built-in data types 
and functions, garbage collection, etc. etc.

All these features enable you to write your programs more effectively (in 
terms of development time) - bringing back the fun in programming. At least 
it worked for me :-)

Maybe that helps you understand why many here cannot understand why you 
think Pascal/C/Basic, even with your LISPPA extension, would be comparable 
to Lisp.

Perhaps you have other arguments.
Perhaps you'll try out Lisp ;-)

regards

Thomas
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <opr7qjybjptrs3c0@news.CIS.DFN.DE>
Oh, sorry. Some errata here:

>> http://www.cs.bham.ac.uk/research/poplog/paradigms_lectures/lecture20.html#unification
>>
>> which describes the unification algorithm in Scheme. I guess that
>> Peter Norvig's book presents the same idea of unification.

Agreed.

> Here's the version from Norvig's compiler:

The link is: http://www.norvig.com/paip/prologc.lisp

> d) REPL or Incremental Compilation.
>     This is the Read-Eval-Print Loop. You probably already heard about 
> it. It's quite comparable to d). [...]
                               ^^^^
                               must be c).
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405100203.e04a185@posting.google.com>
Thomas Schilling <······@yahoo.de> wrote in message news:<················@news.CIS.DFN.DE>...
> Alexander Baranovsky wrote:
> > http://www.cs.bham.ac.uk/research/poplog/paradigms_lectures/lecture20.html#unification
> >
> > which describes the unification algorithm in Scheme. I guess that
> > Peter Norvig's book presents the same idea of unification.
>  
> > 1. I do not use the binding list. The bindings are expressed by means
> > of aliases in my approach. Ok, I use an extra structure which contains
> > the parameters of a term, but this approach provides a linear algorithm
> > for renaming variables.
> > 2. I do not use search in the binding list.
> > 3. I use one function Unify instead of two functions: unify and
> > unify-variable.
> > 4. I do not use the mutual recursion (of unify and unify-variable) in
> > my algorithm, so it is much easier to get a non-recursive version
> > of the algorithm.
> > 5. My algorithm is more concise and easier to understand.
> 
> Here's the version from Norvig's compiler:
> 
> (defstruct var name (binding unbound))
> ;; in pascal this is like:
> ;; type
> ;;  Var = record
> ;;    name: ?;
> ;;    binding: ^var; (* default value = unbound *)
> ;;  end;
> 
> (defun bound-p (var) (not (eq (var-binding var) unbound)))
> ;; (var-binding var) would be var.binding
> 
> (defmacro deref (exp)
>    "Follow pointers for bound variables."
>    `(progn (loop while (and (var-p ,exp) (bound-p ,exp))
>               do (setf ,exp (var-binding ,exp)))
>            ,exp))
> ;; this is a macro, although in this case you can
> ;; remove all the "`" and "," and consider it as a function
> ;; being inlined in the actual code
> 
> (defun unify! (x y)
>    "Destructively unify two expressions"
>    (cond ((eql (deref x) (deref y)) t)
>          ((var-p x) (set-binding! x y))
>          ((var-p y) (set-binding! y x))
>          ((and (consp x) (consp y))
>           (and (unify! (first x) (first y))
>                (unify! (rest x) (rest y))))
>          (t nil)))
> 
> (defun set-binding! (var value)
>    "Set var's binding to value.  Always succeeds (returns t)."
>    (setf (var-binding var) value)
>    t)
> ;; (setf (var-binding var) value) means var.binding := value;
> 
> So, now concerning your remarks:
> 1) the same (see the macro)
> 2) the same
> 3) the same
> 4) the same. Also: recursion need not be bad.
> 5) well, that's a matter of taste (and getting used to).

Please let me explain why I do not quite agree with you. The key to the
problem is the different representation of bindings. In the Lisp
program, the bindings are represented as an associative list which
contains (variable, value) pairs. In my approach the binding is expressed
by means of an alias, so the variable is just an alias of the value. In
the implementation it contains the address of the value. So my approach
is a bit less expensive in memory, and it promises faster work because
searching for the variable in the binding list is not necessary.

Let's assume, we have two terms:

t1 = L(x, x)
t2 = L(f(y), f(a))

I represent these terms as

t1 = [L, PX, PX]
t2 = [L, [F, PY], [F, A]]

where PX and PY are parameters of terms t1 and t2 respectively. PX and
PY are aliases of X and Y, so the terms "look" like

t1 = [L, X, X]
t2 = [L, [F, Y], [F, A]]

in any expression.

The line

 else if IsVar(T1[I]) and (not Inside(T1[I], T2[I])) then T1[I]^ := @ T2[I]^

of the Unify function will make X (the parameter of t1, not the actual
item of T1!) an alias of [F, Y], so now the terms "look" like

t1 = [L, [F, Y], [F, Y]]
t2 = [L, [F, Y], [F, A]]

Please note, we have not made actual substitutions; we have just
executed the simple operation of creating an alias.

The Unify algorithm continues its work; now it considers the third
item of t1 as the compound term [F, Y], not the variable X. Once again,
the line

 else if IsVar(T1[I]) and (not Inside(T1[I], T2[I])) then T1[I]^ := @ T2[I]^

will make the variable Y an alias of A.

As you can see, the Unify function acts quite straightforwardly. At
the end of its work the terms "look" like

t1 = [L, [F, A], [F, A]]
t2 = [L, [F, A], [F, A]]

and the Unify function returns true. Note that the actual values of the
items of t1 and t2 have not been changed; we did not make actual
substitutions. Only the parameters of t1 and t2 have been changed.

I'll answer the rest of your message later today.

A.
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <2g9bs8F4jgjU1@uni-berlin.de>
Alexander Baranovsky wrote:
>> (defmacro deref (exp)
>>    "Follow pointers for bound variables."
>>    `(progn (loop while (and (var-p ,exp) (bound-p ,exp))
>>               do (setf ,exp (var-binding ,exp)))
>>            ,exp))
[...]
>> (defun unify! (x y)
>>    "Destructively unify two expressions"
>>    (cond ((eql (deref x) (deref y)) t)
>>          ((var-p x) (set-binding! x y))
>>          ((var-p y) (set-binding! y x))
>>          ((and (consp x) (consp y))
>>           (and (unify! (first x) (first y))
>>                (unify! (rest x) (rest y))))
>>          (t nil)))
[...]
> Please let me explain why I do not quite agree with you. The key to the
> problem is the different representation of bindings. In the Lisp
> program, the bindings are represented as an associative list which
> contains (variable, value) pairs. In my approach the binding is expressed
> by means of an alias, so the variable is just an alias of the value. In
> the implementation it contains the address of the value. So my approach
> is a bit less expensive in memory, and it promises faster work because
> searching for the variable in the binding list is not necessary.

That's exactly what Peter Norvig does. The items for variables are structs
that have a field called var-binding associated with them. Ain't that an
alias? There is no longer an assoc list here.

The while loop in deref is executed exactly once in the normal case. Only if
you have a variable that is bound to another variable do you have to
dereference again. Oh, and if it's an expression the while loop won't be
executed at all. So I doubt this is inefficient.
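
For instance (a hypothetical chain built with the structs above):

(let* ((b (make-var :name 'b :binding 42))
       (a (make-var :name 'a :binding b)))
  (deref a))   ; the loop runs twice, following a -> b -> 42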

The rest is just like your code.
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87hduqfsod.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

> Ok. Now let's consider
> 
> (defun my-add (x y)
>   (+ x y))
> 
> How does a Common Lisp compiler compile it? If you pass (2 2) as parameters,
> it is integer addition. If you pass (2.2 2.3), it is float addition.
> If you pass (2 2.3) then there should be a data type conversion. Does
> Common Lisp use only float numbers in the implementation? Sorry for the
> trivial question, but I really do not understand how the absence of
> declaration of variables can lead to more effective programs?

If you define it like that, then indeed a CL compiler will
produce code that checks the types of the arguments and
acts accordingly. But that isn't the whole story.

1. Common Lisp has an amazingly cool feature: *optional*
   type declarations. When you have a function whose
   argument and/or return types are known, and you
   need that function to execute quickly, you can tell
   the compiler those types and it can make use of that
   information.

2. Common Lisp compilers can do *type inference*, which
   means that they can work out some of the types without
   having to be told. In the example you gave above, of
   course, there's nothing to make that possible, but in
   real code there are usually plenty of opportunities.

3. You can tell the CL compiler that it's allowed to inline
   the MY-ADD function. Then it can compile uses of that
   function in the same way as it would compile calls to
   the + function, and in particular it can use any type
   information it has at each individual call site to
   avoid redundant type checking or dispatching.
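
To make (1) and (3) concrete, here is a minimal sketch (the fixnum
declarations are illustrative assumptions about this particular use,
not something the language requires):

(declaim (inline my-add))        ; point 3: permit inlining
(defun my-add (x y)
  (declare (fixnum x y))         ; point 1: optional type declarations
  (the fixnum (+ x y)))          ; promises a fixnum result to the compiler

;; A caller compiled in the scope of these declarations can use a raw
;; machine addition instead of the generic, type-dispatching +.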

>>>                                                  As for the type
>>> inference, any Pascal compiler does this at compile time.
>> 
>> If you believe that, then you do not understand what I mean
>> by "type inference". Type inference is where the compiler
>> works out the types of variables, the signatures of functions,
>> and so on, *without* you having to tell it explicitly. Many
>> Common Lisp implementations do a certain amount of type
>> inference, so as to optimize better. There are also statically
>> typed languages in which scarcely any explicit type declarations
>> are needed, because they have powerful type inference systems.
>> Haskell and ML are good examples.
> 
> Yes, I know. I'm using a sort of type inference at run time in
> paxScript.

Type inference is not something that happens at run time.
It happens at *compile time*. It lets the compiler avoid
doing type checking or type dispatch at run time, because
it's already worked out at compile time what the relevant
types are. There are some languages where typing is completely
static, but where most type declarations can be omitted
because of type inference. Common Lisp is not such a language,
but good CL compilers (of which there are several) can still
eliminate a lot of type checking and type dispatching by
doing type inference at compile time.
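
A small sketch of the kind of thing I mean (whether a given compiler
actually elides the checks is implementation-dependent):

(defun sum-below (n)
  (declare (fixnum n))
  (let ((acc 0))
    (dotimes (i n acc)   ; a compiler can infer I is a non-negative fixnum,
      (incf acc i))))    ; so much of the checking on (+ acc i) goes away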

>            VBScript and JScript use type inference at
> compile-time. But please note that VBScript has been replaced with
> VB.NET, and JScript is replaced with JScript.NET. Both languages
> support strict data typing, and MSDN states that strict data typing
> is preferable to type inference.

"Strict data typing" and "type inference" are not opposed.
Not even "strict static data typing" (which is what I think
you meant) and "type inference" are opposed.

>                                           I'm not going to agitate for
> Microsoft solutions; however, I consider this a reason for serious
> consideration. I like some JavaScript ideas; btw I guess that the
> JavaScript authors were influenced by Lisp: data abstraction,
> functions as data objects, the Eval function. But I would never think
> of using JavaScript for serious programming.

I'm not sure whether this is meant to be some sort of side-swipe
at Lisp (JavaScript has some Lispy features; JavaScript is not
appropriate for serious programming; therefore Lisp isn't either).
If it is, the implied logic is badly broken. If it isn't, then
I've no idea what point you're making by bringing in JavaScript
here.

>> You might want to find a copy of "Paradigms of Artificial
>> Intelligence Programming" and look at section 11.6, where
>> you will find an implementation of unification that looks
>> rather like yours except that (1) it's shorter, (2) it's
>> easier to read because it doesn't involve artificial
>> diddling with indices (pointers by another name!), and
>> (3) it's correct. (Try to unify [a,b,c,d] and [b,c,d,a]
>> with your code, where a,b,c,d are variables.)
> 
> (1) Sorry again, I do not have the mentioned book, but I have another
> book, by George Luger.

I don't have George Luger's book, but I do have "The Lord of the Rings"
by J R R Tolkien. Is it OK if I assess what you're saying on the basis
of that instead of the book you're actually talking about?

>                       It contains a big chapter regarding this subject
> and the unification algorithm in Lisp. It is not any shorter. Besides,
> it uses a tree traversal to substitute variables.

So Luger either doesn't know how to do the job efficiently
or, more likely, doesn't care about doing so because his
purpose is just to illustrate the principles and not to
provide an efficient program.

> paxPascal:
> 
> function Unify(N: Integer; T1, T2: Variant): boolean;
> var
>   I: Integer;
>   P1, P2: Variant;
> begin
>   result := true;
>   for I:=N to T1.length - 1 do
>   begin
>     P1 := @ T1[I];
>     P2 := @ T2[I];
>     if IsCompound(P1) and IsCompound(P2) then result := Unify(0, P1, P2)
>     else if IsVar(P1) and (not Inside(P1, P2)) then  P1^ := @ P2^
>     else if IsVar(P2) and (not Inside(P2, P1)) then  P2^ := @ P1^
>     else result := P1 = P2;
>     if not result then
>       Exit;
>   end;
> end;
> 
> I do not understand why it is not short. Why is it not clear enough?

It's quite short, even though it would be shorter in Lisp.
It's less clear than it could be because (for reasons that
presumably relate to the way this function is used in the
rest of the program) you have this parameter N to let
unification commence somewhere other than at the start
of the two terms. One advantage of Lisp-style lists is
that you can work directly with a sublist that starts in
the middle of another list and goes on to the end, without
needing to pass the original list and an index into it.
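
For instance, unifying everything from the third element on is just

  (unify (nthcdr 2 term1) (nthcdr 2 term2))

and NTHCDR shares structure with the original list rather than copying it.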

> (2) A term is a data structure (a tree) represented as a polymorphic
> array. I do not understand why array indices are:
> 
> a) "pointers".
> b) "artificial diddling".

(a) There isn't really much difference between an array index
and a pointer pointing into the middle of the array.

(b) The fact that whatever you're doing with unification
requires your "unify" function to have an interface not
in terms of "unify these two terms" but of "unify certain
subsequences of these two terms, starting at the Nth element
of each" looks to me pretty artificial.

> (3) Sorry, I did not understand that. If a, b, c, d are variables, you
> have to rename the terms [a,b,c,d] and [b,c,d,a] before unification.
> What is wrong?

Nothing, if [a,b,c,d] and [b,c,d,a] were handed in at the top level,
so to speak. But suppose they got that way in the course of doing
a larger unification?

> It would be nice to see your linear algorithm in Common Lisp. The
> algorithm in itself does not matter much to me, but I'm interested in
> seeing the expressive features of the language.

It isn't mine. Someone else has already pointed you at the code
from Norvig's book.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405082053.59199dfa@posting.google.com>
Gareth McCaughan <················@pobox.com> wrote in message news:<··············@g.mccaughan.ntlworld.com>...
> Alexander Baranovsky wrote:
> 
> > Ok. Now let's consider
> > 
> > (defun my-add (x y)
> >   (+ x y))
> > 
> > How does a Common Lisp compiler compile it? If you pass (2 2) as parameters,
> > it is integer addition. If you pass (2.2 2.3), it is float addition.
> > If you pass (2 2.3) then there should be a data type conversion. Does
> > Common Lisp use only float numbers in the implementation? Sorry for the
> > trivial question, but I really do not understand how the absence of
> > declaration of variables can lead to more effective programs?
> 
> If you define it like that, then indeed a CL compiler will
> produce code that checks the types of the arguments and
> acts accordingly. But that isn't the whole story.
> 
> 1. Common Lisp has an amazingly cool feature: *optional*
>    type declarations. When you have a function whose
>    argument and/or return types are known, and you
>    need that function to execute quickly, you can tell
>    the compiler those types and it can make use of that
>    information.

Ok, it is a solution. (Although "amazingly cool" is emotional, because
many other languages use "optional" type declarations :-).

> 
> 2. Common Lisp compilers can do *type inference*, which
>    means that they can work out some of the types without
>    having to be told. In the example you gave above, of
>    course, there's nothing to make that possible, but in
>    real code there are usually plenty of opportunities.

I understood.

> 
> 3. You can tell the CL compiler that it's allowed to inline
>    the MY-ADD function. Then it can compile uses of that
>    function in the same way as it would compile calls to
>    the + function, and in particular it can use any type
>    information it has at each individual call site to
>    avoid redundant type checking or dispatching.

I agree. (My example is not very general, so it does indeed allow the
inline version.)

> 
> >>>                                                  As for the type
> >>> inference, any Pascal compiler does this at compile time.
> >> 
> >> If you believe that, then you do not understand what I mean
> >> by "type inference". Type inference is where the compiler
> >> works out the types of variables, the signatures of functions,
> >> and so on, *without* you having to tell it explicitly. Many
> >> Common Lisp implementations do a certain amount of type
> >> inference, so as to optimize better. There are also statically
> >> typed languages in which scarcely any explicit type declarations
> >> are needed, because they have powerful type inference systems.
> >> Haskell and ML are good examples.
> > 
> > Yes, I know. I'm using a sort of type inference at run time in
> > paxScript.
> 
> Type inference is not something that happens at run time.

:-)

I understand the "standard" treatment of the concept. But I also read
"type inference" literally, as the inference of a type, word for word.
A paxScript program is able to change itself at run time and replace
"general" operators with "detailed" operators.

> It happens at *compile time*. It lets the compiler avoid
> doing type checking or type dispatch at run time, because
> it's already worked out at compile time what the relevant
> types are. There are some languages where typing is completely
> static, but where most type declarations can be omitted
> because of type inference. Common Lisp is not such a language,
> but good CL compilers (of which there are several) can still
> eliminate a lot of type checking and type dispatching by
> doing type inference at compile time.
> 
> >            VBScript and JScript use type inference at
> > compile-time. But please note that VBScript has been replaced with
> > VB.NET, and JScript is replaced with JScript.NET. Both languages
> > support strict data typing, and MSDN states that strict data typing
> > is preferable to type inference.
> 
> "Strict data typing" and "type inference" are not opposed.
> Not even "strict static data typing" (which is what I think
> you meant) and "type inference" are opposed.

I never stated the contrary. In VB.NET these concepts coexist. But the
presence of explicit declarations can help in cases where type
inference is not applicable.

> 
> >                                           I'm not going to agitate for
> > Microsoft solutions; however, I consider this a reason for serious
> > consideration. I like some JavaScript ideas; btw I guess that the
> > JavaScript authors were influenced by Lisp: data abstraction,
> > functions as data objects, the Eval function. But I would never think
> > of using JavaScript for serious programming.
> 
> I'm not sure whether this is meant to be some sort of side-swipe
> at Lisp (JavaScript has some Lispy features; JavaScript is not
> appropriate for serious programming; therefore Lisp isn't either).
> If it is, the implied logic is badly broken. If it isn't, then
> I've no idea what point you're making by bringing in JavaScript
> here.

Again you do not understand me quite correctly. I am not trying to say
that Lisp is a bad language. I just say that I have a choice, and I
prefer strictly typed languages.

> 
> >> You might want to find a copy of "Paradigms of Artificial
> >> Intelligence Programming" and look at section 11.6, where
> >> you will find an implementation of unification that looks
> >> rather like yours except that (1) it's shorter, (2) it's
> >> easier to read because it doesn't involve artificial
> >> diddling with indices (pointers by another name!), and
> >> (3) it's correct. (Try to unify [a,b,c,d] and [b,c,d,a]
> >> with your code, where a,b,c,d are variables.)
> > 
> > (1) Sorry again, I do not have the mentioned book, but I have another
> > book, by George Luger.
> 
> I don't have George Luger's book, but I do have "The Lord of the Rings"
> by J R R Tolkien. Is it OK if I assess what you're saying on the basis
> of that instead of the book you're actually talking about?

Ok. Maybe it is my bad English, because you do not understand me
again. I am really sorry that I do not have this book. I am not trying
to state that George Luger's book is my best choice and that I do not
want to read anything else.

> 
> >                       It contains a big chapter regarding this subject
> > and the unification algorithm in Lisp. It is not any shorter. Besides,
> > it uses a tree traversal to substitute variables.
> 
> So Luger either doesn't know how to do the job efficiently
> or, more likely, doesn't care about doing so because his
> purpose is just to illustrate the principles and not to
> provide an efficient program.
> 
> > paxPascal:
> > 
> > function Unify(N: Integer; T1, T2: Variant): boolean;
> > var
> >   I: Integer;
> >   P1, P2: Variant;
> > begin
> >   result := true;
> >   for I:=N to T1.length - 1 do
> >   begin
> >     P1 := @ T1[I];
> >     P2 := @ T2[I];
> >     if IsCompound(P1) and IsCompound(P2) then result := Unify(0, P1, P2)
> >     else if IsVar(P1) and (not Inside(P1, P2)) then  P1^ := @ P2^
> >     else if IsVar(P2) and (not Inside(P2, P1)) then  P2^ := @ P1^
> >     else result := P1 = P2;
> >     if not result then
> >       Exit;
> >   end;
> > end;
> > 
> > I do not understand why it is not short. Why is it not clear enough?
> 
> It's quite short, even though it would be shorter in Lisp.

I do not agree.

> It's less clear than it could be because (for reasons that
> presumably relate to the way this function is used in the
> rest of the program) you have this parameter N to let
> unification commence somewhere other than at the start
> of the two terms. One advantage of Lisp-style lists is
> that you can work directly with a sublist that starts in
> the middle of another list and goes on to the end, without
> needing to pass the original list and an index into it.

Sorry, I do not see a big tragedy here. Honestly, I could avoid using N.
But if I need to unify

(plus, P, A, B) and (plus, P, U, V)

and I know that these predicates have the same functor, I call Unify(2,
t1, t2).

> 
> > (2) Term is a data structure (tree) represented as polymorhic array. I
> > do not understand why indices of array are:
> > 
> > a) "pointers".
> > b) "artificial diddling".
> 
> (a) There isn't really much difference between an array index
> and a pointer pointing into the middle of the array.

You do not want to accept that arrays and lists can be considered on
the same level of abstraction.

> 
> (b) The fact that whatever you're doing with unification
> requires your "unify" function to have an interface not
> in terms of "unify these two terms" but of "unify certain
> subsequences of these two terms, starting at the Nth element
> of each" looks to me pretty artificial.

Oh, please, please stop it :-)

> 
> > (3). Sorry, I did not understand it. If a, b, c, d are variables, you
> > have to rename terms [a,b,c,d] and [b,c,d,a] before unification. What
> > is wrong?
> 
> Nothing, if [a,b,c,d] and [b,c,d,a] were handed in at the top level,
> so to speak. But suppose they got that way in the course of doing
> a larger unification?

And?

> 
> > It would be nice to see your linear algorithm in Common Lisp. The
> > algorithm in itself does not big matter for me, but I'm interested to
> > see expressive features of the language..
> 
> It isn't mine. Someone else has already pointed you at the code
> from Norvig's book.

No problem.

A.
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87u0ype3y5.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

[initially quoting me:]
>> 1. Common Lisp has an amazingly cool feature: *optional*
>>    type declarations. When you have a function whose
>>    argument and/or return types are known, and you
>>    need that function to execute quickly, you can tell
>>    the compiler those types and it can make use of that
>>    information.
> 
> Ok, it is a solution. (Although "amazingly cool" is emotional because
> the "optional" type declaration use many another languages :-).

I only know a dozen or so languages well enough to comment
on them; I haven't encountered any, other than CL, that
have optional type declarations. Could you give me, say,
three examples of other languages that have the same feature?

(I see from later in your article that VB.NET has something
of the kind.)
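
To be concrete, this is the kind of thing I mean -- a minimal sketch
(the function names are invented for illustration; the declarations
are real CL):

    ;; Untyped: works on any numbers.
    (defun sum-of-square (x y)
      (+ (* x x) y))

    ;; The same function with optional declarations added, so the
    ;; compiler can open-code fixnum arithmetic.
    (defun fast-sum-of-square (x y)
      (declare (fixnum x y)
               (optimize (speed 3) (safety 0)))
      (the fixnum (+ (the fixnum (* x x)) y)))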

>>>>>                                                  As for the type
>>>>> inference, any Pascal compiler makes it at compile-stage.
>>>> 
>>>> If you believe that, then you do not understand what I mean
>>>> by "type inference". Type inference is where the compiler
>>>> works out the types of variables, the signatures of functions,
>>>> and so on, *without* you having to tell it explicitly. Many
>>>> Common Lisp implementations do a certain amount of type
>>>> inference, so as to optimize better. There are also statically
>>>> typed languages in which scarcely any explicit type declarations
>>>> are needed, because they have powerful type inference systems.
>>>> Haskell and ML are good examples.
>>> 
>>> Yes, I know. I'm using a sort of the type inference at run-time in
>>> paxScript.
>> 
>> Type inference is not something that happens at run time.
> 
> :-)
> 
> I understand the "standard" treatment of the concept. But I also
> suppose that the "type inference" is the inference of type word for
> word. paxScript program is able to change himself at run-time and
> replace "general" operators by "detailed" operators.

That's neat. Let me point you at Python again. There's a
thing called Psyco, short for "Python specializing compiler",
that does something similar. It might have some clever tricks
for you to learn from.

>>>            VBScript and JScript use the type inference at
>>> compile-time. But please note that VBScript has been replaced with
>>> VB.NET, JScript is replaced with JScript.NET. Both languages support
>>> strict data typing and the support of strict data typing vs type
>>> inference states more preferable in MSDN.
>> 
>> "Strict data typing" and "type inference" are not opposed.
>> Not even "strict static data typing" (which is what I think
>> you meant) and "type inference" are opposed.
> 
> I never stated contrary.

No; you merely said something that doesn't make the slightest
sense unless the two are opposed.

>                          In VB.NET these concepts coexists. But
> presence of explicit declaration can help in a case where the type
> inference is not applicable.

Mhm. And you're saying MS have issued a blanket recommendation
that you should declare all variables explicitly? Well, they're
entitled to do so. Either their advice is (in my opinion) bad
advice, or VB.NET doesn't handle dynamic typing as well as CL
does. I don't know enough about VB.NET to tell which of those
is the case.

>>>                                           I'm not going to agitate for
>>> Microsoft solutions, however I consider it as a reason for the serious
>>> consideration. I like some JavaScript ideas, btw I guess that
>>> JavaScript authors were under influence of Lisp: data abstraction,
>>> functions as data objects, Eval function. But I never think to use
>>> JavaScript for serious programming.
>> 
>> I'm not sure whether this is meant to be some sort of side-swipe
>> at Lisp (JavaScript has some Lispy features; JavaScript is not
>> appropriate for serious programming; therefore Lisp isn't either).
>> If it is, the implied logic is badly broken. If it isn't, then
>> I've no idea what point you're making by bringing in JavaScript
>> here.
> 
> Again you have not understood me quite correctly. I am not trying to say
> that Lisp is a bad language. I am just saying that I have a choice, and
> I prefer strictly typed languages.

Feel free.

>>> paxPascal:
>>> 
>>> function Unify(N: Integer; T1, T2: Variant): boolean;
>>> var
>>>   I: Integer;
>>>   P1, P2: Variant;
>>> begin
>>>   result := true;
>>>   for I:=N to T1.length - 1 do
>>>   begin
>>>     P1 := @ T1[I];
>>>     P2 := @ T2[I];
>>>     if IsCompound(P1) and IsCompound(P2) then result := Unify(0, P1, P2)
>>>     else if IsVar(P1) and (not Inside(P1, P2)) then  P1^ := @ P2^
>>>     else if IsVar(P2) and (not Inside(P2, P1)) then  P2^ := @ P1^
>>>     else result := P1 = P2;
>>>     if not result then
>>>       Exit;
>>>   end;
>>> end;
>>> 
>>> I do not understand why it is not short? Why it is not enough clear?
>> 
>> It's quite short, even though it would be shorter in Lisp.
> 
> I do not agree.

I'm sure you agree that it's quite short, having said as much
yourself. So presumably you mean that it is impossible to
write a shorter unification function in Lisp. Since by your
own admission you don't know Lisp well enough to be sure
whether it uses floating-point for all calculations, I'd
love to know on what basis you say it couldn't be shorter
in Lisp.

Your "Unify" function above is 18 lines long. Peter Norvig's
book gives a similar function that's only 8 lines long (plus
one line of "documentation string" -- like a comment, but
better). It needs a little supporting code -- as your function
does too, of course. That comes to a total of 11 lines. So
unless you have definitions of IsCompound, IsVar and Inside
that take a total of 1 line or fewer, the Lisp code is
shorter.
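
For concreteness, here is roughly the shape of such a function -- a
sketch in the spirit of the book's version, not a verbatim copy, and
(like the simplest textbook versions) without an occurs-check:

    (defparameter +fail+ 'fail)

    (defun variable-p (x)
      "A variable is a symbol whose name starts with ?, e.g. ?A."
      (and (symbolp x) (plusp (length (symbol-name x)))
           (char= (char (symbol-name x) 0) #\?)))

    (defun unify (x y &optional bindings)
      "Return an alist extending BINDINGS that makes X and Y equal, or +FAIL+."
      (cond ((eq bindings +fail+) +fail+)
            ((eql x y) bindings)
            ((variable-p x) (unify-variable x y bindings))
            ((variable-p y) (unify-variable y x bindings))
            ((and (consp x) (consp y))
             (unify (rest x) (rest y)
                    (unify (first x) (first y) bindings)))
            (t +fail+)))

    (defun unify-variable (var x bindings)
      "Unify VAR with X, consulting any existing binding first."
      (let ((binding (assoc var bindings)))
        (cond (binding (unify (cdr binding) x bindings))
              ((and (variable-p x) (assoc x bindings))
               (unify var (cdr (assoc x bindings)) bindings))
              (t (acons var x bindings)))))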

>> It's less clear than it could be because (for reasons that
>> presumably relate to the way this function is used in the
>> rest of the program) you have this parameter N to let
>> unification commence somewhere other than at the start
>> of the two terms. One advantage of Lisp-style lists is
>> that you can work directly with a sublist that starts in
>> the middle of another list and goes on to the end, without
>> needing to pass the original list and an index into it.
> 
> Sorry, I do not see a big tragedy here. Honestly, I could avoid using N.
> But if I need to unify
> 
> (plus, P, A, B) and (plus, P, U, V)
> 
> and I know that these predicates have the same functor, I call
> Unify(2, t1, t2).

I never claimed there was a big tragedy. If you represented
your expressions as Lisp-style lists you wouldn't need to
have N in the interface of Unify to do this; you could just
pass the cddr of each expression in.
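
A sketch of what I mean (*t1* and *t2* are hypothetical variables, and
UNIFY is as in the sketch above):

    (defparameter *t1* '(plus ?p ?a ?b))
    (defparameter *t2* '(plus ?p ?u ?v))
    ;; Unify just the argument lists: CDDR returns the tail of the
    ;; list, sharing structure with the original -- no index parameter
    ;; and no copying.
    (unify (cddr *t1*) (cddr *t2*))   ; => ((?B . ?V) (?A . ?U))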

>>> (2) Term is a data structure (tree) represented as polymorhic array. I
>>> do not understand why indices of array are:
>>> 
>>> a) "pointers".
>>> b) "artificial diddling".
>> 
>> (a) There isn't really much difference between an array index
>> and a pointer pointing into the middle of the array.
> 
> You do not want to accept that arrays and lists can be considered on
> the same level of abstraction.

If you're reading my mind now, then perhaps I needn't have bothered
writing any of this. If not, then kindly desist from telling me what
I don't want to accept, especially when (as here) you are incorrect.

>>> (3). Sorry, I did not understand it. If a, b, c, d are variables, you
>>> have to rename terms [a,b,c,d] and [b,c,d,a] before unification. What
>>> is wrong?
>> 
>> Nothing, if [a,b,c,d] and [b,c,d,a] were handed in at the top level,
>> so to speak. But suppose they got that way in the course of doing
>> a larger unification?
> 
> And?

Then Unify might get called with arguments [a,b,c,d] and [b,c,d,a]
despite the fact that you rename variables before unification. And
then it will not unify [a,b,c,d] and [b,c,d,a] correctly.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405092132.30d57b9d@posting.google.com>
Gareth McCaughan <················@pobox.com> wrote in message news:<··············@g.mccaughan.ntlworld.com>...
> Alexander Baranovsky wrote:
> 
> [initially quoting me:]
> >> 1. Common Lisp has an amazingly cool feature: *optional*
> >>    type declarations. When you have a function whose
> >>    argument and/or return types are known, and you
> >>    need that function to execute quickly, you can tell
> >>    the compiler those types and it can make use of that
> >>    information.
> > 
> > Ok, it is a solution. (Although "amazingly cool" is emotional because
> > the "optional" type declaration use many another languages :-).
> 
> I only know a dozen or so languages well enough to comment
> on them; I haven't encountered any, other than CL, that
> have optional type declarations. Could you give me, say,
> three examples of other languages that have the same feature?

paxPascal, paxBasic, paxC. 

> 
> (I see from later in your article that VB.NET has something
> of the kind.)

JScript.NET as well.

> 
> >>>>>                                                  As for the type
> >>>>> inference, any Pascal compiler makes it at compile-stage.
> >>>> 
> >>>> If you believe that, then you do not understand what I mean
> >>>> by "type inference". Type inference is where the compiler
> >>>> works out the types of variables, the signatures of functions,
> >>>> and so on, *without* you having to tell it explicitly. Many
> >>>> Common Lisp implementations do a certain amount of type
> >>>> inference, so as to optimize better. There are also statically
> >>>> typed languages in which scarcely any explicit type declarations
> >>>> are needed, because they have powerful type inference systems.
> >>>> Haskell and ML are good examples.
> >>> 
> >>> Yes, I know. I'm using a sort of the type inference at run-time in
> >>> paxScript.
> >> 
> >> Type inference is not something that happens at run time.
>  
> > :-)
> > 
> > I understand the "standard" treatment of the concept. But I also
> > suppose that the "type inference" is the inference of type word for
> > word. paxScript program is able to change himself at run-time and
> > replace "general" operators by "detailed" operators.
> 
> That's neat. Let me point you at Python again. There's a
> thing called Psyco, short for "Python specializing compiler",
> that does something similar. It might have some clever tricks
> for you to learn from.

Thanks. 

> 
> >>>            VBScript and JScript use the type inference at
> >>> compile-time. But please note that VBScript has been replaced with
> >>> VB.NET, JScript is replaced with JScript.NET. Both languages support
> >>> strict data typing and the support of strict data typing vs type
> >>> inference states more preferable in MSDN.
> >> 
> >> "Strict data typing" and "type inference" are not opposed.
> >> Not even "strict static data typing" (which is what I think
> >> you meant) and "type inference" are opposed.
> > 
> > I never stated contrary.
> 
> No; you merely said something that doesn't make the slightest
> sense unless the two are opposed.
> 
> >                          In VB.NET these concepts coexists. But
> > presence of explicit declaration can help in a case where the type
> > inference is not applicable.
> 
> Mhm. And you're saying MS have issued a blanket recommendation
> that you should declare all variables explicitly? Well, they're
> entitled to do so. Either their advice is (in my opinion) bad
> advice, or VB.NET doesn't handle dynamic typing as well as CL
> does. I don't know enough about VB.NET to tell which of those
> is the case.

Yes, this is their advice, and I think they are right. Compile-time
type inference does not work in certain cases. Besides, explicit type
declarations lead to more readable code, in my opinion.

> >>> paxPascal:
> >>> 
> >>> function Unify(N: Integer; T1, T2: Variant): boolean;
> >>> var
> >>>   I: Integer;
> >>>   P1, P2: Variant;
> >>> begin
> >>>   result := true;
> >>>   for I:=N to T1.length - 1 do
> >>>   begin
> >>>     P1 := @ T1[I];
> >>>     P2 := @ T2[I];
> >>>     if IsCompound(P1) and IsCompound(P2) then result := Unify(0, P1, P2)
> >>>     else if IsVar(P1) and (not Inside(P1, P2)) then  P1^ := @ P2^
> >>>     else if IsVar(P2) and (not Inside(P2, P1)) then  P2^ := @ P1^
> >>>     else result := P1 = P2;
> >>>     if not result then
> >>>       Exit;
> >>>   end;
> >>> end;
> >>> 
> >>> I do not understand why it is not short? Why it is not enough clear?
> >> 
> >> It's quite short, even though it would be shorter in Lisp.
> > 
> > I do not agree.
> 
> I'm sure you agree that it's quite short, having said as much
> yourself. So presumably you mean that it is impossible to
> write a shorter unification function in Lisp. Since by your
> own admission you don't know Lisp well enough to be sure
> whether it uses floating-point for all calculations, I'd
> love to know on what basis you say it couldn't be shorter
> in Lisp.
> 
> Your "Unify" function above is 18 lines long. Peter Norvig's
> book gives a similar function that's only 8 lines long (plus
> one line of "documentation string" -- like a comment, but
> better). It needs a little supporting code -- as your function
> does too, of course. That comes to a total of 11 lines. So
> unless you have definitions of IsCompound, IsVar and Inside
> that take a total of 1 line or fewer, the Lisp code is
> shorter.

You like to count lines :-) 

I could rewrite Unify as

function Unify(N: Integer; T1, T2: Variant): boolean;
var
  I: Integer;
begin
  result := true;
  for I:=N to T1.length - 1 do
  begin
    if IsCompound(T1[I]) and IsCompound(T2[I]) then result := Unify(0, T1[I], T2[I])
    else if IsVar(T1[I]) and (not Inside(T1[I], T2[I])) then T1[I]^ := @ T2[I]^
    else if IsVar(T2[I]) and (not Inside(T2[I], T1[I])) then T2[I]^ := @ T1[I]^
    else result := T1[I] = T2[I];
    if not result then Exit;
  end;
end;

but, honestly, I prefer the previous notation as the more readable one.

But what you do not take into consideration is that my Unify does not
use a bindings list or an extra unify-variable function. (I already
mentioned this in a previous message in this thread.)

> > You do not want to accept that arrays and lists can be considered on
> > the same level of abstraction.
> 
> If you're reading my mind now, then perhaps I needn't have bothered
> writing any of this. If not, then kindly desist from telling me what
> I don't want to accept, especially when (as here) you are incorrect.

I am not trying to read your mind; I am reading your statements. From
them, I conclude that you consider the array a less abstract data
structure than the list, which is a wrong point of view.

> 
> >>> (3). Sorry, I did not understand it. If a, b, c, d are variables, you
> >>> have to rename terms [a,b,c,d] and [b,c,d,a] before unification. What
> >>> is wrong?
> >> 
> >> Nothing, if [a,b,c,d] and [b,c,d,a] were handed in at the top level,
> >> so to speak. But suppose they got that way in the course of doing
> >> a larger unification?
> > 
> > And?
> 
> Then Unify might get called with arguments [a,b,c,d] and [b,c,d,a]
> despite the fact that you rename variables before unification. And
> then it will not unify [a,b,c,d] and [b,c,d,a] correctly.

I'm sorry, but again I do not understand. Could you please give me an
example of two terms which illustrates the incorrectness of Unify?
Thanks.

A.
From: David Steuber
Subject: Re: LISPPA
Date: 
Message-ID: <87r7tyy0ox.fsf@david-steuber.com>
Gareth McCaughan <················@pobox.com> writes:

> It's true that Lisp programs are usually more memory-hungry
> than programs written in C or Pascal. As against that, they
> usually have considerably fewer memory-related bugs. Memories
> are so big these days that memory use only matters much for
> large programs, and it's not clear that *large* Lisp programs
> are much more memory-hungry than *large* programs in weaker
> languages.

On my Mac, Safari is by far the largest user of memory.  Emacs +
OpenMCL simply do not come close to touching it.  At this moment,
Emacs is consuming about 1/6 as much resident memory as Safari.

> > Any Pascal programmer will testify that the compile-time error
> > checking is huge advantage. Not only type checking but explicit
> > declaration of variables allows to avoid many problems related to the
> > scope, time of life etc. These "trivial" bugs are dangerous.
> 
> "Any Pascal programmer"? The huge majority of Pascal programmers
> are not in a position to have an educated opinion on whether
> static type checking and explicit type declarations make for
> better programs than dynamic type checking and type inference,
> because the huge majority of Pascal programmers have never used
> any language with dynamic type checking and type inference for
> anything other than trivial tasks. (I suspect most of them
> have *never* used one at all.)

Just another data point here.  Not for Pascal, but for C, C++, and
Java.  The latter two are very strong on static type checking.  I have
NEVER had an error from the compiler complaining that I have tried to
use the wrong type argument (Java's checked exceptions excepted).
OTOH, the Lisp compiler (OpenMCL and SBCL) has caught me on several
occasions at COMPILE TIME.  So I don't think the type checking
argument holds water.  It is also the most trivial of things to check
and fix.

> Another reason, I think, is that when writing in C or C++
> there is so much irrelevant nonsense that you have to be
> thinking about that your brain has less attention available
> for thinking about other things, like all the bugs you're
> putting into your code.

My bugs tend to show up at runtime, not compile time.  Again, this is
with C.  I don't see any harm in Lisp moving things to runtime (if it
really does).  Lisp makes it easy for me to test my code on the fly.
This has proven to be a wonderful feature.  I can't even do this level
of interactive checking with scripting languages like Perl.

> "Perhaps zero is enough" wasn't an entirely serious statement,
> but there *are* plenty of people who find fancy graphical IDEs
> less productive than, say, Emacs and the usual collection of
> Unixy stuff. If by "visual programming" you mean the sort of
> thing VB does for you, that's very valuable when you're building
> GUIs but much less useful for everything else one does when
> programming. Yes, a good debugger is very useful. It doesn't
> need to be graphical. Yes, a good editor is very useful. It
> doesn't need to be part of an IDE.

Xcode is a very nice GUI IDE that front-ends GCC and GDB.  It also has
a very nice GUI builder tool.  However, the editor is stone age
primitive next to Emacs.  I feel like I'm in Notepad with syntax
highlighting.  Once I've got Lisp down, I'm hoping to leverage
Interface Builder for making NIB files and use OpenMCL in Emacs for
the heavy lifting code.  If threads + a Cocoa bridge could be made to
work on ppc-darwin for SBCL, that would be even better.  Not that
there is anything wrong with OpenMCL.  Far from it.  I would just
prefer to have a single Lisp environment to work with.  I'm getting to
really like Emacs+SLIME+Lisp.  As things are, I probably can deal with
bouncing back and forth between SBCL on Linux and OpenMCL on OS X.

I think the less advanced look of Emacs is totally cosmetic and wholly
deceptive.  Both Emacs and Vi are far more advanced editing tools than
a typical GUI IDE like Dev Studio or Xcode seem to provide.  The only
real advantage of the latter is that they have built in knowledge of
MFC/Cocoa.

> I've seen people be very productive indeed in Emacs. I've
> seen prople be very productive in Visual Studio, too.

Each has its strengths.

> The same would be just as true in Common Lisp, except that
> Common Lisp has powerful features for syntactic abstraction --
> macros and a configurable reader -- so that you can write
> logic programs in Common Lisp that are nearly as clear and
> concise as they would be in Prolog. You can find some nice
> examples in Norvig's excellent book "Paradigms of artificial
> intelligence programming".

"On Lisp" also includes a Prolog implementation in CL.

-- 
I wouldn't mind the rat race so much if it wasn't for all the damn cats.
From: David Steuber
Subject: Re: LISPPA
Date: 
Message-ID: <87n04o1ah7.fsf@david-steuber.com>
Gareth McCaughan <················@pobox.com> writes:

> Ah yes, money-oriented programming. The new paradigm.

Is this something we can say that Microsoft actually invented?  Or did
they just help to make it popular?

"Microsoft Money.  Not so much an application as a manifesto."

-- 
I wouldn't mind the rat race so much if it wasn't for all the damn cats.
From: Rahul Jain
Subject: Re: LISPPA
Date: 
Message-ID: <87d65ddpsa.fsf@nyct.net>
··@cable.netlux.org (Alexander Baranovsky) writes:

> My remark has been related to the word "includes". The Lisps cannot be
> a match for such imperative languages as C, Basic or Pascal in view of
> effectivity of programming, compile-time type checking, clean
> algol-liked syntax and readability, huge number of libraries, big
> number of IDEs, huge number of applications, huge amount of
> investments and huge number of users.

Ok, worse. You're now a complete moron, as you insist on believing your
fantasy and convincing me that it has any relation to the world that
exists outside of your head.

...


:)

-- 
Rahul Jain
·····@nyct.net
Professional Software Developer, Amateur Quantum Mechanicist
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405092352.13d8f862@posting.google.com>
Rahul Jain <·····@nyct.net> wrote in message news:<··············@nyct.net>...
> ··@cable.netlux.org (Alexander Baranovsky) writes:
> 
> > My remark has been related to the word "includes". The Lisps cannot be
> > a match for such imperative languages as C, Basic or Pascal in view of
> > effectivity of programming, compile-time type checking, clean
> > algol-liked syntax and readability, huge number of libraries, big
> > number of IDEs, huge number of applications, huge amount of
> > investments and huge number of users.
> 
> Ok, worse. You're now a complete moron, as you insist on believing your
> fantasy and convincing me that it has any relation to the world that
> exists outside of your head.
> 

"Tis dangerous when the baser nature comes 
Between the pass and fell incensed points
Of mighty opposites."

(Ham., Act 5, Scene 2).

A.
From: Raymond Wiker
Subject: Re: LISPPA
Date: 
Message-ID: <86zn8gpv1f.fsf@raw.grenland.fast.no>
··@cable.netlux.org (Alexander Baranovsky) writes:

> Rahul Jain <·····@nyct.net> wrote in message news:<··············@nyct.net>...
>> ··@cable.netlux.org (Alexander Baranovsky) writes:
>> 
>> > My remark has been related to the word "includes". The Lisps cannot be
>> > a match for such imperative languages as C, Basic or Pascal in view of
>> > effectivity of programming, compile-time type checking, clean
>> > algol-liked syntax and readability, huge number of libraries, big
>> > number of IDEs, huge number of applications, huge amount of
>> > investments and huge number of users.
>> 
>> Ok, worse. You're now a complete moron, as you insist on believing your
>> fantasy and convincing me that it has any relation to the world that
>> exists outside of your head.
>> 
>
> "Tis dangerous when the baser nature comes 
> Between the pass and fell incensed points
> Of mighty opposites."

        Since it appears that you know Shakespeare better than you
know Lisp, maybe you should go to rec.arts.* instead?

-- 
Raymond Wiker                        Mail:  ·············@fast.no
Senior Software Engineer             Web:   http://www.fast.no/
Fast Search & Transfer ASA           Phone: +47 23 01 11 60
P.O. Box 1677 Vika                   Fax:   +47 35 54 87 99
NO-0120 Oslo, NORWAY                 Mob:   +47 48 01 11 60

Try FAST Search: http://alltheweb.com/
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0405102044.31dd1c94@posting.google.com>
Raymond Wiker <·············@fast.no> wrote in message news:<··············@raw.grenland.fast.no>...
>         Since it appears that you know Shakespeare better than you
> know Lisp, maybe you should go to rec.arts.* instead?

Perhaps 

comp.lang.general.no-fans 

would be preferable, indeed.

A.
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <8765bw46m4.fsf@g.mccaughan.ntlworld.com>
··@cable.netlux.org (Alexander Baranovsky) writes:

> LISPPA (List Processing based on the Polymorphic Arrays) technology
> provides a way to process dynamic data structures (lists, trees and
> more) without using pointers. LISPPA uses polymorphic arrays as a base
> of  the data representation.
...
> http://www.virtlabs.com.ua/paxscript/lisppa.htm

I'm sure it's very nice, but the only thing it has in common
with Lisp is (1) the first four letters of its name and
(2) the fact that it provides a way of working with sequences
of arbitrary objects in a programming language. So I'm not
sure why you've posted it to comp.lang.lisp.

Now, you've said

  | I know only one answer: the imperative approach and such languages as
  | Pascal, Basic C or Java are more attractive for majority of
  | programmers.
  | 
  | You will answer me: "The success is not a good reason. There are a lot
  | data domains where the imperative approach is not applicable". My
  | answer is: LISPPA.

so perhaps the answer is that you think your LISPPA thing
is going to make Lisp obsolete by providing ways to solve
the problems Lisp is good at, while remaining palatable to
the "majority of programmers".

But there are already languages that do what your LISPPA
does. Have a look at Python, for instance; it has "polymorphic
arrays" (which it calls lists, but never mind) among its
data structures, and it doesn't require you to keep writing
"AddressOf(...)" whenever you want to do anything non-trivial
with them. -- And yet, despite the existence of Python, Lisp
programmers don't seem to have given up. Why might that be?
Answer: the advantages of Lisp are *not* all about the "data
domains" it can handle, but about the facilities it provides
for writing powerful software efficiently and concisely.
LISPPA, so far as I can see, offers none of this.

Random example: your symbolic differentiation program has
a function called "InOrder" that produces an infix
representation of an expression. It's 30 lines long.
In Lisp it would say [danger: untested code!]

    (defun prefix->infix (x &optional (lparen "(") (rparen ")"))
      (cond
        ((atom x) (write-to-string x))
        ((= (length x) 2)
         (case (car x)
           ((-)   (format nil "~A-~A~A" lparen (prefix->infix (second x)) rparen))
           ((log) (format nil "log(~A)" (prefix->infix (second x) "" "")))))
        (t (format nil "~A~A~A~A~A" lparen
                                    (prefix->infix (second x))
                                    (first x)
                                    (prefix->infix (third x))
                                    rparen))))
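
(If it behaves as intended -- untested, remember -- then, for example,
(prefix->infix '(+ a (* b c))) should return "(A+(B*C))".)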

That's 40% the length of yours. (Actually, in Lisp we'd
take advantage of the fact that this stuff is so much less
painful and write it differently so that it didn't have
to do so much copying of strings.) Your "InOrder" function
has some code in it that's meant to turn "A+-B" into "A-B",
but it doesn't actually work (there's no way what you've
called RR can ever begin with "-") so I haven't felt it
necessary to copy it.

Another random example: all those functions like IsPower, which
(1) occupy 6 lines each in your code and would be 2 lines each
in Lisp and (2) wouldn't actually be needed in a Lisp implementation,
for reasons related to the fact that Lisp isn't stuck with purely
"procedural" code. You talked about "data domains"; procedural
programming is unsuitable for this application because of the
nature of the *algorithms*, not the *data*.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404191104.375476cb@posting.google.com>
First of all, thank you for your answer, as it turns the discussion in
a serious direction, at least.

> I'm sure it's very nice, but the only thing it has in common
> with Lisp is (1) the first four letters of its name and
> (2) the fact that it provides a way of working with sequences
> of arbitrary objects in a programming language. So I'm not
> sure why you've posted it to comp.lang.lisp.

(1) Both Lisp and LISPPA are related to list processing. LISPPA uses a
"classic" data structure of imperative programming, namely the array,
for the representation of linked lists. Both directions try to
"reconcile" the functional and imperative approaches to programming:
Pure functional Lisp introduces "imperative" concepts via side effects,
while LISPPA tries to provide the uniform and concise data
representation and algorithms for recursive data structures that are
peculiar to the functional approach. That explains the name.

(2) I hoped that a comparison of the two approaches could be interesting.

> so perhaps the answer is that you think your LISPPA thing
> is going to make Lisp obsolete by providing ways to solve
> the problems Lisp is good at, while remaining palatable to
> the "majority of programmers".

No, I did not think that. However, I believe the imperative languages
I mentioned are more in demand at present; you cannot fail to see it.
At the same time, they have many drawbacks. I have tried to remove one
such defect.

> But there are already languages that do what your LISPPA
> does. Have a look at Python, for instance; it has "polymorphic
> arrays" 

This is the second mention of Python (the first I received in a
private email). Python is not the only language with polymorphic
arrays: JavaScript, PHP and many other languages have them. In
particular, there are a lot of languages which support Variant types
and COM technology. I never stated that the polymorphic array is a new
concept. But, it seems, using arrays for the representation and
processing of linked lists is quite a new idea. Indeed, in terms of
arrays we can give a simple recursive definition of lists and binary
trees. For example:

1. NULL (an undefined variable) is the empty list.
2. If L is a list, a one-dimensional array A with two elements A(0)
and A(1) = L is a list too.

We cannot produce such a concise definition of the linked list using
the standard approach based on pointers. This is the starting point;
from it, LISPPA offers uniform ways to process lists and other dynamic
data structures.
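
For illustration: in this representation the list (1, 2, 3) becomes
the nested array [1, [2, [3, NULL]]], where each two-element array
holds one item and the rest of the list.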

Please note that the presence of polymorphic arrays in an imperative
programming language does not by itself provide effective programming.
A few extra conditions are required, and LISPPA uncovers a minimal set
of such conditions. (In this connection, you can take a look at the
discussion about LISPPA in comp.lang.php.)

> (which it calls lists, but never mind) among its
> data structures, and it doesn't require you to keep writing
> "AddressOf(...)" whenever you want to do anything non-trivial
> with them. --

Oh, I do not like AddressOf either :-) This notation has been borrowed
from VB.NET (where AddressOf denotes a delegate of a function). I'm
using AddressOf in paxBasic to denote a delegate of a variable (an
alias).

> And yet, despite the existence of Python, Lisp
> programmers don't seem to have given up. Why might that be?
> Answer: the advantages of Lisp are *not* all about the "data
> domains" it can handle, but about the facilities it provides
> for writing powerful software efficiently and concisely.

I think the real advantages of LISPPA will be discovered after its
implementation in a compiler such as Visual Basic or Delphi. (It is
quite easy to extend the well-known Variant types to take the LISPPA
requirements into consideration.) The benefits of that are obvious, I
hope.

> LISPPA, so far as I can see, offers none of this.

Sorry, I cannot agree with it. See, for example:

http://www.virtlabs.com.ua/paxscript/demo_basic_lists.htm
http://www.virtlabs.com.ua/paxscript/demo_basic_trees.htm

http://www.virtlabs.com.ua/paxscript/demo_pascal_lists.htm
http://www.virtlabs.com.ua/paxscript/demo_pascal_trees.htm

http://www.virtlabs.com.ua/paxscript/demo_c_lists.htm
http://www.virtlabs.com.ua/paxscript/demo_c_trees.htm

http://www.virtlabs.com.ua/paxscript/demo_js_lists.htm
http://www.virtlabs.com.ua/paxscript/demo_js_trees.htm

> Random example: your symbolic differentiation program has
> a function called "InOrder" that produces an infix
> representation of an expression. It's 30 lines long.
> In Lisp it would say [danger: untested code!] ............

The function is written badly, I can accept that. But it is not a
typical example. I think a more interesting example is the Compress
procedure, which allows you to compress a sub-tree of a tree without
having to relocate the whole tree in each act of compression. The body
of the procedure is not short, but it is quite uniform and clear to
understand. I do not think that you would use a smaller number of
lines to express the same thing in Lisp.

Besides, you might take a look at the source code of the theorem
proving program in paxBasic

http://www.virtlabs.com.ua/paxscript/demo_basic_theorem.htm

or in paxPascal

http://www.virtlabs.com.ua/paxscript/demo_pascal_theorem.htm

and compare this code with the source code presented in the book by
Chang C.L. and Lee R.C., "Symbolic Logic and Mechanical Theorem
Proving". Then you can conclude which of the codes is easier to
understand.

I am not discussing the representation of algorithms of linear algebra :-)

Finally, as the ancients said: "If two men do the same thing, it is no
longer the same thing". I think that alternative approaches are always
useful to consider. No?

A.

From: William Bland
Subject: Re: LISPPA
Date: 
Message-ID: <pan.2004.04.19.23.19.42.569602@abstractnonsense.com>
On Mon, 19 Apr 2004 12:04:06 -0700, Alexander Baranovsky wrote:

> I never stated that the polymorphic array is a new concept.  But, it
> seems, using arrays for the representation and processing of linked
> lists is quite a new idea.

Sorry, no.  When I was still using Basic (because I couldn't find a freely
available Lisp), I experimented with simulating linked lists using arrays.
Granted, they could only hold strings or numbers, but full polymorphism is
hardly a huge leap from that.  When I finally found a Lisp (it was
PC-LISP), I was overjoyed not to have to muck around with these
pseudo-lists any more.  I would have been around 11 or 12 years old at the
time (so this would have been in the mid 1980s).  I would claim prior art,
but i don't think the idea is sufficiently interesting to claim anything
for.  Sorry.

Best wishes,
		Bill.
-- 
Dr. William Bland                          www.abstractnonsense.com
Computer Programmer                           awksedgrep (Yahoo IM)
Any sufficiently advanced Emacs user is indistinguishable from magic
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404200444.52f01b02@posting.google.com>
Oh, I could have guessed. The idea is too obvious to have visited my
head exclusively :-) My first implementation dates back to 1997, in the
VIRT language. But my big false step was treating assignment of arrays
as actual copying, not copying of references. That's why VIRT used the
ugly notation

L << [NewItem, << L]

instead of

L := [NewItem, L]

to insert a new item efficiently (without creating a copy of L and a
new chunk of garbage).

VIRT was unsuccessful :-)

A.

From: Mario S. Mommer
Subject: Re: LISPPA
Date: 
Message-ID: <fzzn967lvp.fsf@germany.igpm.rwth-aachen.de>
··@cable.netlux.org (Alexander Baranovsky) writes:
> Oh, I could have guessed. The idea is too obvious to have visited my
> head exclusively :-) My first implementation dates back to 1997, in the
> VIRT language. But my big false step was treating assignment of arrays
> as actual copying, not copying of references. That's why VIRT used the
> ugly notation
>
> L << [NewItem, << L]
>
> instead of
>
> L := [NewItem, L]

Sorry, but I find both notations ugly, almost grotesque.

Perhaps your idea is indeed worth something, I'm not going to rule
that out completely. But c.l.l. is very probably not the right forum
for it.
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404200959.289852d0@posting.google.com>
> Sorry, but I find both notations ugly, almost grotesque.

Your opinion sounds too categorical. Are you really sure that truth can
be perceived exclusively through round brackets? :-)

A.

From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87r7uj2zgq.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

> First of all, thank you for your answer, as it turns the discussion in
> a serious direction, at least.

You're welcome.

> > I'm sure it's very nice, but the only thing it has in common
> > with Lisp is (1) the first four letters of its name and
> > (2) the fact that it provides a way of working with sequences
> > of arbitrary objects in a programming language. So I'm not
> > sure why you've posted it to comp.lang.lisp.
> 
> (1) Both Lisp and LISPPA are related to list processing. LISPPA uses a
> "classic" data structure of imperative programming, namely the array,
> for the representation of linked lists. Both directions try to
> "reconcile" the functional and imperative approaches to programming:
> Pure functional Lisp introduces "imperative" concepts via side effects,
> while LISPPA tries to provide the uniform and concise data
> representation and algorithms for recursive data structures that are
> peculiar to the functional approach. That explains the name.

I'm puzzled by the statement that "Pure functional Lisp
introduces imperative concepts via side effects"; once
there are side effects the language isn't "pure functional",
so I suppose what you mean is that Lisp starts with a
"pure functional" core and adds imperative side-effect-ful
bits on. But Lisp hasn't been "pure functional" for a
long, long, long time, and Lisp as it is now is certainly
*not* accurately described as a functional language with
imperative extensions. (You could probably describe O'Caml,
for instance, that way.)

In short: Lisp is not a functional language with some
imperative bits; it is a multi-paradigm language that
supports both functional and imperative styles of
programming.

> (2) I hoped that a comparison of the two approaches could be interesting.

OK.

> > so perhaps the answer is that you think your LISPPA thing
> > is going to make Lisp obsolete by providing ways to solve
> > the problems Lisp is good at, while remaining palatable to
> > the "majority of programmers".
> 
> No, I did not think that. However, I believe the imperative languages
> I mentioned are more in demand at present; you cannot fail to see it.
> At the same time, they have many drawbacks. I have tried to remove one
> such defect.

I think the defect you've tried to remove isn't anything to do
with the functional/imperative distinction. There's no reason
at all why a functional language has to have lists[1] of arbitrary-typed
elements (Haskell doesn't) and no reason at all why a language with
lists of arbitrary-typed elements has to be functional (Python isn't).
Now, obviously you know this, since your languages are themselves
non-functional languages with lists of arbitrary-typed elements, but
you do seem to be suggesting that somehow your languages with LISPPA
are nearer to being "functional" than they would be without. I don't
think they are. The two issues are completely separate.

    [1] Or vectors, or whatever.

> > But there are already languages that do what your LISPPA
> > does. Have a look at Python, for instance; it has "polymorphic
> > arrays" 
> 
> This is the second mention of Python (the first I received in a
> private email). Python is not the only language with polymorphic
> arrays: JavaScript, PHP and many other languages have them. In
> particular, there are a lot of languages which support Variant types
> and COM technology.

Right: tons of them. I mentioned Python rather than the others
because I think it's the nicest language of its kind, and if you're
only going to look at one of them then Python is a good choice.
Of course, opinions on this matter vary :-).

> I never stated that the polymorphic array is a new concept. But, it
> seems, using arrays for the representation and processing of linked
> lists is quite a new idea.

Take a look at Peter Norvig's "Python for Lisp programmers"
page at http://www.norvig.com/python-lisp.html ; he mentions
an implementation of linked lists using Python tuples in his
feature-comparison table. That's quite recent too, of course.
Mostly, it hasn't been done because if you want linked lists
then using a general-purpose polymorphic array type to build
them is inefficient. At least, it usually is; perhaps your
languages have a special implementation of polymorphic arrays
that makes small p.a.'s particularly space-efficient.

> Indeed, in terms of arrays we can give a simple recursive definition
> of lists and binary trees. For example:
> 
> 1. NULL (an undefined variable) is the empty list.
> 2. If L is a list, a one-dimensional array A with two elements A(0)
> and A(1) = L is a list too.
> 
> We cannot produce such a concise definition of the linked list using
> the standard approach based on pointers. This is the starting point;
> from it, LISPPA offers uniform ways to process lists and other dynamic
> data structures.

I don't see why we can't produce a similarly concise definition
using pointers.

  1 The null pointer represents the empty list.
  2 If L is a list and X any object, then a block
    of memory containing two pointers, one of which
    points to X and the other of which points to L,
    is a list.
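
Or, as a sketch in record form (the names are invented for
illustration):

    ;; The empty list is NIL; anything else is a two-pointer node.
    (defstruct node
      item   ; points to X
      rest)  ; points to another NODE, or NIL for the empty list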

By the way, let me advise you not to use the same thing to mean
both "undefined variable" and "empty list".

> Please note that the presence of polymorphic arrays in an imperative
> programming language does not by itself provide effective programming.
> A few extra conditions are required, and LISPPA uncovers a minimal set
> of such conditions. (In this connection, you can take a look at the
> discussion about LISPPA in comp.lang.php.)

I don't know much about PHP. Isn't it rather similar to Perl?
Because Perl has somewhat adequate polymorphic arrays. (You have
to take references everywhere, otherwise horrible splicing things
happen when you put a list inside another list, but that's all.)

> > And yet, despite the existence of Python, Lisp
> > programmers don't seem to have given up. Why might that be?
> > Answer: the advantages of Lisp are *not* all about the "data
> > domains" it can handle, but about the facilities it provides
> > for writing powerful software efficiently and concisely.
> 
> I think the real advantages of LISPPA will be discovered after its
> implementation in a compiler such as Visual Basic or Delphi. (It is
> quite easy to extend the well-known Variant types to take the LISPPA
> requirements into consideration.) The benefits of that are obvious, I
> hope.
> 
> > LISPPA, so far as I can see, offers none of this.
> 
> Sorry, I cannot agree with it. See, for example:
> 
> http://www.virtlabs.com.ua/paxscript/demo_basic_lists.htm
> http://www.virtlabs.com.ua/paxscript/demo_basic_trees.htm
> 
> http://www.virtlabs.com.ua/paxscript/demo_pascal_lists.htm
> http://www.virtlabs.com.ua/paxscript/demo_pascal_trees.htm
> 
> http://www.virtlabs.com.ua/paxscript/demo_c_lists.htm
> http://www.virtlabs.com.ua/paxscript/demo_c_trees.htm
> 
> http://www.virtlabs.com.ua/paxscript/demo_js_lists.htm
> http://www.virtlabs.com.ua/paxscript/demo_js_trees.htm
> 
> > Random example: your symbolic differentiation program has
> > a function called "InOrder" that produces an infix
> > representation of an expression. It's 30 lines long.
> > In Lisp it would say [danger: untested code!] ............
> 
> The function is written badly, I can accept that.

The point is: I'm not sure it *is* written badly. (I mentioned
one little bug, but good code can have bugs.) I think the reason
why the Lisp code is so much shorter (and, to my mind, neater)
is simply that Lisp is a better language than Pascal, even when
you add a handy new variant type to Pascal and provide some
features for building arrays of variants. Certainly adding
variants and nicer arrays is an improvement to Pascal, but
the deficiencies of Pascal in comparison to Lisp go so much
deeper than that.

> But it is not a typical example. I think a more interesting example
> is the Compress procedure, which allows you to compress a sub-tree of
> a tree without having to relocate the whole tree in each act of
> compression. The body of the procedure is not short, but it is quite
> uniform and clear to understand. I do not think that you would use a
> smaller number of lines to express the same thing in Lisp.

Oh, really? :-) [Warning: very untested code ahead.]

    (defun build-matcher (pattern x escape-tag already-seen body)
      (cond
        ((null pattern)
         (values `(if ,x (go ,escape-tag) (progn . ,body)) already-seen))
        ((symbolp pattern)
         (if (member pattern already-seen)
           (values `(if (equal ,x ,pattern) (progn . ,body) (go ,escape-tag)) already-seen)
           (values `(let ((,pattern ,x)) . ,body) (cons pattern already-seen))))
        ((or (atom pattern) (eq (first pattern) 'quote))
         (values `(if (equal ,pattern ,x) (progn . ,body) (go ,escape-tag))
                 already-seen))
        (t
         (let ((x-sym (gensym)) (body-sym (gensym)))
           (multiple-value-bind (car-matcher already-seen)
                                (build-matcher (car pattern) `(car ,x-sym)
                                               escape-tag already-seen body-sym)
             (multiple-value-bind (cdr-matcher already-seen)
                                  (build-matcher (cdr pattern) `(cdr ,x-sym)
                                                 escape-tag already-seen body)
               (values `(let ((,x-sym ,x))
                          (unless (consp ,x-sym) (go ,escape-tag))
                            ,(subst (list cdr-matcher) body-sym car-matcher))
                       already-seen)))))))

    (defmacro matching-bind (pattern x escape-tag &body body)
      (build-matcher pattern x escape-tag nil body))

    (defmacro match-case (x &rest clauses)
      (let ((x-sym (gensym)) (escape-tag (gensym)) (block-name (gensym)))
        `(let ((,x-sym ,x))
           (block ,block-name
             ,@(loop for (pattern . body) in clauses collect
                 `(macrolet ((next-case () '(go ,escape-tag)))
                    (tagbody
                      (return-from ,block-name
                        (matching-bind ,pattern ,x-sym ,escape-tag . ,body))
                      ,escape-tag)))))))
                  
    (defun compress (expr)
      (if (atom expr)
        expr
        (progn
          (setf expr (mapcar #'compress expr))
          (match-case expr
            (('- x)
             (if (is-constant x) (- x) expr))
            (('+ x 0) x)
            (('+ 0 x) x)
            (('+ x x) `(* 2 ,x))
            (('+ x ('- y)) `(- ,x ,y))
            (('+ ('- x) y) `(- ,y ,x))
            (('+ ('* a b) ('* a d)) `(* ,a (+ ,b ,d)))
            (('+ ('* a b) ('* c b)) `(* (+ ,a ,c) ,b))
            (('+ ('* a b) a) `(* ,a (+ ,b 1)))
            (('+ ('* b a) a) `(* ,a (+ ,b 1)))
            (('+ ('+ a b) c)
             (if (is-constant c)
               (cond ((is-constant a) `(+ ,b ,(+ a c)))
                     ((is-constant b) `(+ ,a ,(+ b c)))
                     (t (next-case)))
               (next-case)))
            (('+ a ('+ b c))
             (if (is-constant a)
               (cond ((is-constant b) `(+ ,c ,(+ a b)))
                     ((is-constant c) `(+ ,b ,(+ a c)))
                     (t (next-case)))
               (next-case)))
            (('+ ('- a b) c)
             (if (is-constant c)
               (cond ((is-constant a) `(- ,(+ a c) ,b))
                     ((is-constant b) `(+ ,a ,(- c b)))
                     (t (next-case)))
               (next-case)))
            (('+ a ('- b c))
             (if (is-constant a)
               (cond ((is-constant b) `(- ,(+ a b) ,c))
                     ((is-constant c) `(+ ,b ,(- a c)))
                     (t (next-case)))
               (next-case)))
            (('+ x y)
             (if (and (is-constant x) (is-constant y)) (+ x y) `(+ ,x ,y)))
            ;; ... another 35 lines for - ...
            (('* x 0) 0)
            (('* 0 x) 0)
            (('* x 1) x)
            (('* 1 x) x)
            (('* ('- x) y) `(* -1 (* ,x ,y)))
            (('* x ('- y)) `(* -1 (* ,x ,y)))
            (('* x x) `(expt ,x 2))
            (('* ('expt x a) ('expt x b)) `(expt ,x (+ ,a ,b)))
            (('* x ('expt x a))           `(expt ,x (+ ,a 1)))
            (('* ('expt x a) x)           `(expt ,x (+ ,a 1)))
            (('* ('* x y) ('expt x a))          `(* ,y (expt ,x (+ ,a 1))))
            (('* ('* y x) ('expt x a))          `(* ,y (expt ,x (+ ,a 1))))
            (('* ('* ('expt x a) y) ('expt x b)) `(* ,y (expt ,x (+ ,a ,b))))
            (('* ('* y ('expt x a)) ('expt x b)) `(* ,y (expt ,x (+ ,a ,b))))
            (('* ('expt x a) ('* x y))          `(* ,y (expt ,x (+ ,a 1))))
            (('* ('expt x a) ('* y x))          `(* ,y (expt ,x (+ ,a 1))))
            (('* ('expt x b) ('* ('expt x a) y)) `(* ,y (expt ,x (+ ,a ,b))))
            (('* ('expt x b) ('* y ('expt x a))) `(* ,y (expt ,x (+ ,a ,b))))
            (('* ('* x y) x) `(* ,y (expt ,x 2)))
            (('* ('* y x) x) `(* ,y (expt ,x 2)))
            (('* ('* ('expt x a) y) x) `(* ,y (expt ,x (+ ,a 1))))
            (('* ('* y ('expt x a)) x) `(* ,y (expt ,x (+ ,a 1))))
            (('* x ('* ('expt x a) y)) `(* ,y (expt ,x (+ ,a 1))))
            (('* x ('* y ('expt x a))) `(* ,y (expt ,x (+ ,a 1))))
            (('/ x 1) x)
            (('/ 0 x) 0)
            (('/ x ('expt y a)) `(* ,x (expt ,y (- ,a))))
            (('/ x 0) (error "Division by zero"))
            (('/ x y) `(* ,x (expt ,y -1)))
            (('expt x 1) x)
            (('expt x 0) 1)
            (x x)))))

This will produce code a little less efficient than your Pascal, because
it doesn't take advantage of the opportunities for merging the initial
bits of the matching clauses. It's about 55% of the length of your
Pascal code (and yes, I am including the 35 lines I didn't bother to
write), and the ratio will only become more favourable as the number
of simplifications increases. (I'm not sure what the rationale
is for the particular set you've chosen.) And I think it's much easier
to read.

(Actually, the ratio is more favourable to Lisp than the above
suggests, because I haven't included the code for your functions
IsMult, IsPow, and so on, which aren't needed in the Lisp version.
I have used something called IS-CONSTANT, which I haven't defined,
but you could substitute NUMBERP and it would work fine.)
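
For instance, with that substitution made (hypothetical transcripts,
as untested as the rest):

    (defun is-constant (x) (numberp x))

    (compress '(* (+ x 0) 1))       ; => X
    (compress '(+ (* 2 x) (* 3 x))) ; => (* (+ 2 3) X); a second pass
                                    ;    would give (* 5 X)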

If the efficiency loss matters, it isn't too hard to write a much
smarter MATCH-CASE macro. That might add another 50-100 lines,
bringing the Lisp code to almost the same length as the Pascal --
but still much easier to read and much more concisely extensible.

Oh, and you can use MATCH-CASE for the differentiation too:

    (defun differentiate (expr)
      (match-case expr
        ('x 1)
        (('- x) `(- ,(differentiate x)))
        (('+ x y) `(+ ,(differentiate x) ,(differentiate y)))
        (('- x y) `(- ,(differentiate x) ,(differentiate y)))
        (('* x y) `(+ (* ,x ,(differentiate y)) (* ,y ,(differentiate x))))
        (('/ x y) `(/ (- (* ,y ,(differentiate x)) (* ,x ,(differentiate y)))
                      (expt ,y 2)))
        (('log x) `(/ ,(differentiate x) ,x))
        (('pow x a) `(* (+ (* ,(differentiate a) (log ,x))
                           (* ,a (/ ,(differentiate x) ,x)))
                        (expt ,x ,a)))
        ((x . y) (error "Syntax error"))
        (x 0)))
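
Combined with COMPRESS (equally untested):

    (compress (differentiate '(* x x)))   ; => (* 2 X)
    (compress (differentiate '(+ x 7)))   ; => 1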

> Besides you might take a look at the source code of the theorem
> proving program in paxBasic
...
> or in paxPascal
...
> and compare this code with source code presented in book Chang C.L and
> Lee R.C. "Symbolic Logic and Mechanical Theorem Proving". Then you can
> conclude what of codes is more suitable for understanding.

I don't have that book; sorry.

> Finally, the ancients spoke: "If two men do the same thing, it already
> is not the same thing". I think that alternative approaches are always
> useful for the consideration. Not?

Perhaps. And it may be that your languages with "LISPPA technology"
are useful to people who are (for whatever reason) unable to program
in Lisp or in other languages with polymorphic arrays such as Python.
But here in comp.lang.lisp, that's not your audience :-).

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404200432.3e5b378e@posting.google.com>
> I'm puzzled by the statement that "Pure functional Lisp
> introduces imperative concepts via side effects"; once
> there are side effects the language isn't "pure functional",
> so I suppose what you mean is that Lisp starts with a
> "pure functional" core and adds imperative side-effect-ful
> bits on. But Lisp hasn't been "pure functional" for a
> long, long, long time, and Lisp as it is now is certainly
> *not* accurately described as a functional language with
> imperative extensions. (You could probably describe O'Caml,
> for instance, that way.)
> 
> In short: Lisp is not a functional language with some
> imperative bits; it is a multi-paradigm language that
> supports both functional and imperative styles of
> programming.

The acticle"History of Lisp" by John McCarthy 

http://www-formal.stanford.edu/jmc/history/lisp/node1.html

shows that you are not quite right :-)

> Take a look at Peter Norvig's "Python for Lisp programmers"
> page at http://www.norvig.com/python-lisp.html ; he mentions
> an implementation of linked lists using Python tuples in his
> feature-comparison table.

The article is interesting, thank you. After a first reading I can
breathe a sigh of relief: my approach is independent.

> > 1. NULL (undefined variable) is empty list.
> > 2. If L is a list, the one-dimensional array A with two elements A(0)
> > and A(1) = L is a list too.
> > 
> > We cannot produce such concise definition of the linked list using
> > standard approach based on pointers. This is the start point, further
> > LISPPA offers a uniform ways to process lists and other dynamic data
> > structures.
> 
> I don't see why we can't produce a similarly concise definition
> using pointers.
> 
>   1 The null pointer represents the empty list.
>   2 If L is a list and X any object, then a block
>     of memory containing two pointers, one of which
>     points to X and the other of which points to L,
>     is a list.

But what is the "memory block"? Can you explain it in a few words? :-)
My definition does not know any "memory blocks". Besides, I can define
the polymorphic array without any references on the "memory blocks".

> > Please note, that the presence of polymorphic arrays in an imperative
> > programming language not yet provides
> > effective programming. A few extra conditions are required. LISPPA
> > uncovers a minimal set of such conditions. (In this relation, you can
> > take a look at the discussion about LISPPA in comp.lang.php).

> I don't know much about PHP. Isn't it rather similar to Perl?
> Because Perl has somewhat adequate polymorphic arrays.

Yes, PHP is another language designed for Web programming. My idea
was that an implementation of LISPPA in Perl or PHP could considerably
simplify the writing of server-side AI applications.

> (You have
> to take references everywhere, otherwise horrible splicing things
> happen when you put a list inside another list, but that's all.)

In particular, my research indicates that I can also represent and
process doubly linked lists without any big problems.

> I think the reason
> why the Lisp code is so much shorter (and, to my mind, neater)
> is simply that Lisp is a better language than Pascal, even when
> you add a handy new variant type to Pascal and provide some
> features for building arrays of variants. Certainly adding
> variants and nicer arrays is an improvement to Pascal, but
> the deficiencies of Pascal in comparison to Lisp go so much
> deeper than that.

Oh, in my opinion Pascal is an excellent language even without OOP,
i.e. in its first edition by Niklaus Wirth. Representation of data
structures based on Hoare's theory, a clean Algol-like syntax,
structured programming: I have liked it from the beginning. But
pointers are not a very good thing, and the processing of even simple
recursive data structures with them is not adequate.

> Perhaps. And it may be that your languages with "LISPPA technology"
> are useful to people who are (for whatever reason) unable to program
> in Lisp or in other languages with polymorphic arrays such as Python.
> But here in comp.lang.lisp, that's not your audience :-).

"Recruitment" was not my goal at comp.lang.lisp :-) Besides, I know
the first love never die. You and subscribers of the newsgroup will
like Lisp even after I'll able to prove you that these "improvements"
to Pascal allows to program AI application as well with the same
success or even better than Lisp it does  ;-)

A.

From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040420194101.GK25328@mapcar.org>
On Tue, Apr 20, 2004 at 05:32:08AM -0700, Alexander Baranovsky wrote:
> > In short: Lisp is not a functional language with some
> > imperative bits; it is a multi-paradigm language that
> > supports both functional and imperative styles of
> > programming.
> 
> The acticle"History of Lisp" by John McCarthy 
> 
> http://www-formal.stanford.edu/jmc/history/lisp/node1.html
> 
> shows that you are not quite right :-)

No, it shows that you are not right at all.  Check the date of the
paper: it was written over a decade before Common Lisp was standardized.
Read the paper: in it McCarthy states that he was influenced by Church's
paper BUT HE DID NOT IMPLEMENT IT CORRECTLY.  In fact, the first truly
"functional" Lisp dialect was Scheme.  Early Lisp dialects were
programmed in a very imperative style.

> You and the subscribers of this newsgroup will still like Lisp even
> after I am able to prove to you that these "improvements" to Pascal
> allow AI applications to be programmed as well as Lisp does, or even
> better ;-)

If anything, you've demonstrated a complete lack of understanding of the
philosophy of Lisp and its users.  Perhaps if I say it explicitly you
will understand:

    The power of Lisp does not lie solely with lists.

Lists and symbolic manipulation are nice and flexible, but certainly
aren't unique to Lisp anymore (LISPPA notwithstanding).  Certainly you
can do all these things in your typical functional language (ML,
Haskell) and even in languages like Java and Perl.  Closures and
higher-order functions certainly aren't unique to Lisp.  Lisp does have
the code<->data equivalence, and powerful macros as a result, but surely
that isn't the "killer feature" for AI.  Nor would be the extremely
flexible object system, condition handling and restart system, or the
large library of types, classes, and functions.  There really isn't
anything "AI-specific" in there.  However, there's a lot of tools to
help solve *hard problems*.  AI would be one of those, don't you think?

And finally, even with LISPPA, how many of the above features do [pax]C,
Pascal, or Basic have?  I would say that they are missing all of the
important ones; and it is these ones which are hard to add to a language
which lacks them.  You claim that adding symbolic list manipulation to
an imperative language somehow gives it expressiveness equal to Lisp, but
you are forgetting that Lisp has moved on from LISP1.5 in the past 40
years.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404210006.603fc8cf@posting.google.com>
> > The acticle"History of Lisp" by John McCarthy 
> > 
> > http://www-formal.stanford.edu/jmc/history/lisp/node1.html
> > 
> > shows that you are not quite right :-)
> 
> No, it shows that you are not right at all.  Check the date of the
> paper: it was written over a decade before Common Lisp was standardized.
> Read the paper: in it McCarthy states that he was influenced by Church's
> paper BUT HE DID NOT IMPLEMENT IT CORRECTLY.  In fact, the first truly
> "functional" Lisp dialect was Scheme.  Early Lisp dialects were
> programmed in a very imperative style.

No comments :-)

> And finally, even with LISPPA, how many of the above features do [pax]C,
> Pascal, or Basic have?  I would say that they are missing all of the
> important ones; and it is these ones which are hard to add to a language
> which lacks them.  You claim that adding symbolic list manipulation to
> an imperative language somehow gives it expressiveness equal to Lisp, but
> you are forgetting that Lisp has moved on from LISP1.5 in the past 40
> years.

Imperative programming has also been improved a bit during these
years :-)

As for the "the code<->data equivalence", I'm constantly hearing it.
The concept is great, indeed. But show me a real problem related to
this subject which I could not solve with procedural types in Pascal,
eval function in JavaScript or even the code blocks in Clipper and
lambda-expressions in VIRT.

A. 

From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87n0550wme.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

> As for the "the code<->data equivalence", I'm constantly hearing it.
> The concept is great, indeed. But show me a real problem related to
> this subject which I could not solve with procedural types in Pascal,
> eval function in JavaScript or even the code blocks in Clipper and
> lambda-expressions in VIRT.

No one can show you any real problem which you could not solve
with *machine code*. Or even with Turing machines. The question
is how efficiently (in terms of your time, not only that of the
computer) you can solve problems with any given set of facilities.
Those of us who are familiar with Lisp know that some of its
features (such as its very powerful macro system) make a big
difference to how efficiently some kinds of problem can be
solved.

-- 
Gareth McCaughan
.sig under construc
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <c66s1b$8ngpg$1@uni-berlin.de>
Alexander Baranovsky wrote:

> As for the "the code<->data equivalence", I'm constantly hearing it.
> The concept is great, indeed. But show me a real problem related to
> this subject which I could not solve with procedural types in Pascal,
> eval function in JavaScript or even the code blocks in Clipper and
> lambda-expressions in VIRT.

Haha.

You must be very happy with C and Pascal and Basic. I was, too, with C++.
But then I wanted to write a good code-completion module for a KDE IDE,
and to make it really useful I wanted to write a whole C++ parser -- bad
mistake. You cannot do that painlessly: since in C++ syntax and semantics
are so tightly coupled, it's really hard, especially when it comes to
templates. (Yeah, OK, I could have used bison+flex, but it had to be
incremental and error-tolerant and all that stuff.)

Of course it all was _possible_. But it's not about that. It's about how
much _effort_ it takes.

Due to time problems and other reasons I gave up and wanted to design my
own language with a very easy-to-parse syntax (inspired by Python's). But
before creating just another new language I first looked for a book about
advanced programming language design, and before I finished reading it I
luckily got introduced to Common Lisp.

And besides all the other great features already listed in other posts in
this thread, the code<->data equivalence is a _great_ concept. The great
thing about it is that Lisp code already is the Lisp syntax tree, i.e.
there is no (real) parsing. Just take a look at the "loop" macro: it's a
whole sublanguage. And one big point of Graham's "On Lisp" is that in
Lisp, thanks to Common Lisp's powerful macros (which are that powerful
precisely because of the code<->data equivalence), you effectively create
a problem-specific programming language. And all that without much effort
or any sort of (self-written) lexer or parser.
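
Just as a tiny illustration (my own toy example, nothing from a library):
because the code already is a tree of lists, a macro can rewrite it with
ordinary list operations, no parser anywhere:

(defmacro with-logged-steps (&body forms)
  ;; Each form prints itself (as data) before running (as code).
  `(progn
     ,@(loop for form in forms
             collect `(progn (format t "~&=> ~S~%" ',form) ,form))))

;; (with-logged-steps (+ 1 2) (* 3 4))
;; prints => (+ 1 2) and => (* 3 4), then returns 12.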

Believe me, it is incredible how far above other languages Common Lisp
is; you won't see it unless you take a thorough look at it.

Take an unbiased and thorough look at Common Lisp and disprove my sig. It'll
really be worth it.
BTW, a good book about the power of Lisp macros is "On Lisp" by Paul Graham.
It's freely downloadable.

regards

Thomas
-- 
"But I don't expect to convince anyone (over 25) to go out and learn Lisp."
- Paul Graham.
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404211952.5771d22b@posting.google.com>
Thank you, Thomas.

I am also grateful to all the people who have participated in the discussion.

Regards,

A.

From: David Steuber
Subject: Re: LISPPA
Date: 
Message-ID: <87ekqhj5ga.fsf@david-steuber.com>
Thomas Schilling <······@yahoo.de> writes:

> Take an unbiased and thorough look at Common Lisp and disprove my sig. It'll
> really be worth it.
> BTW, a good book about the power of lisp-macros is "On Lisp" by Paul Graham.
> It's freely downloadable.
> 
> -- 
> "But I don't expect to convince anyone (over 25) to go out and learn Lisp."
> - Paul Graham.

I'm over 25 and I am working on learning Lisp.  I have great
expectations of what I can do with it if I learn to master it.

I got the book "On Lisp" in dead tree form before I knew it was
available for free in electronic form.  Oh well.  I still like dead
tree books better than electronic books.  It is still easier to read
print than off the monitor.

I do know at least one person who thinks Lisp isn't worth programming
in because of "mind share".  Oh well.  Some people just can't be
convinced.

-- 
I wouldn't mind the rat race so much if it wasn't for all the damn cats.
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <c68e7b$96s0l$1@uni-berlin.de>
David Steuber wrote:

> I'm over 25 and I am working on learning Lisp.  

Good :-)

Yes, I already noticed that. I just put it in my sig because that was my
experience. The person I tried to convince was even below 25.

It really is like in Graham's Blub paradox.

I think you need to meet certain preconditions to be infected by Lisp
fever. It can be real trouble with other languages, or just curiosity.
Furthermore, you cannot really appreciate its advantages if you haven't
seen the other (dark ;-) side of the programming world. And what's even
more important is that someone has to tell you good things about it.
While doing that C++ parser stuff I stumbled over some Emacs Lisp code,
but I didn't expect it to be different from any of the other "other
languages".

But then someone proved me wrong ... :)

> I have great
> expectations of what I can do with it if I learn to master it.

Me too. But I already use it. ;)

> I got the book "On Lisp" in dead tree form before I knew it was
> available for free in electronic form.  Oh well.  I still like dead
> tree books better than electronic books.  It is still easier to read
> print than off the monitor.

Indeed.

Well, except if you want to save money and paper and put 4 pages on a single
sheet of A4 paper ...

> I do know at least one person who thinks Lisp isn't worth programming
> in because of "mind share".  Oh well.  Some people just can't be
> convinced.

Huh? How can this be seen as a counter-argument?

-- 
"But I don't expect to convince anyone (over 25) to go out and learn Lisp."
- Paul Graham.
From: David Steuber
Subject: Re: LISPPA
Date: 
Message-ID: <87vfjrzrex.fsf@david-steuber.com>
Thomas Schilling <······@yahoo.de> writes:

> > I do know at least one person who thinks Lisp isn't worth programming
> > in because of "mind share".  Oh well.  Some people just can't be
> > convinced.
> 
> Huh? How can this be seen as a counter-argument?

It's more like a random data point.  This person thinks that Java is
the language to go with because so many people have written libraries
for it like J2EE and such.

I didn't try to come up with a counter argument to that.

My personal experience with Java is that it does not really fix any of
the perceived blemishes of C.  Sure, it adds a garbage collector.  Big
deal.  From my POV, that is just about it.  Having libraries is nice,
but Java's libraries have gotten so extensive (that is the "core"
libraries) that the language is HUGE without really cutting down on
the code I have to write.

Lisp is also huge, but it makes my code smaller.

This is all just anecdotal of course.

-- 
I wouldn't mind the rat race so much if it wasn't for all the damn cats.
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404222231.1d448918@posting.google.com>
> This is all just anecdotal of course.

I think your assessment is even more anecdotal.

> It's more like a random data point.  This person thinks that Java is
> the language to go with because so many people have written libraries
> for it like J2EE and such.

"This person" never stated that one presence of libraries in Java,
Basic or Delphi can compensate power of the Lisp language. Java and
other mentioned languages should be extended with extra features which
will allow to effectively use these languages in the symbolic
computations. I have discussed  these features earlier.

More interesting is another question: is a garbage collector really
necessary for processing dynamic data structures? LISPPA shows that it
is not; we can avoid it.

I already mentioned the problem of simplification of mathematical
expressions (the Compress procedure in the symbolic differentiation
program), which compresses a tree without producing garbage at each
step of the compression. It uses reduced assignments.
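
Roughly, in Lisp terms (my own sketch, reusing the COMPRESS posted
earlier in this thread, not the actual paxScript code): the enclosing
tree is updated in place, so only the rewritten node is fresh:

(defparameter *tree* (copy-tree '(* (+ x 0) y)))

;; Rewrite one subtree in place; the rest of *TREE* is untouched.
(setf (second *tree*) (compress (second *tree*)))
*tree*   ; => (* X Y)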

A. 

From: Marc Spitzer
Subject: Re: LISPPA
Date: 
Message-ID: <861xmfzh1h.fsf@bogomips.optonline.net>
David Steuber <·····@david-steuber.com> writes:

> My personal experience with Java is that it does not really fix any of
> the perceived blemishes of C.  Sure, it adds a garbage collector.  Big
> deal.  From my POV, that is just about it.  Having libraries is nice,

In all honesty it is a big deal, for Java's target market.  This
market is called "average programmers" and it is the same market that
VB was designed for.  Not playing with pointers *is* a big win there.

> but Java's libraries have gotten so extensive (that is the "core"
> libraries) that the language is HUGE without really cutting down on
> the code I have to write.

Having a huge set of libraries that gives you a bunch of cookie
cutter, more or less, solutions is also a big win there.

>
> Lisp is also huge, but it makes my code smaller.

It may have been 20 years ago but compared to C++ or
Java it is small today.

marc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404222348.2b2b6e6b@posting.google.com>
> In all honesty it is a big deal, for Java's target market.  This
> market is called "average programmers" and it is the same market that
> VB was designed for.  Not playing with pointers *is* a big win there.

I think the division into "experts" (Lisp programmers) and "averages"
(the rest of the programmers) exists exclusively in the minds of some
Lisp programmers. :-)

However, there is indeed an undoubted chasm between VB, Java and Delphi
programmers and Lisp programmers. VB, Java and Delphi programmers are
not bad ones on average, quite the contrary. If these programmers show
little interest in AI problems, I think the root of it is not the
intellect of the programmers but the shortcomings of the mentioned
programming languages. Yes, I know, many of these programmers know
Lisp, but the problem of cross-language integration really exists. Many
programmers would prefer seamless solutions. LISPPA can "open the door"
for many, many VB, Java and Delphi programmers who are really
interested in extending their applications with AI functionality.

A.

From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040423080945.GP25328@mapcar.org>
On Fri, Apr 23, 2004 at 12:48:44AM -0700, Alexander Baranovsky wrote:
> LISPPA can "open door" for many and many VB, Java and Delphi
> programmers really interested to extend their applications with AI
> functionality.

Why is it that you have a fundamental inability to understand that
adding 'list-like' data-structures to a language does not in any way
lead to it being able to solve AI problems?  There is nothing magical
about lists, and Lisp programmers moved beyond them about 30 years ago.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404230901.74513b97@posting.google.com>
> Why is it that you have a fundamental inability to understand that
> adding 'list-like' data-structures to a language does not in any way
> lead to it being able to solve AI problems?  There is nothing magical
> about lists, and Lisp programmers moved beyond them about 30 years ago.

Why do you see only lists?

LISPPA uses polymorphic arrays as its base data structure. The
polymorphic array is a universal recursive data structure: lists,
trees, and graphs can be considered special cases of the polymorphic
array. All these data structures are used intensively in AI for the
representation of data, so facts, rules, mathematical expressions,
logic predicates, semantic nets, etc. can be represented as
polymorphic arrays.

LISPPA offers concise, uniform and effective tools for operating on
and transforming these data structures: you can add or delete nodes
and enumerate the items of a data structure. On top of that you have
random access, which is the most important property of an array with
regard to programming efficiency.

A.

From: Fred Gilham
Subject: Re: LISPPA
Date: 
Message-ID: <u7llkmnv20.fsf@snapdragon.csl.sri.com>
> LISPPA uses polymorphic arrays as its base data structure. The
> polymorphic array is a universal recursive data structure: lists,
> trees, and graphs can be considered special cases of the polymorphic
> array. All these data structures are used intensively in AI for the
> representation of data, so facts, rules, mathematical expressions,
> logic predicates, semantic nets, etc. can be represented as
> polymorphic arrays.

The problem is that polymorphic arrays are only slightly more abstract
than "the store" or memory.  Indexes to polymorphic arrays are simply
pointers.

Conses or pairs, for example, are a more abstract concept than just
2-element polymorphic arrays.  That's because there are other ways to
represent them.  The infamous functional representation is an example.
Here's one version:


(defun cons (x y)
  (lambda (z)
    (if z x y)))

(defun car (x) (funcall x t))
(defun cdr (x) (funcall x nil))
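
A note in passing: to actually run these in Common Lisp you would first
shadow the standard names, e.g.:

(defpackage :fun-cons (:use :cl) (:shadow #:cons #:car #:cdr))
(in-package :fun-cons)

;; ...then the three DEFUNs above, after which
;; (car (cons 1 2)) => 1 and (cdr (cons 1 2)) => 2.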


Polymorphic arrays don't buy you much conceptually because they are so
low level.  You can also represent all the above as Turing machines,
but nobody would want to go out and program them, except as some kind
of mental exercise.

-- 
Fred Gilham                                         ······@csl.sri.com
In the course of making code more readable, performance was
accidentally improved by about 20%. --- From BRL 2.1.23 release notes
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404242051.40ccc918@posting.google.com>
> The problem is that polymorphic arrays are only slightly more abstract
> than "the store" or memory.  Indexes to polymorphic arrays are simply
> pointers.

I guess this point of view is inculcated in your mind by the C
language. You have to free your mind from it to see that the
definition of the polymorphic array has no relation to the concepts
of "memory", "pointer" or "address".

A.

From: Fred Gilham
Subject: Re: LISPPA
Date: 
Message-ID: <u7hdv811dt.fsf@snapdragon.csl.sri.com>
Alexander Baranovsky wrote:

> > The problem is that polymorphic arrays are only slightly more
> > abstract than "the store" or memory.  Indexes to polymorphic
> > arrays are simply pointers.
> 
> I guess this point of view is inculcated in your mind by the C
> language.  You have to free your mind from it to see that the
> definition of the polymorphic array has no relation to the concepts
> of "memory", "pointer" or "address".

You are getting things backward.  First you have to prove that my
point is wrong.  Then you can indulge in mind-reading, psychological
analysis, and the like.

Your response is ironic given that the example I gave of functional
CONS was about as far removed as possible from a C language point of
view.

Perhaps you can explain why you think polymorphic arrays have no
relationship to memory and so on?  Does this mean that you can't store
things in polymorphic arrays?

-- 
Fred Gilham                                   ······@csl.sri.com
I was storing data in every conceivable way, including keeping a chain
of sound waves running between the speaker and the microphone. There
was no more memory left to be had....
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404260110.17b30baa@posting.google.com>
> > > The problem is that polymorphic arrays are only slightly more
> > > abstract than "the store" or memory.  Indexes to polymorphic
> > > arrays are simply pointers.
> > 
> > I guess, this point of view is inculcated in your mind by C
> > language.  You have to free your mind from it to see that the
> > definition of polymorphic array has no relation to the concepts of
> > "memory", "pointer" or "address".
> 
> You are getting things backward.  First you have to prove that my
> point is wrong.  Then you can indulge in mind-reading, psychological
> analysis, and the like.

According to Hoare's theory, an array is just a function from a finite
set of indexes (the domain) into a set of values (the image). The
polymorphic array is an array in Hoare's sense with the following
specification: the image is a set defined recursively, so that the
elements of the set are primitive values (null, numbers, strings,
booleans) and polymorphic arrays. You see, this is just an abstract
mathematical definition without any relation to the concepts of
"memory" or "pointers".
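
To illustrate with the list definition I gave earlier, in Lisp notation
(a sketch of mine; #(...) is a literal vector): the empty list is null,
and a list is a two-element array A with A(0) the head and A(1) the
rest, so the list (1 2) is

#(1 #(2 nil))

;; (aref #(1 #(2 nil)) 0)           => 1   ; the head
;; (aref (aref #(1 #(2 nil)) 1) 0)  => 2   ; the head of the tail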

So, you are mixing up the concepts of "definition" and "implementation".

Next, the "psychological analysis". The equivalence "array elements"
and "pointers" come from the C language. This point of view already
has been mentioned in this thread a few times. I guess the reason of
it is that some Lisp users better informed about C language than
another imperative languages because some Lisp engines convert Lisp
programs into the C language programs. :-)

> Perhaps you can explain why you think polymorphic arrays have no
> relationship to memory and so on?  Does this mean that you can't store
> things in polymorphic arrays?

You forget, I am speaking about the definition of an array, not an
implementation. If we are considering AI problems, we are saying, for
example, that facts or clauses can be represented and processed as
polymorphic arrays. I do not see why arrays are less suitable than
linked lists for this; rather the contrary. Linked lists can be
considered just a special case of the polymorphic array. The array is
a preferable data structure to the linked list in view of programming
efficiency (random access, reduced assignments instead of a garbage
collector, iteration instead of recursion).

A.

From: Fred Gilham
Subject: Re: LISPPA
Date: 
Message-ID: <u7ekqa254r.fsf@snapdragon.csl.sri.com>
Alexander Baranovsky wrote:
> According to Hoare's theory, an array is just a function from a
> finite set of indexes (the domain) into a set of values (the
> image). A polymorphic array is an array in Hoare's sense with the
> following specification: the image is a set defined recursively, so
> the elements of the set are primitive values (null, numbers,
> strings, booleans) and polymorphic arrays. You see, this is just an
> abstract mathematical definition with no relation to the concepts
> of "memory" or "pointers".
> 
> So you are mixing up the concepts of "definition" and "implementation".

You said that a cons is just a two-element polymorphic array where the
first element is the car and the second is the cdr:

A[0] == car
A[1] == cdr

But you have those numbers --- 0 and 1 --- and they are pointers (the
elements are not pointers, the indexes are).

On the other hand, the functional definition (I changed to scheme
because it looks better)

(define (cons x y)
   (lambda (z)
     (if z x y)))

(define (car x)
   (x #t))

(define (cdr x)
   (x #f))

makes no reference to array indexes.  Yes, it uses booleans, but
booleans are more abstract than 1 and 0 and more closely map onto the
idea of a "pair" which is the idea behind a CONS.

So what I'm saying is that polymorphic arrays aren't a very deep
concept, and are in fact a step backwards in abstraction from even the
lowly CONS.

> Next, the "psychological analysis". The equivalence of "array
> elements" and "pointers" comes from the C language. This point of
> view has already been mentioned in this thread a few times. I guess
> the reason is that some Lisp users are better informed about the C
> language than about other imperative languages, because some Lisp
> engines convert Lisp programs into C programs. :-)

First of all, I'm not equating array elements with pointers; I'm
equating array indices with pointers.  In fact, they are the same
thing, if you treat memory as a polymorphic array, which it is.
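
Here is a toy sketch of what I mean (the five-line "allocator" and
the KONS/KAR/KDR names are mine, purely for illustration): treat
"memory" as one big polymorphic array, and a "pointer" is then
nothing but an index into it.

(defparameter *heap* (make-array 1000))  ; "memory" as an array
(defparameter *free* 0)                  ; next unused cell

(defun kons (x y)
  "Allocate two adjacent heap cells; the returned 'pointer' is an index."
  (let ((p *free*))
    (setf (aref *heap* p) x
          (aref *heap* (1+ p)) y)
    (incf *free* 2)
    p))

(defun kar (p) (aref *heap* p))       ; dereference the "pointer"
(defun kdr (p) (aref *heap* (1+ p)))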

Besides that, if I remember my Pascal, they called indexes in Pascal
"pointers".

-- 
Fred Gilham                                       ······@csl.sri.com
The nice thing about being a celebrity is that when you bore people,
they think it's their fault.                     --- Henry Kissinger
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: LISPPA
Date: 
Message-ID: <pan.2004.04.26.19.10.46.102969@knm.org.pl>
On Mon, 26 Apr 2004 07:19:48 -0700, Fred Gilham wrote:

> Yes, it uses booleans, but booleans are more abstract than 1 and 0

I disagree.

> and more closely map onto the idea of a "pair" which is the idea behind
> a CONS.

Why would implementing a cons cell with a function be less (or more)
abstract than implementing it with an array?

Functions are more basic in the lambda calculus, but we don't program
in the lambda calculus. Functions are not sufficient for mutable cons
cells anyway while arrays are.

I disagree that there is some deep difference between those
realizations of a cons cell. The differences are only pragmatic,
i.e. in efficiency (and cons cells are better implemented natively
anyway).

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: Fred Gilham
Subject: Re: LISPPA
Date: 
Message-ID: <u7isfmo3sx.fsf@snapdragon.csl.sri.com>
> > Yes, it uses booleans, but booleans are more abstract than 1 and 0
> 
> I disagree.

Why?

> > and more closely map onto the idea of a "pair" which is the idea
> > behind a CONS.
> 
> Why would implementing a cons cell with a function be less (or more)
> abstract than implementing it with an array?

Oh, I don't know.  Why pick 0 and 1 as the indices of your array?  Why
not 42 and 712?  Sort of an arbitrary implementation-level choice.

> Functions are more basic in the lambda calculus, but we don't
> program in the lambda calculus. Functions are not sufficient for
> mutable cons cells anyway while arrays are.
> 
> I disagree that there is some deep difference between those
> realizations of a cons cell. The differences are only pragmatic,
> i.e. in efficiency (and cons cells are better implemented natively
> anyway).

I was accused by the OP of thinking in terms of C.  I wanted to
contradict this.  I thought a functional representation of CONS would
help do this.

I don't care about the implementation efficiency of the various
representations of CONS for this discussion.

To put my point another way, a polymorphic array representation of a
cons cell is a lot more like something you would do in C than a
functional representation is, especially since dynamically creating
anonymous functions in C is not an easy task.

-- 
Fred Gilham ······@csl.sri.com | See the lambs and the lions playin?
I join in and I drink the music. Holiness is the air I'm breathin'.
My faithful heroes break the bread and answer all of my questions.
Not to mention what the streets are made of. My heart's held hostage
by this love. -- Chris Rice, DEEP ENOUGH TO DREAM
From: Joe Marshall
Subject: Re: LISPPA
Date: 
Message-ID: <llki33ob.fsf@ccs.neu.edu>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> On Mon, 26 Apr 2004 07:19:48 -0700, Fred Gilham wrote:
>
>> Yes, it uses booleans, but booleans are more abstract than 1 and 0
>
> I disagree.

How about this, then:

(defun kons (left right)
  (lambda (select) (funcall select left right)))

(defun kar (cell)
  (funcall cell (lambda (l r) l)))

(defun kdr (cell)
  (funcall cell (lambda (l r) r)))

No numbers, no booleans.
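
A quick sanity check at the REPL, assuming the definitions above:

  (kar (kons 1 2)) => 1
  (kdr (kons 1 2)) => 2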
From: Julian Stecklina
Subject: Re: LISPPA
Date: 
Message-ID: <86llkis8ux.fsf@web.de>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> writes:

> Functions are not sufficient for mutable cons cells anyway while
> arrays are.

Just add (setf car) and (setf cdr) functions and there you have your
mutable cons cells, even if they are implemented as closures. Nothing
magic.
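
A minimal sketch of one way to do it (the closure has to capture its
two slots mutably; I reuse Joe Marshall's KONS/KAR/KDR names, and the
message-passing protocol is my own choice):

(defun kons (kar kdr)
  (lambda (msg &optional value)
    (ecase msg
      (:kar kar)                        ; read the two captured slots
      (:kdr kdr)
      (:set-kar (setf kar value))       ; ... or mutate them
      (:set-kdr (setf kdr value)))))

(defun kar (cell) (funcall cell :kar))
(defun kdr (cell) (funcall cell :kdr))
(defun (setf kar) (value cell) (funcall cell :set-kar value))
(defun (setf kdr) (value cell) (funcall cell :set-kdr value))

;; (let ((c (kons 1 2))) (setf (kar c) 99) (kar c)) => 99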

Regards,
-- 
Julian Stecklina 

Signed and encrypted mail welcome.
Key-Server: pgp.mit.edu         Key-ID: 0xD65B2AB5
FA38 DCD3 00EC 97B8 6DD8  D7CC 35D8 8D0E D65B 2AB5

Any sufficiently complicated C or Fortran program
contains an ad hoc informally-specified bug-ridden
slow implementation of half of Common Lisp.
 - Greenspun's Tenth Rule of Programming
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87isfmv5l8.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

> According to Hoare's theory, an array is just a function from a
> finite set of indexes (the domain) into a set of values (the image).
> A polymorphic array is an array in Hoare's sense with the following
> specification: the image is a set defined recursively, so the
> elements of the set are primitive values (null, numbers, strings,
> booleans) and polymorphic arrays. You see, this is just an abstract
> mathematical definition with no relation to the concepts of "memory"
> or "pointers".

The abstract mathematical definition doesn't cope too well
when you start having structures that are both mutable and
sharable.

    a := [1,2]
    b := [a,a]
    a[0] := 99

I do not believe it is possible to give a coherent account
of what's going on here without some concept that's basically
equivalent to that of "pointers" or "references". (The
latter term may be better because it doesn't have the same
associations of pointer *arithmetic* as in C.)

Unless, of course, your LISPPA does a deep copy every time
you write an assignment like those?
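
For comparison, the same three steps in Lisp, where what happens is
perfectly well-defined precisely because list structure is shared
through references:

(let* ((a (list 1 2))
       (b (list a a)))
  (setf (first a) 99)
  b)
;; => ((99 2) (99 2)), since both elements of B are the very
;;    same object as A, not copies of it.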

> So you are mixing up the concepts of "definition" and "implementation".
> 
> Next, the "psychological analysis". The equivalence of "array
> elements" and "pointers" comes from the C language. This point of
> view has already been mentioned in this thread a few times.

Firstly, the equivalence C sort-of makes is between *arrays*
and pointers, not between *array elements* and pointers.
Indeed, the fact that C and C++ treat an array of "struct foo"
objects as actually *containing* the structures rather than
references to them (and likewise for other kinds of container
other than arrays) is, I think, responsible for a substantial
amount of the pain associated with writing complicated programs
in those languages.

(That's not to say that it's entirely bad that C and C++
do that; for some purposes it's a very good thing.)

> I guess the reason is that some Lisp users are better informed about
> the C language than about other imperative languages, because some
> Lisp engines convert Lisp programs into C programs. :-)

I think it's enormously unlikely that more than the tiniest
proportion of Lisp programmers use such compilers and pay
the slightest attention to the C they output.

> > Perhaps you can explain why you think polymorphic arrays have no
> > relationship to memory and so on?  Does this mean that you can't store
> > things in polymorphic arrays?
> 
> You forget that I'm speaking about the definition of an array, not
> its implementation. If we are considering AI problems, we are saying,
> for example, that facts or clauses can be represented and processed
> as polymorphic arrays. I do not see why arrays are less suitable for
> this than linked lists. Rather the contrary: linked lists can be
> considered just a special case of polymorphic arrays. An array is a
> preferable data structure to a linked list from the standpoint of
> programming efficiency (random access, reduced assignments instead
> of a garbage collector, iteration instead of recursion).

I would be very sorry to lose either of them. Fortunately Lisp
has them both natively and doesn't require one to be explicitly
implemented on top of the other.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404262151.28062b39@posting.google.com>
> The abstract mathematical definition doesn't cope too well
> when you start having structures that are both mutable and
> sharable.
> 
>     a := [1,2]
>     b := [a,a]
>     a[0] := 99
> 
> I do not believe it is possible to give a coherent account
> of what's going on here without some concept that's basically
> equivalent to that of "pointers" or "references". (The
> latter term may be better because it doesn't have the same
> associations of pointer *arithmetic* as in C.)

Your example seems slightly unnatural. Indeed, there is no
programming language which does not allow you to write bad programs;
each programming language has its own style of programming (I mean a
set of basic ideas and techniques), and to follow that style is to
avoid many problems. For example, good imperative style supposes
replacing numeric constants with symbolic constants. As for sharable
structures, it is necessary to avoid them (if it is possible, of
course). But how do these things contradict the "functional"
definition of an array? :-)

> > So you are mixing up the concepts of "definition" and "implementation".
> > 
> > Next, the "psychological analysis". The equivalence of "array
> > elements" and "pointers" comes from the C language. This point of
> > view has already been mentioned in this thread a few times.
> 
> Firstly, the equivalence C sort-of makes is between *arrays*
> and pointers, not between *array elements* and pointers.

Accepted; my mistake.

> Indeed, the fact that C and C++ treat an array of "struct foo"
> objects as actually *containing* the structures rather than
> references to them (and likewise for other kinds of container
> other than arrays) is, I think, responsible for a substantial
> amount of the pain associated with writing complicated programs
> in those languages.

> (That's not to say that it's entirely bad that C and C++
> do that; for some purposes it's a very good thing.)

A very good question. Indeed, why don't "ugly" pointers prevent the
C language from being very attractive to programmers? There are many
reasons, but the most important for me is that C makes it possible
to write concise algorithms for a wide range of low-level
programming problems. For example:

while ((*s++ = *t++) != '\0');

I'm trying to provide concise and uniform algorithms and a minimal
set of concepts for working with dynamic data structures in a
high-level language. I am sure that brevity determines a style of
programming: if you have concise patterns and follow the style, you
will be able to write effective and bug-free programs.

A.

From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <874qr5tdes.fsf@g.mccaughan.ntlworld.com>
··@cable.netlux.org (Alexander Baranovsky) writes:

> > The abstract mathematical definition doesn't cope too well
> > when you start having structures that are both mutable and
> > sharable.
> > 
> >     a := [1,2]
> >     b := [a,a]
> >     a[0] := 99
> > 
> > I do not believe it is possible to give a coherent account
> > of what's going on here without some concept that's basically
> > equivalent to that of "pointers" or "references". (The
> > latter term may be better because it doesn't have the same
> > associations of pointer *arithmetic* as in C.)
> 
> Your example seems slightly unnatural.

Deliberately simplified, that's all.

> Indeed, there is no programming language which does not allow you
> to write bad programs; each programming language has its own style
> of programming (I mean a set of basic ideas and techniques), and to
> follow that style is to avoid many problems. For example, good
> imperative style supposes replacing numeric constants with symbolic
> constants. As for sharable structures, it is necessary to avoid
> them (if it is possible, of course). But how do these things
> contradict the "functional" definition of an array? :-)

If you think sharing mutable structure is bad programming,
then I must respectfully but strongly disagree. Suppose you
have, to take a hackneyed example, some objects representing
employees of a company; and suppose each person's object has
a reference to the person's boss and a list of subordinates.
Then you have tons of shared mutable structure, and a good
thing too.

How do these things contradict the "functional" definition
of an array? Maybe they don't; it depends on how you fill in
various details of that definition that you've left vague.
The ways of filling them in that don't lead to contradictions
when you have shared mutable structure are exactly the ones
that require some concept equivalent to that of "pointer" or
"reference".

> > Indeed, the fact that C and C++ treat an array of "struct foo"
> > objects as actually *containing* the structures rather than
> > references to them (and likewise for other kinds of container
> > other than arrays) is, I think, responsible for a substantial
> > amount of the pain associated with writing complicated programs
> > in those languages.
> 
> > (That's not to say that it's entirely bad that C and C++
> > do that; for some purposes it's a very good thing.)
> 
> A very good question.

I didn't ask a question. :-)

> Indeed, why don't "ugly" pointers prevent the C language from being
> very attractive to programmers? There are many reasons, but the
> most important for me is that C makes it possible to write concise
> algorithms for a wide range of low-level programming problems. For
> example:
> 
> while ((*s++ = *t++) != '\0');
> 
> I'm trying to provide concise and uniform algorithms and a minimal
> set of concepts for working with dynamic data structures in a
> high-level language. I am sure that brevity determines a style of
> programming: if you have concise patterns and follow the style, you
> will be able to write effective and bug-free programs.

"Minimal set of concepts" and "brevity" work against one another.
For instance, if you want to write a linked list in a language
with no particular support for them where the best you can do is
to use 2-element arrays, you get things like

    [1,[2,[3,[4,[5,nil]]]]]

instead of

    (1 2 3 4 5)

and you presumably have to write your own code for looping over
the elements of a linked list instead of having tools like DOLIST

    (dolist (x some-list)
      (do-something-with x))

provided for you.
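
To be fair, Lisp would also let you write that looping code once and
for all as a macro. A minimal sketch, with a name I just made up, for
lists encoded as nested two-element arrays:

(defmacro do-array-list ((var list-form) &body body)
  "Iterate over a list encoded as nested two-element arrays."
  (let ((node (gensym "NODE")))
    `(do ((,node ,list-form (aref ,node 1)))   ; step to the "rest" slot
         ((null ,node))
       (let ((,var (aref ,node 0)))            ; bind the "first" slot
         ,@body))))

;; (do-array-list (x #(1 #(2 #(3 nil)))) (print x)) prints 1, 2, 3.

But the point stands: the tool has to be written by somebody.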

Anyway, I think it's naive to suggest that having "concise patterns"
and "following the style" suffice to make effective, bugless programs.

-- 
Gareth McCaughan
.sig under construc
From: Karl A. Krueger
Subject: Re: LISPPA
Date: 
Message-ID: <c6mgff$2f2$1@baldur.whoi.edu>
Gareth McCaughan <················@pobox.com> wrote:
> Suppose you have, to take a hackneyed example, some objects
> representing employees of a company; and suppose each person's
> object has a reference to the person's boss and a list of
> subordinates.  Then you have tons of shared mutable structure,
> and a good thing too.

Well, actually, what you have there is an ad-hoc implementation of
half of an archaic (navigational) database.

-- 
Karl A. Krueger <········@example.edu>
Woods Hole Oceanographic Institution
Email address is spamtrapped.  s/example/whoi/
"Outlook not so good." -- Magic 8-Ball Software Reviews
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87hdv4sa20.fsf@g.mccaughan.ntlworld.com>
"Karl A. Krueger" <········@example.edu> writes:

> Gareth McCaughan <················@pobox.com> wrote:
> > Suppose you have, to take a hackneyed example, some objects
> > representing employees of a company; and suppose each person's
> > object has a reference to the person's boss and a list fo
> > subordinates.  Then you have tons of shared mutable structure,
> > and a good thing too.
> 
> Well, actually, what you have there is an ad-hoc implementation of
> half of an archaic (navigational) database.

I do hope you aren't suggesting that whenever you have
shared mutable structure you should stick all the objects
involved into a relational database. I'm sorry if you
disapprove of the example I used.

-- 
Gareth McCaughan
.sig under construc
From: Karl A. Krueger
Subject: Re: LISPPA
Date: 
Message-ID: <c6ohd5$obe$1@baldur.whoi.edu>
Gareth McCaughan <················@pobox.com> wrote:
> "Karl A. Krueger" <········@example.edu> writes:
>> Gareth McCaughan <················@pobox.com> wrote:
>> > Suppose you have, to take a hackneyed example, some objects
>> > representing employees of a company; and suppose each person's
>> > object has a reference to the person's boss and a list of
>> > subordinates.  Then you have tons of shared mutable structure,
>> > and a good thing too.
>> 
>> Well, actually, what you have there is an ad-hoc implementation of
>> half of an archaic (navigational) database.
> 
> I do hope you aren't suggesting that whenever you have
> shared mutable structure you should stick all the objects
> involved into a relational database. I'm sorry if you
> disapprove of the example I used.

It was just a joke playing on Greenspun's Tenth and the
object/relational mismatch.  :)

-- 
Karl A. Krueger <········@example.edu>
Woods Hole Oceanographic Institution
Email address is spamtrapped.  s/example/whoi/
"Outlook not so good." -- Magic 8-Ball Software Reviews
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404272330.1251c0fe@posting.google.com>
> If you think sharing mutable structure is bad programming,

I say: "it is necessary to avoid them (it it is possible, of course)",
it is not the same as "bad programming". In another words, I mean that
simple solutions and minimal number of concepts are preferable
(Occam's principle), IF they can be a suitable replacement for a
complex solution.

> then I must respectfully but strongly disagree. Suppose you
> have, to take a hackneyed example, some objects representing
> employees of a company; and suppose each person's object has
> a reference to the person's boss and a list fo subordinates.
> Then you have tons of shared mutable structure, and a good
> thing too.

I can offer another example:

P := ['X', 'Y'];
T := ["R", ["F", @P[0]], @P[1]];

It represents the term T = R(F(X), Y). This representation both
simplifies the algorithms for working with terms and increases
their efficiency, for example in the renaming of terms and in the
unification algorithm. (@A denotes an alias of A.)
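
For comparison, a rough Lisp counterpart of the same trick (a sketch
of mine, not LISPPA: each variable is its own one-element list, so
every occurrence shares a single mutable cell):

(let* ((x (list 'x))                    ; one shared cell per variable
       (y (list 'y))
       (term (list 'r (list 'f x) y)))  ; T = R(F(X), Y)
  (setf (car x) 'x1)                    ; rename X everywhere at once
  term)
;; => (R (F (X1)) (Y))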

So, please do not be too categorical in your conclusions. There are
many shades between "black" and "white" :-)

> "Minimal set of concepts" and "brevity" work against one another.
> For instance, if you want to write a linked list in a language
> with no particular support for them where the best you can do is
> to use 2-element arrays, you get things like
> 
>     [1,[2,[3,[4,[5,nil]]]]]
> 
> instead of
> 
>     (1 2 3 4 5)

Oh, I might add () to denote linked lists (a special case of
polymorphic arrays), but I do not want to do it, I really do not
want to :-) As for me, the notation

   [1,[2,[3,[4,[5,nil]]]]]

better shows the "nature" of a linked list than

   (1 2 3 4 5)

> 
> and you presumably have to write your own code for looping over
> the elements of a linked list instead of having tools like DOLIST
> 
>     (dolist (x some-list)
>       (do-something-with x))
> 
> provided for you.

Really?

P := L;
while P <> null do 
begin
   Do-something-with (P[0]);
   P := P[1];
end;

Not a very complicated loop over the elements, it seems :-)
Besides, if your language supports procedural types, you can pass a
function as a parameter:

Dolist(L, Do-something-with);

> 
> Anyway, I think it's naive to suggest that having "concise patterns"
> and "following the style" suffice to make effective, bugless programs.

My remark is related to psychology rather than to programming. It is
based on my experience, so it is true for me. We are different, very
different :-)

A.

From: Tim Bradshaw
Subject: Re: LISPPA
Date: 
Message-ID: <fbc0f5d1.0404280727.4774e34f@posting.google.com>
··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<····························@posting.google.com>...
> [Someone else wrote]
> > 
> > and you presumably have to write your own code for looping over
> > the elements of a linked list instead of having tools like DOLIST
> > 
> >     (dolist (x some-list)
> >       (do-something-with x))
> > 
> > provided for you.
> 
> Really?
> 
> P := L;
> while P <> null do 
> begin
>    Do-something-with (P[0]);
>    P := P[1];
> end;

This is 6 lines rather than 2, and presumably it's actually some more
lines if you want P to be a local variable (I don't speak Pascal well
enough to know how to scope variables to a block).  Say 3-4 times as
much code.  I couldn't have made the point better if I'd tried!
From: Peter Lewerin
Subject: Re: LISPPA
Date: 
Message-ID: <dbc03c5a.0404290018.7fb639f9@posting.google.com>
> > >     (dolist (x some-list)
> > >       (do-something-with x))

> > P := L;
> > while P <> null do 
> > begin
> >    Do-something-with (P[0]);
> >    P := P[1];
> > end;

> This is 6 lines rather than 2, and presumably it's actually some more

I say never mind the line count.  Look at the correspondence to pseudocode instead:

For every x in some-list,
    do-something-with x.

We have a winner.
From: Kenny Tilton
Subject: Re: LISPPA
Date: 
Message-ID: <j46kc.69709$WA4.33007@twister.nyc.rr.com>
Peter Lewerin wrote:
>>>>    (dolist (x some-list)
>>>>      (do-something-with x))
> 
> 
>>>P := L;
>>>while P <> null do 
>>>begin
>>>   Do-something-with (P[0]);
>>>   P := P[1];
>>>end;
> 
> 
>>This is 6 lines rather than 2, and presumably it's actually some more
> 
> 
> I say never mind the line count.  Look at the correspondence to pseudocode instead:
> 
> For every x in some-list,
>     do-something-with x.
> 
> We have a winner.

Assembly language?

kenny

-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Peter Lewerin
Subject: Re: LISPPA
Date: 
Message-ID: <b72f3640.0404291304.e9a2d73@posting.google.com>
> Assembly language?

It could be that your pseudocode is very different from mine, but it's
more likely that I didn't express myself as clearly as I should have
(or else you could be making a joke that I didn't get).

My point was simply that, given a near-natural language description of
the intended action, the Lisp code was almost identical to it, while
the Pascal code was very different from it.  The Lisp code was
immediately readable, while I had to read the Pascal version a couple
of times to grok it.  (And it's not because of familiarity.  I
actually only very rarely use Lisp; most of the time I use Tcl or C.)
From: Kenny Tilton
Subject: Re: LISPPA
Date: 
Message-ID: <Nyekc.69841$WA4.63576@twister.nyc.rr.com>
Peter Lewerin wrote:

>>Assembly language?
> 
> 
> It could be that your pseudocode is very different from mine, but it's
> more likely that I didn't express myself as clearly as I should (or
> else you could be making a joke that didn't get).

My bad. I am from Hell, English is my second language. I misread:

"I say never mind the line count.  Look at the correspondence to 
pseudocode instead:"

...as "Ignore the line count, just think about the /corresponding/ 
pseudocode".

Too bad, I liked that joke.

:)

kenny

-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <873c6v10a1.fsf@g.mccaughan.ntlworld.com>
··@cable.netlux.org (Alexander Baranovsky) writes:

[someone else:]
> > In all honesty it is a big deal, for Java's target market.  This
> > market is called "average programmers" and it is the same market that
> > VB was designed for.  Not playing with pointers *is* a big win there.

[Alexander:]
> I think the division into "experts" (Lisp programmers) and
> "averages" (the rest of the programmers) exists exclusively in the
> minds of some Lisp programmers. :-)

The division into "experts" and "averages" exists in plenty of
non-Lispy minds. And no one is claiming that all the "experts"
are Lisp programmers, nor that all Lisp programmers are "experts";
only that some languages (of which Lisp is one) are more suited
for "experts" and that some (of which Visual Basic is one) are
more suited for non-"experts".

Note that this doesn't imply that Lisp can only be used by experts
or that no expert would ever use VB. I have good reason to believe
that sometimes experts do use VB and that sometimes non-experts
do use Lisp.

I don't think the target markets for Java and for VB are
the same, by the way.

-- 
Gareth McCaughan
.sig under construc
From: Cameron MacKinnon
Subject: Re: LISPPA
Date: 
Message-ID: <_KidnS_JhIrbsBTd4p2dnA@golden.net>
Gareth McCaughan wrote:
> I don't think the target markets for Java and for VB are
> the same, by the way.

Which Java?
- the one for microcontrollers?
- the one for programming cute, sandboxed client-side web applets?
- the one for all client stuff, making Windows a buggy driver layer?
- the one for server side applications?

Or my personal favourite, the one that was going to be 32 bit in every 
implementation for all time, thus relieving the programmer of the need 
to test word size. Has Sun talked about a 64 bit Java yet, or is that 
promise still valid?

VB, say what else you will about it, hasn't changed focus every two 
years. To know the target market for Java, you have to be pretty regular 
in checking Sun's web site.

-- 
Cameron MacKinnon, fresh off a road trip.
Toronto, Canada
From: Karl A. Krueger
Subject: Re: LISPPA
Date: 
Message-ID: <c6bbj2$shj$1@baldur.whoi.edu>
Cameron MacKinnon <··········@clearspot.net> wrote:
> 
> VB, say what else you will about it, hasn't changed focus every two 
> years.

No?  Ever hear of "Visual Fred"?

-- 
Karl A. Krueger <········@example.edu>
Woods Hole Oceanographic Institution
Email address is spamtrapped.  s/example/whoi/
"Outlook not so good." -- Magic 8-Ball Software Reviews
From: Cameron MacKinnon
Subject: Re: LISPPA
Date: 
Message-ID: <-oudncaFvPjKqRTdRVn-hQ@golden.net>
Karl A. Krueger wrote:
> Cameron MacKinnon <··········@clearspot.net> wrote:
> 
>>VB, say what else you will about it, hasn't changed focus every two 
>>years.
> 
> 
> No?  Ever hear of "Visual Fred"?

Well, that's what I get for making an assertion about something I 
haven't been following for some time. Thanks for the laughs, generated 
by googling. I amend my assertion to say that VB has changed direction 
less often but more violently than Java.

-- 
Cameron MacKinnon
Toronto, Canada
From: Adam Warner
Subject: Re: LISPPA
Date: 
Message-ID: <pan.2004.04.25.03.37.05.162965@consulting.net.nz>
Hi Cameron MacKinnon,

> Gareth McCaughan wrote:
>> I don't think the target markets for Java and for VB are the same, by
>> the way.
> 
> Which Java?
> - the one for microcontrollers?
> - the one for programming cute, sandboxed client-side web applets?
> - the one for all client stuff, making Windows a buggy driver layer?
> - the one for server side applications?
> 
> Or my personal favourite, the one that was going to be 32 bit in every
> implementation for all time, thus relieving the programmer of the need
> to test word size. Has Sun talked about a 64 bit Java yet, or is that
> promise still valid?

64-bit Java Virtual Machine implementations are already available. There
are specification constraints that need to be fixed for the JVM to fully
support 64-bit indexing but the limitations could be less than you
suspect.

64-bit JVMs already permit the storage of vast numbers of objects in vast
heap sizes. Where the JVM specification bites is that vectors cannot be
created to index more than 2^31-1 elements. [Except for <char>, unsigned
integers don't exist as primitives in the JVM]:
<http://java.sun.com/docs/books/vmspec/2nd-edition/html/Overview.doc.html#22239>
<http://java.sun.com/docs/books/vmspec/2nd-edition/html/Concepts.doc.html#22854>

Thus it appears the largest contiguous vector that can be allocated is
16GiB (i.e. 2^31-1 eight-byte <long> elements). Fixing this would
probably require backwards-incompatible changes to the JVM and the
class file format, so the decision won't be made hastily.
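
By way of comparison, Common Lisp leaves the corresponding limits
implementation-defined; a 64-bit Lisp will typically report values
far beyond 2^31 here:

(list array-dimension-limit array-total-size-limit most-positive-fixnum)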

> VB, say what else you will about it, hasn't changed focus every two
> years.

<http://en.wikipedia.org/wiki/Visual_Basic_.NET>
"... some experts in Visual Basic .NET have characterized it as an
entirely new programming language."

I see you have already amended your position.

> To know the target market for Java, you have to be pretty
> regular in checking Sun's web site.

Sun wants Java to be used everywhere so in a sense the target market is
every market. At present people needing to efficiently index large
contiguous datasets are not being adequately served by Java technologies
(a relatively small but obviously important market).

Regards,
Adam
From: Tim Bradshaw
Subject: Re: LISPPA
Date: 
Message-ID: <ey37jw7gq0d.fsf@cley.com>
* David Steuber wrote:
> My personal experience with Java is that it does not really fix any of
> the perceived blemishes of C.  Sure, it adds a garbage collector.  Big
> deal.  

The tragedy of all this is that, in fact, getting a GCd language into
the mainstream - and not just one language, but making the very idea
of GCd languages acceptable - *is* a big deal.  In fact it's made more
difference than all the Lisp bigots have ever managed to do.

--tim
From: Espen Vestre
Subject: Re: LISPPA
Date: 
Message-ID: <kwfzavm5t1.fsf@merced.netfonds.no>
Tim Bradshaw <···@cley.com> writes:

> The tragedy of all this is that, in fact, getting a GCd language into
> the mainstream - and not just one, but making the idea that GCd
> languages acceptable - *is* a big deal.  In fact it's made more
> difference than all the Lisp bigots have ever managed to do.

It's utterly strange when you consider that GC-ed languages have been
used for serious business purposes for more than 20 years.

But then, when I think about the status of religion more than 100
years after Nietzsche, it's not strange at all. And with respect to
both IT and religion, I respect people's right to believe in strange
things, but when they try too hard to make me share their strange
thoughts, I tend to get angry.
-- 
  (espen)
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404220743.12f51390@posting.google.com>
> Take an unbiased and thorough look at Common Lisp and disprove my sig. It'll
> really be worth it.
> BTW, a good book about the power of lisp-macros is "On Lisp" by Paul Graham.
> It's freely downloadable.

I've downloaded the book in PDF format. It looks interesting, indeed.
Unfortunately I cannot print it properly with Acrobat Reader; it seems
some fonts are missing on my machine (I'm working under WinXP). Could
you help me with it, please? Is there an HTML or RTF version of the book?

Thanks.

A.

Thomas Schilling <······@yahoo.de> wrote in message news:<··············@uni-berlin.de>...
> Alexander Baranovsky wrote:
> 
> > As for "the code<->data equivalence", I'm constantly hearing about it.
> > The concept is great, indeed. But show me a real problem related to
> > this subject which I could not solve with procedural types in Pascal,
> > eval function in JavaScript or even the code blocks in Clipper and
> > lambda-expressions in VIRT.
> 
> Haha.
> 
> You must be very happy with C and Pascal and Basic. I was, too, with C++.
> But then I wanted to write a good code-completion module for some KDE
> IDE. But to make it really useful I wanted to write a whole C++
> parser--a bad mistake. You cannot do this painlessly. Since in C++
> syntax and semantics are so tightly coupled, it's really hard,
> especially when it comes to templates. (Yeah, OK, I could have used
> bison+flex, but it had to be incremental and error-tolerant and all
> that stuff.)
> 
> Of course it all was _possible_. But it's not about that. It's about how much
> _effort_ it is.
> 
> Due to time problems and other reasons I gave up and wanted to do my
> own language with a very easy-to-parse syntax (inspired by Python's).
> But before creating just another new language I first looked for a
> book about advanced programming language design, and before I
> finished reading it I luckily got introduced to Common Lisp.
> 
> And besides all that other great features already listed in other posts to
> this thread the code<->data equivalence is a _great_ concept. The great
> about it is that lisp code already is the lisp syntax tree, ie. there's no
> (real) parsing. Just take a look at the "loop"-macro it's a whole
> sublanguage. and one big statement of Graham's "On Lisp" is that in lisp
> due to Common Lisp's powerful macros (which are that powerful due to the
> code<->data-equivalence) you effectively create a problem-specific
> programming language. And all that without much effort and any sort of
> (self-written) lexer or parser.
> 
> Believe me. It is incredible how far above other languages Common
> Lisp is once you take a thorough look at it.
> 
> Take an unbiased and thorough look at Common Lisp and disprove my sig. It'll
> really be worth it.
> BTW, a good book about the power of lisp-macros is "On Lisp" by Paul Graham.
> It's freely downloadable.
> 
> regards
> 
> Thomas
From: Thomas Schilling
Subject: Re: LISPPA
Date: 
Message-ID: <c68qvv$9c9gd$1@uni-berlin.de>
Alexander Baranovsky wrote:

> I've downloaded the book in pdf format. It looks interesting, indeed.
> Unfortunately I cannot print it properly with Acrobat Reader, it seems
> some fonts are absent at my machine (I'm working under WinXP). Could
> you help me with it, please? Is there html or rtf version of the book?

Hm. Strange. Normally fonts get embedded in PDF. You can try to get a
newer Acrobat version, but I doubt that helps. There's a PS version,
too. You can then use ghostscript+ghostview, or use some converting
tool from http://www.icdatamaster.com/convert.html (but I didn't test
them). I think there's no HTML/RTF version available online.

BTW: all(?) the images in "On Lisp" are missing for some reason.

- ts
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87ad162ph1.fsf@g.mccaughan.ntlworld.com>
··@cable.netlux.org (Alexander Baranovsky) writes:

> > In short: Lisp is not a functional language with some
> > imperative bits; it is a multi-paradigm language that
> > supports both functional and imperative styles of
> > programming.
> 
> The article "History of Lisp" by John McCarthy
> 
> http://www-formal.stanford.edu/jmc/history/lisp/node1.html
> 
> shows that you are not quite right :-)

Oh, sure, Lisp has functional roots. But from very early days
it has not been a pure functional language. I'm not disputing
that the language McCarthy introduced in an article back in
1959 was functional, but it took very little time after that
for the actual Lisp language, as implemented on real computers,
to cease to be pure functional; and, I repeat, modern Lisps
are *not* functional languages with a little imperative stuff
glued on.

> > > 1. NULL (the undefined variable) is the empty list.
> > > 2. If L is a list, then a one-dimensional array A with two
> > > elements, A(0) and A(1) = L, is a list too.
> > > 
> > > We cannot produce such a concise definition of the linked list
> > > using the standard approach based on pointers. This is the starting
> > > point; from here LISPPA offers uniform ways to process lists and
> > > other dynamic data structures.
> > 
> > I don't see why we can't produce a similarly concise definition
> > using pointers.
> > 
> >   1 The null pointer represents the empty list.
> >   2 If L is a list and X any object, then a block
> >     of memory containing two pointers, one of which
> >     points to X and the other of which points to L,
> >     is a list.
> 
> But what is the "memory block"? Can you explain it in a few words? :-)
> My definition does not know any "memory blocks". Besides, I can define
> the polymorphic array without any reference to "memory blocks".

Your definition makes reference to "arrays". My definition
makes reference to "pointers and blocks of memory". I don't
see how mine leaves anything more unexplained than yours
does. Mine has two unexplained concepts rather than one,
you say? But my unexplained concepts are much simpler than
yours. And, besides, your definition uses "=" in a way that
either is rather subtle in its own right or else requires
your definition of an "array" to include something equivalent
to a definition of "pointer", to explain what it means for
one array to be an element of another.

> > I think the reason
> > why the Lisp code is so much shorter (and, to my mind, neater)
> > is simply that Lisp is a better language than Pascal, even when
> > you add a handy new variant type to Pascal and provide some
> > features for building arrays of variants. Certainly adding
> > variants and nicer arrays is an improvement to Pascal, but
> > the deficiencies of Pascal in comparison to Lisp go so much
> > deeper than that.
> 
> Oh, in my opinion Pascal is an excellent language even without OOP,
> i.e. in its first edition by Niklaus Wirth. Representation of data
> structures based on Hoare's theory, a clean Algol-like syntax,
> structured programming; I have liked it from the beginning. But
> pointers are not a very good thing, and the processing of simple
> recursive data structures is not adequate.

If you consider Pascal -- especially its early versions -- an
excellent language, then I fear your criteria for excellence and
mine are so different that any agreement will be hard to find.

> > Perhaps. And it may be that your languages with "LISPPA technology"
> > are useful to people who are (for whatever reason) unable to program
> > in Lisp or in other languages with polymorphic arrays such as Python.
> > But here in comp.lang.lisp, that's not your audience :-).
> 
> "Recruitment" was not my goal at comp.lang.lisp :-) Besides, I know
> the first love never die. You and subscribers of the newsgroup will
> like Lisp even after I'll able to prove you that these "improvements"
> to Pascal allows to program AI application as well with the same
> success or even better than Lisp it does  ;-)

You should present your compelling evidence that your modified Pascal
makes AI programming as comfortable as Lisp does *before* you accuse
us of having minds closed to that claim, not after.

But, frankly, I think that (1) you are being entirely unfair when
you make that accusation, and (2) you will be unable to demonstrate
that Pascal+LISPPA is as good for AI programming (or indeed for
just about any sort of programming) as Lisp, because it isn't
true.

Feel free to prove me wrong, but let me remind you that the last
testable assertion you made about the relative capabilities of
Lisp and Pascal+LISPPA was that your "Compress" function couldn't
be written any more concisely in Lisp, and that I showed you how
in Lisp you could halve its length and simultaneously provide a
powerful tool that can be used elsewhere in the program -- while
also making the important bit of the "compression" algorithm
(the set of transformations it does) much easier to understand.
This doesn't encourage me to put much trust in your claims about
what LISPPA does for the expressivity of Pascal.

-- 
Gareth McCaughan
.sig under construc
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404210142.671317a0@posting.google.com>
> > > > 1. NULL (the undefined variable) is the empty list.
> > > > 2. If L is a list, then a one-dimensional array A with two
> > > > elements, A(0) and A(1) = L, is a list too.
> > > > 
> > > > We cannot produce such a concise definition of the linked list
> > > > using the standard approach based on pointers. This is the
> > > > starting point; from here LISPPA offers uniform ways to process
> > > > lists and other dynamic data structures.
> > > 
> > > I don't see why we can't produce a similarly concise definition
> > > using pointers.
> > > 
> > >   1 The null pointer represents the empty list.
> > >   2 If L is a list and X any object, then a block
> > >     of memory containing two pointers, one of which
> > >     points to X and the other of which points to L,
> > >     is a list.
> > 
> > But what is the "memory block"? Can you explain it in a few words? :-)
> > My definition does not know any "memory blocks". Besides, I can define
> > the polymorphic array without any reference to "memory blocks".
> 
> Your definition makes reference to "arrays". My definition
> makes reference to "pointers and blocks of memory". I don't
> see how mine leaves anything more unexplained than yours
> does. Mine has two unexplained concepts rather than one,
> you say? But my unexplained concepts are much simpler than
> yours. And, besides, your definition uses "=" in a way that
> either is rather subtle in its own right or else requires
> your definition of an "array" to include something equivalent
> to a definition of "pointer", to explain what it means for
> one array to be an element of another.

Oh, let's not discuss it yet again :-). You are protesting against an
obvious fact: my definition is "machine"-independent and concise;
yours is not. As for the definition of polymorphic array, please read
my answer to Kaz Kylheku.

> You should present your compelling evidence that your modified Pascal
> makes AI programming as comfortable as Lisp does *before* you accuse
> us of having minds closed to that claim, not after.

I completely agree with you. This is a subject for a big future
discussion. At the present time, I am the only programmer who uses
the modified Pascal; LISPPA has existed for only 12 days. So, we
will see ;-)

> 
> But, frankly, I think that (1) you are being entirely unfair when
> you make that accusation, 

Sorry, but I do not understand what you mean by "accusation".
Imperative languages (OK, I mean C, Pascal, Java, Basic) are more
widely used than Lisp dialects, but this is not my "accusation"; it
is a well-known fact. I am just stating that LISPPA extends the
possibilities of the mentioned imperative languages in the AI domain,
and the first results of testing prove it.

> and (2) you will be unable to demonstrate
> that Pascal+LISPPA is as good for AI programming (or indeed for
> just about any sort of programming) as Lisp, because it isn't
> true.

But you have not proved the converse :-) You show a few fragments of
your code without showing mine. If I did not discuss it, that does
not mean I agree with you here. Consider for example your statement
that your variant of the Compress procedure takes about 51% of mine.
But I can easily rewrite my code to get the inverse result if I
replace 2 lines of my code

      else if IsNeg(X2) and (not IsNeg(X1)) then
        reduced R := [OP_SUB, X1, X2[1]]
by
      else if IsNeg(X2) and (not IsNeg(X1)) then reduced R := [OP_SUB,
X1, X2[1]]

It would reduce the total number of lines by half. So, I cannot
consider your statement a serious argument :-)

Another example: you consider a function like

function IsZero(const Term: Variant): boolean;
begin
  result:= IsConstant(Term);
  if result then
    result := Term = 0;
end;

as too long. But I wrote it that way to avoid a possible problem with
short-circuit boolean evaluation. Short-circuit boolean evaluation is
the default setting in paxScript; besides, I need not declare
variants explicitly, so the function can be rewritten:

function IsZero(Term);
begin
  result:= IsConstant(Term) and (Term = 0);
end;
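
In Lisp, for comparison, AND is short-circuiting by definition, so
the corresponding one-liner is always safe (a sketch; IS-CONSTANT is
a hypothetical helper):

(defun is-zero (term)
  (and (is-constant term) (eql term 0)))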

Sorry that I need to discuss these details here. 

Further, you state that Lisp code is easier to understand. But what
do you think the opinion of a programmer in Basic, Pascal, C or Java
will be? I am not sure they will choose Lisp instead of the modified
Basic, Pascal, C or Java.

As for efficiency, detailed benchmarks are necessary.

A.

Gareth McCaughan <················@pobox.com> wrote in message news:<··············@g.mccaughan.ntlworld.com>...
> ··@cable.netlux.org (Alexander Baranovsky) writes:
> 
> > > In short: Lisp is not a functional language with some
> > > imperative bits; it is a multi-paradigm language that
> > > supports both functional and imperative styles of
> > > programming.
> > 
> > The acticle"History of Lisp" by John McCarthy 
> > 
> > http://www-formal.stanford.edu/jmc/history/lisp/node1.html
> > 
> > shows that you are not quite right :-)
> 
> Oh, sure, Lisp has functional roots. But from very early days
> it has not been a pure functional language. I'm not disputing
> that the language McCarthy introduced in an article back in
> 1959 was functional, but it took very little time after that
> for the actual Lisp language, as implemented on real computers,
> to cease to be pure functional; and, I repeat, modern Lisps
> are *not* functional languages with a little imperative stuff
> glued on.
> 
> > > > 1. NULL (undefined variable) is empty list.
> > > > 2. If L is a list, the one-dimensional array A with two elements A(0)
> > > > and A(1) = L is a list too.
> > > > 
> > > > We cannot produce such concise definition of the linked list using
> > > > standard approach based on pointers. This is the start point, further
> > > > LISPPA offers a uniform ways to process lists and other dynamic data
> > > > structures.
> > > 
> > > I don't see why we can't produce a similarly concise definition
> > > using pointers.
> > > 
> > >   1 The null pointer represents the empty list.
> > >   2 If L is a list and X any object, then a block
> > >     of memory containing two pointers, one of which
> > >     points to X and the other of which points to L,
> > >     is a list.
> > 
> > But what is the "memory block"? Can you explain it in a few words? :-)
> > My definition does not know any "memory blocks". Besides, I can define
> > the polymorphic array without any references on the "memory blocks".
> 
> Your definition makes reference to "arrays". My definition
> makes reference to "pointers and blocks of memory". I don't
> see how mine leaves anything more unexplained than yours
> does. Mine has two unexplained concepts rather than one,
> you say? But my unexplained concepts are much simpler than
> yours. And, besides, your definition uses "=" in a way that
> either is rather subtle in its own right or else requires
> your definition of an "array" to include something equivalent
> to a definition of "pointer", to explain what it means for
> one array to be an element of another.
> 
> > > I think the reason
> > > why the Lisp code is so much shorter (and, to my mind, neater)
> > > is simply that Lisp is a better language than Pascal, even when
> > > you add a handy new variant type to Pascal and provide some
> > > features for building arrays of variants. Certainly adding
> > > variants and nicer arrays is an improvement to Pascal, but
> > > the deficiencies of Pascal in comparison to Lisp go so much
> > > deeper than that.
> > 
> > Oh, in my opinion Pascal is an excellent language even without OOP,
> > i.e. in its first edition by Niklaus Wirth. Representation of data
> > structures based on Hoare's theory, clean Algol-like syntax, structured
> > programming: I have liked it from the beginning. But pointers are not a
> > very good thing; the processing of simple recursive data structures is
> > not adequate.
> 
> If you consider Pascal -- especially its early versions -- an
> excellent language, then I fear your criteria for excellence and
> mine are so different that any agreement will be hard to find.
> 
> > > Perhaps. And it may be that your languages with "LISPPA technology"
> > > are useful to people who are (for whatever reason) unable to program
> > > in Lisp or in other languages with polymorphic arrays such as Python.
> > > But here in comp.lang.lisp, that's not your audience :-).
> > 
> > "Recruitment" was not my goal at comp.lang.lisp :-) Besides, I know
> > the first love never die. You and subscribers of the newsgroup will
> > like Lisp even after I'll able to prove you that these "improvements"
> > to Pascal allows to program AI application as well with the same
> > success or even better than Lisp it does  ;-)
> 
> You should present your compelling evidence that your modified Pascal
> makes AI programming as comfortable as Lisp does *before* you accuse
> us of having minds closed to that claim, not after.
> 
> But, frankly, I think that (1) you are being entirely unfair when
> you make that accusation, and (2) you will be unable to demonstrate
> that Pascal+LISPPA is as good for AI programming (or indeed for
> just about any sort of programming) as Lisp, because it isn't
> true.
> 
> Feel free to prove me wrong, but let me remind you that the last
> testable assertion you made about the relative capabilities of
> Lisp and Pascal+LISPPA was that your "Compress" function couldn't
> be written any more concisely in Lisp, and that I showed you how
> in Lisp you could halve its length and simultaneously provide a
> powerful tool that can be used elsewhere in the program -- while
> also making the important bit of the "compression" algorithm
> (the set of transformations it does) much easier to understand.
> This doesn't encourage me to put much trust in your claims about
> what LISPPA does for the expressivity of Pascal.
From: Raymond Wiker
Subject: Re: LISPPA
Date: 
Message-ID: <86n055wo6q.fsf@raw.grenland.fast.no>
··@cable.netlux.org (Alexander Baranovsky) writes:

> Further, you state that Lisp code is easier to understand. But what
> do you think the opinion of any programmer in Basic, Pascal, C or
> Java would be? I am not sure they would choose Lisp instead of the
> modified Basic, Pascal, C or Java.

        The opinion of people who don't know Lisp does not count for
much in comp.lang.lisp, at least not when it comes to the readability
of other languages compared to Lisp.

-- 
Raymond Wiker                        Mail:  ·············@fast.no
Senior Software Engineer             Web:   http://www.fast.no/
Fast Search & Transfer ASA           Phone: +47 23 01 11 60
P.O. Box 1677 Vika                   Fax:   +47 35 54 87 99
NO-0120 Oslo, NORWAY                 Mob:   +47 48 01 11 60

Try FAST Search: http://alltheweb.com/
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404211143.3c3d2256@posting.google.com>
>         The opinion of people who don't know Lisp does not count for
> much in comp.lang.lisp, at least not when it comes to the readability
> of other languages compared to Lisp.

This is a practical programmer's opinion.

Let's suppose you are a Delphi or a Visual Basic programmer who has
been developing and supporting an application for a few years, and now
you want to add some sort of AI functionality to it. What would you
prefer: to solve the problem directly in Delphi or Visual Basic, or to
use Lisp and solve the extra problem of cross-language integration?
Now suppose Delphi or Visual Basic are extended with tools which allow
you to program AI problems with the same success as Lisp. Would you
vacillate?

A.

From: Matthew Danish
Subject: Re: LISPPA
Date: 
Message-ID: <20040421201942.GM25328@mapcar.org>
On Wed, Apr 21, 2004 at 12:43:39PM -0700, Alexander Baranovsky wrote:
> Let's suppose you are a Delphi or a Visual Basic programmer who has
> been developing and supporting an application for a few years, and now
> you want to add some sort of AI functionality to it. What would you
> prefer: to solve the problem directly in Delphi or Visual Basic, or to
> use Lisp and solve the extra problem of cross-language integration?
> Now suppose Delphi or Visual Basic are extended with tools which allow
> you to program AI problems with the same success as Lisp. Would you
> vacillate?

And you still don't quite understand what we are telling you:  lists are
not the secret to "AI solutions".  When you understand that, you will
understand why LISPPA doesn't really change anything.

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Marc Spitzer
Subject: Re: LISPPA
Date: 
Message-ID: <86ad15yszv.fsf@bogomips.optonline.net>
··@cable.netlux.org (Alexander Baranovsky) writes:

>>         The opinion of people who don't know Lisp does not count for
>> much in comp.lang.lisp, at least not when it comes to the readability
>> of other languages compared to Lisp.
>
> This is a practical programmer's opinion.
>
> Let's suppose you are a Delphi or a Visual Basic programmer who has
> been developing and supporting an application for a few years, and now
> you want to add some sort of AI functionality to it. What would you
> prefer: to solve the problem directly in Delphi or Visual Basic, or to
> use Lisp and solve the extra problem of cross-language integration?
> Now suppose Delphi or Visual Basic are extended with tools which allow
> you to program AI problems with the same success as Lisp. Would you
> vacillate?

Umm, Windows has this thing called COM, so integration is not a big
deal for CL<->VB or CL<->Delphi: just use COM.  This would also have
the advantage of allowing for the development of the components
separate from the application, which is in maintenance mode, so you do
not have to make sure that bug fixes and such get into both source
trees.  Or you could just use a socket and define a protocol to talk.

marc
From: Tim Bradshaw
Subject: Re: LISPPA
Date: 
Message-ID: <ey3vfjtrpe8.fsf@cley.com>
* Alexander Baranovsky wrote:
> Now suppose Delphi or Visual Basic are extended with tools which allow
> you to program AI problems with the same success as Lisp. Would you
> vacillate?

I don't think many people have thought that linked lists really helped
much towards solving any interesting AI problems since sometime in the
60s.

--tim
From: Raymond Wiker
Subject: Re: LISPPA
Date: 
Message-ID: <86isftvzm7.fsf@raw.grenland.fast.no>
··@cable.netlux.org (Alexander Baranovsky) writes:

>>         The opinion of people who don't know Lisp does not count for
>> much in comp.lang.lisp, at least not when it comes to the readability
>> of other languages compared to Lisp.
>
> This is a practical programmer's opinion.
>
> Let's suppose you are a Delphi or a Visual Basic programmer who has
> been developing and supporting an application for a few years, and now
> you want to add some sort of AI functionality to it. What would you
> prefer: to solve the problem directly in Delphi or Visual Basic, or to
> use Lisp and solve the extra problem of cross-language integration?
> Now suppose Delphi or Visual Basic are extended with tools which allow
> you to program AI problems with the same success as Lisp. Would you
> vacillate?

        Visual Basic and Delphi do not exist in my universe.

        Given a _real_ choice between Common Lisp and all other
languages I am familiar with, I would choose Common Lisp for any
non-trivial programming task. 

-- 
Raymond Wiker                        Mail:  ·············@fast.no
Senior Software Engineer             Web:   http://www.fast.no/
Fast Search & Transfer ASA           Phone: +47 23 01 11 60
P.O. Box 1677 Vika                   Fax:   +47 35 54 87 99
NO-0120 Oslo, NORWAY                 Mob:   +47 48 01 11 60

Try FAST Search: http://alltheweb.com/
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404211853.59587b39@posting.google.com>
>         Visual Basic and Delphi do not exist in my universe.
> 
>         Given a _real_ choice between Common Lisp and all other
> languages I am familiar with, I would choose Common Lisp for any
> non-trivial programming task.

"O God, I could be bounded in a nut-shell and count myself a king of
infinite space..." (Ham., Act 2, Scene 2)

:-)

A.

From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87isfs1hih.fsf@g.mccaughan.ntlworld.com>
··@cable.netlux.org (Alexander Baranovsky) writes:

[someone else:]
> >         Visual Basic and Delphi do not exist in my universe.
> > 
> >         Given a _real_ choice between Common Lisp and all other
> > languages I am familiar with, I would choose Common Lisp for any
> > non-trivial programming task.

[Alexander, quoting:]
> "O God, I could be bounded in a nut-shell and count myself a king of
> infinite space..." (Ham., Act 2, Scene 2)

So, it's official: Shakespeare invented garbage collection.

-- 
Gareth McCaughan
.sig under construc
From: Gareth McCaughan
Subject: Re: LISPPA
Date: 
Message-ID: <87r7uh0wt0.fsf@g.mccaughan.ntlworld.com>
Alexander Baranovsky wrote:

>>> But what is the "memory block"? Can you explain it in a few words? :-)
>>> My definition does not mention any "memory blocks". Besides, I can define
>>> the polymorphic array without any reference to "memory blocks".
>> 
>> Your definition makes reference to "arrays". My definition
>> makes reference to "pointers and blocks of memory". I don't
>> see how mine leaves anything more unexplained than yours
>> does. Mine has two unexplained concepts rather than one,
>> you say? But my unexplained concepts are much simpler than
>> yours. And, besides, your definition uses "=" in a way that
>> either is rather subtle in its own right or else requires
>> your definition of an "array" to include something equivalent
>> to a definition of "pointer", to explain what it means for
>> one array to be an element of another.
> 
> Oh, let's not discuss it once again :-). You are protesting against an
> obvious fact: my definition is "machine"-independent and concise,
> yours is not. As for the definition of the polymorphic array, please
> read my answer to Kaz Kylheku.

If you want a concise, clean, machine-independent definition,
here is one:

    A list is either the-empty-list or an ordered pair whose
    second element is a list.

You can build ordered pairs out of "polymorphic arrays",
as in your LISPPA languages. You can have a special data
type for them, as in Lisp. You can build them out of
"structures" or "records", as you would do in plain ol'
C or Pascal. (The lack of variant types would hurt there.)
You can build them out of pointers and memory locations.
There are dozens of implementations of ordered pairs
on top of other substrates at various levels of abstraction.
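
For concreteness, here is a minimal Common Lisp sketch of the
record-based substrate (the names are arbitrary, purely illustrative):

    ;; An ordered pair built from a record type; NIL plays the role
    ;; of the-empty-list.
    (defstruct (pair (:constructor pair (head tail)))
      head
      tail)

    ;; A direct transcription of the definition above: a list is either
    ;; the-empty-list or a pair whose second element is a list.
    (defun list-like-p (x)
      (or (null x)
          (and (pair-p x) (list-like-p (pair-tail x)))))

    ;; (list-like-p (pair 1 (pair 2 nil))) => T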

>> You should present your compelling evidence that your modified Pascal
>> makes AI programming as comfortable as Lisp does *before* you accuse
>> us of having minds closed to that claim, not after.
...
>> But, frankly, I think that (1) you are being entirely unfair when
>> you make that accusation, 
> 
> Sorry, but I do not understand what you mean by "accusation". 

You said that the inhabitants of comp.lang.lisp (and me in
particular, for some reason) "will like Lisp even after"
you've proved that Pascal+LISPPA is just as good as Lisp.
Now, it's not quite clear what you meant by that. If you
really meant only "will like Lisp" then obviously that's
true but it's also completely uninformative and irrelevant;
so I assumed that you meant something less trivial, namely
that we would still *prefer* Lisp to your slightly improved
version of Pascal. Which is, so far as I can see, the same
as saying that we have closed minds and will not accept the
superiority of your approach even once you've demonstrated
it.

> Imperative languages (ok, I mean C, Pascal, Java, Basic) are more
> widely used than Lisp clones, but this is not my "accusation", it is a
> well-known fact. I'm just stating that LISPPA extends the possibilities
> of the mentioned imperative languages in the AI domain, and the first
> results of testing prove it.

To that, I can only reply as I have before, that what makes
Lisp suitable for solving difficult problems such as those
that arise in AI is much, much more than the ability to make
lists of diversely typed objects.

> > and (2) you will be unable to demonstrate
> > that Pascal+LISPPA is as good for AI programming (or indeed for
> > just about any sort of programming) as Lisp, because it isn't
> > true.
> 
> But you have not proved the inverse :-) You show a few fragments of
> your code without showing mine.

You've already shown yours: it's on your web page.

>                            If I did not discuss it, it does not mean
> that I agree with you here. Consider for example your statement that
> your variant of the Compress procedure takes about 51% of mine. But I can
> easily rewrite my code to get the inverse result if I replace two
> lines of my code
> 
>       else if IsNeg(X2) and (not IsNeg(X1)) then
>         reduced R := [OP_SUB, X1, X2[1]]
> by
>       else if IsNeg(X2) and (not IsNeg(X1)) then reduced R := [OP_SUB,
> X1, X2[1]]
> 
> That would halve the total number of lines. So, I cannot consider your
> statement a serious argument :-)

If the lines of my code were much longer than the lines of yours
then that might be a sensible response. But it seems to me that
turning

    else if IsMult(X1) then
    begin
    ...
      else if IsPow(X1[1]) and (X1[1][1] = X2) then
        reduced R := [OP_MULT, X1[2], [OP_POW, X2, [OP_ADD, X1[1][2], 1]]]
    ...
    end

into

    (('* ('* ('expt x a) y) x) `(* ,y (expt ,x (+ ,a 1))))

is an unambiguous improvement in conciseness and not merely
a result of jamming multiple things onto a single line. By
the way, it was 55%, not 51%.
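
For concreteness, here is a hand-rolled Common Lisp sketch of just that
one rule, without the general pattern-matching macro (the function name
is arbitrary; this is an illustration only, not the code from my
earlier post):

    ;; Recognize (* (* (expt x a) y) x) and rewrite it to
    ;; (* y (expt x (+ a 1))); return anything else unchanged.
    (defun rewrite-pow-mult (term)
      (if (and (consp term) (eq (first term) '*)
               (consp (second term)) (eq (first (second term)) '*)
               (consp (second (second term)))
               (eq (first (second (second term))) 'expt)
               (equal (second (second (second term))) (third term)))
          (let ((x (third term))
                (a (third (second (second term))))
                (y (third (second term))))
            (list '* y (list 'expt x (list '+ a 1))))
          term))

    ;; (rewrite-pow-mult '(* (* (expt x 2) y) x))
    ;; => (* Y (EXPT X (+ 2 1)))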

Anyway: If you don't consider line-counting a reasonable way
of comparing the conciseness of two pieces of code, that's
fine. Why not propose another measure? Would you rather count
characters? Tokens? Syllables? I'll be surprised if you find
any measure according to which your LISPPA code -- and remember,
this is the code you specifically *chose* as a shining example
of the virtues of LISPPA -- is less than 1.5 times the size of
my Lisp code.

> Another example: you consider a function like
> 
> function IsZero(const Term: Variant): boolean;
> begin
>   result:= IsConstant(Term);
>   if result then
>     result := Term = 0;
> end;
> 
> as too long.

No: I consider it *redundant*. 

> But I wrote it that way to avoid a possible problem
> with short boolean evaluation. Short boolean evaluation is on by
> default in paxScript, and besides, I do not have to declare variants
> explicitly, so the function can be rewritten
> 
> function IsZero(Term);
> begin
>   result:= IsConstant(Term) and (Term = 0);
> end;
> 
> Sorry that I need to discuss these details here. 
> 
> Further, you state that Lisp code is easier to understand.

No, I don't state that. I said nothing about the comprehensibility
of Lisp *in general* as compared to that of Pascal *in general*.
And, further, I'll gladly admit that the first part of my code
(the general-purpose pattern-matching macro) is pretty obscure.
When I said that my code is "much easier to read", I misspoke;
I should have made that claim only for the mathematically
interesting part, namely the pattern-transforming bit.

>                                                               But what
> do you think the opinion of any programmer in Basic, Pascal, C or
> Java would be? I am not sure they would choose Lisp instead of the
> modified Basic, Pascal, C or Java.

Well, duh, someone who is used to language X and has never
seen language Y is likely to find code in language X easier
to read. What you need to compare is how readable the Lisp
code is for someone who knows Lisp, versus how readable the
Pascal code is for someone who knows Pascal. Now, it just
so happens that I know both Pascal and Lisp, and I say the
Lisp is more readable :-).

-- 
Gareth McCaughan
.sig under construc
From: David Steuber
Subject: Re: LISPPA
Date: 
Message-ID: <87n054ih3n.fsf@david-steuber.com>
Gareth McCaughan <················@pobox.com> writes:

> Anyway: If you don't consider line-counting a reasonable way
> of comparing the conciseness of two pieces of code, that's
> fine. Why not propose another measure? Would you rather count
> characters? Tokens? Syllables? I'll be surprised if you find
> any measure according to which your LISPPA code -- and remember,
> this is the code you specifically *chose* as a shining example
> of the virtues of LISPPA -- is less than 1.5 times the size of
> my Lisp code.

Measuring code complexity (or size) is a slightly thorny issue.  It
gets much hairier when you try to measure programmer productivity by
some derivative function, i.e. LOC/day.

One measure I thought might be reasonable for code complexity would be
the number of expressions in the code.  In C, I would probably cheat a
little and just count the number of #\; occurrences in the code, while
in Lisp I would count the number of #\( occurrences.  It's not a
foolproof measure, but it might be a nearly fair way to compare.
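
A rough Common Lisp sketch of that measure (illustrative only):

    ;; Count occurrences of a marker character: #\( for Lisp source,
    ;; #\; for C-like source.  A crude proxy for "number of expressions".
    (defun expression-count (source &key (marker #\())
      (count marker source))

    ;; (expression-count "(defun f (x) (* x x))")                  => 3
    ;; (expression-count "int f(int x){return x*x;}" :marker #\;)  => 1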

You could go for character counting for fine granularity but I don't
think people think that way.  They think more along the lines of
phrases and sentences.  Also, word completion is a feature in many
editors, so token length doesn't seem like such a big deal.

It has already been mentioned in this thread that lists are not what
makes Lisp such a great language.  Common Lisp is the sum of all its
features.  I would say that is true of any language.  I may be more
productive in C + Perl now, but I don't think that will last.  The
available features in ANSI Common Lisp give it a good leg up on the
competition.  I am also noticing that implementation-portable CL
libraries are growing in number and diversity.

I get the feeling that the popularity of Lisp is on an uptick.  Maybe
it won't be the next Java in terms of users; I think that would be a
loss to the world, regardless of what some have said about popularity
in the past.  At least improved popularity will mean better
implementations and more libraries.

Just one other note.  AI has been mentioned a number of times in this
thread.  AI is actually not why I am interested in Lisp at all.  My
interest in Lisp comes from my perception that it offers programmers
at least as good productivity as VHLL scripting languages, with
performance and flexibility on par with performance-oriented compiled
languages.  Those sound like compelling features to me.

In one of the MIT lectures on dynamic languages that was posted here a
little while back (those MOV files with the 'wizards'), the moderator
mentioned not having a language that allowed him to do both music
composition and DSP at the same time.  That is, he needed two separate
tools for the high-level composing and the low-level DSP stuff.  Maybe
Lisp can handle DSP.  I haven't looked into that yet.  But there seems
to be no shortage of high-performance applications written in Lisp.

I also expect that in spite of Moore's Law, performance will become an
important feature in Lisp implementations.  There will always be
applications, including rather complex ones, that need every clock
cycle they can get at runtime.

-- 
I wouldn't mind the rat race so much if it wasn't for all the damn cats.
From: Raymond Wiker
Subject: Re: LISPPA
Date: 
Message-ID: <86smewv3jm.fsf@raw.grenland.fast.no>
David Steuber <·····@david-steuber.com> writes:

> In one of the MIT lectures on dynamic languages that was posted here a
> little while back (those MOV files with the 'wizards'), the moderator
> mentioned not having a language that allowed him to do both music
> composition and DSP at the same time.  That is, he needed two separate
> tools for the high-level composing and the low-level DSP stuff.  Maybe
> Lisp can handle DSP.  I haven't looked into that yet.  But there seems
> to be no shortage of high-performance applications written in Lisp.

        Lisp *can* do DSP... CL Music and Audacity (via the "Nyquist"
subsystem) are two examples.

-- 
Raymond Wiker                        Mail:  ·············@fast.no
Senior Software Engineer             Web:   http://www.fast.no/
Fast Search & Transfer ASA           Phone: +47 23 01 11 60
P.O. Box 1677 Vika                   Fax:   +47 35 54 87 99
NO-0120 Oslo, NORWAY                 Mob:   +47 48 01 11 60

Try FAST Search: http://alltheweb.com/
From: rif
Subject: Re: LISPPA
Date: 
Message-ID: <wj0vfjsceh8.fsf@five-percent-nation.mit.edu>
Raymond Wiker <·············@fast.no> writes:

> David Steuber <·····@david-steuber.com> writes:
> 
> > In one of the MIT lectures on dynamic languages that was posted here a
> > little while back (those MOV files with the 'wizards'), the moderator
> > mentioned not having a language that allowed him to do both music
> > composition and DSP at the same time.  That is, he needed two separate
> > tools for the high-level composing and the low-level DSP stuff.  Maybe
> > Lisp can handle DSP.  I haven't looked into that yet.  But there seems
> > to be no shortage of high-performance applications written in Lisp.
> 
>         Lisp *can* do DSP... CL Music and Audacity (via the "Nyquist"
> subsystem) are two examples.
> 

I've been using CL for most of my DSP needs for a while.  I've written
an audio feature-extraction front end (extracting cepstral features
for speech applications) and some voice activity detection
algorithms.  I coded up a usably fast FFT in a few hours, although now
I use FFTW, called via CMUCL's FFI.

Cheers,

rif
From: Kenny Tilton
Subject: Re: LISPPA
Date: 
Message-ID: <TRtic.62038$WA4.12927@twister.nyc.rr.com>
Raymond Wiker wrote:

> David Steuber <·····@david-steuber.com> writes:
> 
> 
>>In one of the MIT lectures on dynamic languages that was posted here a
>>little while back (those MOV files with the 'wizards'), the moderator
>>mentioned not having a language that allowed him to do both music
>>composition and DSP at the same time.  That is, he needed two separate
>>tools for the high-level composing and the low-level DSP stuff.  Maybe
>>Lisp can handle DSP.  I haven't looked into that yet.  But there seems
>>to be no shortage of high-performance applications written in Lisp.
> 
> 
>         Lisp *can* do DSP... CL Music and Audacity (via the "Nyquist"
> subsystem) are two examples.
> 

I am not disagreeing with "Lisp can do DSP", but CLM's definstrument 
writes out a C file, compiles it, links it into a dynamic shared 
library, then loads it. Even then it just writes out a MIDI or WAV 
file, which one plays using a utility program. OTOH, someone has 
written a Scheme implementation of SuperCollider's SCLang, which talks 
to the SuperCollider server to generate sound in real time.

kenny



-- 
Home? http://tilton-technology.com
Cells? http://www.common-lisp.net/project/cells/
Cello? http://www.common-lisp.net/project/cello/
Why Lisp? http://alu.cliki.net/RtL%20Highlight%20Film
Your Project Here! http://alu.cliki.net/Industry%20Application
From: Kaz Kylheku
Subject: Re: LISPPA
Date: 
Message-ID: <cf333042.0404201214.1a712e35@posting.google.com>
··@cable.netlux.org (Alexander Baranovsky) wrote in message news:<····························@posting.google.com>...
> LISPPA (List Processing based on the Polymorphic Arrays) technology
> provides a way to process dynamic data structures (lists, trees and
> more) without using pointers. LISPPA uses polymorphic arrays as a base
> of  the data representation.

This contains a fallacy. When you are working with arrays, you are in
fact working with pointers. An array index is a pointer. Conversely,
when you are working with pointers, you are working with an array: the
address space.

The differences are largely notational. For instance, the well-known
textbook _Introduction to Algorithms_ by Cormen, Leiserson, Rivest &
Stein uses array notation throughout. Given some object X, such as a
tree node, the pseudo-code language uses expressions like:

   left[parent[x]]

which is really just different syntax for x.parent.left or (left
(parent x)) or what have you.
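
To make the equivalence concrete, here is a tiny Common Lisp sketch of
a tree held in parallel vectors, where a "pointer" is just an index
(made-up data, for illustration only):

    ;; Node 0 is the root; NIL stands in for the null pointer.
    (defparameter *left*   (vector 1 nil nil))
    (defparameter *right*  (vector 2 nil nil))
    (defparameter *parent* (vector nil 0 0))

    ;; The textbook's left[parent[x]], transcribed directly:
    (defun left-of-parent (x)
      (svref *left* (svref *parent* x)))

    ;; (left-of-parent 2) => 1 -- the index dereferences like a pointer.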

I once had to work in a scripting language with no dynamic data
structures, just arrays. I hacked up dynamic binary search trees in it
anyway.

Once you do that, all the problems associated with pointers crop up in
your array indices: leaks, danglings, out of bounds, etc.
From: Alexander Baranovsky
Subject: Re: LISPPA
Date: 
Message-ID: <27d7ccd8.0404202343.6dce69a6@posting.google.com>
> This contains a fallacy. When you are working with arrays, you are in
> fact working with pointers. An array index is a pointer. Conversely,
> when you are working with pointers, you are working with an array: the
> address space.

Ok. Once again:

A polymorphic array A is an ordered sequence of n elements [E0, E1,
..., E(n-1)]. The bijective operator [], defined on the set of indices
0, 1, ..., n - 1, returns the corresponding element, so A[i] = Ei.
Each array element is either an unassigned value (null), a number, a
string literal, true/false, or a polymorphic array.

The linked list then has the following definition:

1. null is a list.
2. If L is a list, a polymorphic array A with two elements [E0, E1],
where E0 is any value and E1 is L, is a list too.

Where do you see "pointers" and "address space" here? 

> Once you do that, all the problems associated with pointers crop up in
> your array indices: leaks, danglings, out of bounds, etc.

Before stating that, you have to prove it. My article shows that all
the well-known dynamic data structures (linked lists, trees, linked
stacks and queues) can be processed effectively using concise and
uniform expressions. For example, insertion into a linked list can be
expressed by one statement:

P = [NewItem, P]

The position of insertion does not matter: at the head, at the tail or
in the middle of the list, you use the statement above in all these
cases.

You delete an item from the list by means of the statement

reduced P = P[1]

Again, it does not matter where you delete an element: at the head, at
the tail or in the middle of the list. The statement does not produce
garbage.
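
For comparison, the direct Common Lisp analogues of these two
statements in the head-of-list case would be (a sketch; the variable
names are arbitrary):

    (defvar *p* nil)                  ; the empty list
    (setf *p* (cons 'new-item *p*))   ; P = [NewItem, P]  -- insertion
    (setf *p* (rest *p*))             ; reduced P = P[1]  -- deletion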

Where do you see unsafe programming?

A.

From: Ray Dillinger
Subject: Re: LISPPA
Date: 
Message-ID: <4085548A.9D990C97@sonic.net>
Alexander Baranovsky wrote:
> 
> LISPPA (List Processing based on the Polymorphic Arrays) technology
> provides a way to process dynamic data structures (lists, trees and
> more) without using pointers. LISPPA uses polymorphic arrays as a base
> of  the data representation.
> 
> LISPPA considerably extends the applicability of imperative
> programming languages in the symbolic computations. So, now you can
> use Pascal, Basic or C languages in the "Lisp data domain".
> 

APL did this a lot better, actually.

			Bear