From: Shayne Wissler
Subject: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <xNX_8.93191$Wt3.66181@rwcrnsc53>
At http://www.paulgraham.com/diff.html, Paul Graham says that the feature of
having the "whole language always available" is unique to lisp.

With lisp, he iterates that you can:
 1) Compile and run code while reading
 2) Read and run code while compiling
 3) Read and compile code at run-time

Although I'm clear on what is technologically possible with respect to
having a compiler available at run-time, I'm not precisely clear on what
Paul means in the above. I *think* he means:

1. While parsing a lisp program, you can compile and invoke some of the code
you already parsed to help parse the subsequent code, allowing you to modify
the syntax on the fly. If this is correct, what are the limitations? Can I
write a lisp program that starts out as lisp and starts looking like C++?

2. While compiling a lisp program, you can execute special-purpose code that
makes semantic transformations of the lisp executable. When you see a given
input program, you can generate whatever code you want as a function of it.
This is called "macros" in lisp, but it is nothing like textual macros;
really it's a program generating another program at the binary or byte-code
level.

3. A running lisp program can read (or write), compile, and execute any
program, in the current program's execution context. I.e., if the newly
loaded program makes references to variables and functions that are only
defined in the original program, they are dynamically linked and accessible
as the program is compiled.
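To make point 3 concrete, here is roughly the kind of thing I have in
mind (I'm guessing at the exact functions, so corrections welcome):

(defvar *scale* 10)

(defun run-user-code (string)
  ;; Read a form at run time and compile it in the current image,
  ;; where it can see *SCALE* and anything else defined so far.
  (funcall (compile nil `(lambda () ,(read-from-string string)))))

;; (run-user-code "(* *scale* 3)")  => 30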

A few questions:
 - Have I got it right?
 - What are the limitations?
 - What other languages have at least one of the above?
 - Is it really true that no other language has all of the above?


Shayne Wissler

From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87znwjy6qv.fsf@becket.becket.net>
"Shayne Wissler" <·········@yahoo.com> writes:

> A few questions:
>  - Have I got it right?
>  - What are the limitations?
>  - What other languages have at least one of the above?
>  - Is it really true that no other language has all of the above?

Smalltalk, self, and other such systems have also always had those
capabilities.
From: Topmind
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <MPG.17a6019f5814ef5b98a6c9@news.earthlink.net>
> "Shayne Wissler" <·········@yahoo.com> writes:
> 
> > A few questions:
> >  - Have I got it right?
> >  - What are the limitations?
> >  - What other languages have at least one of the above?
> >  - Is it really true that no other language has all of the above?
> 
> Smalltalk, self, and other such systems have also always had those
> capabilities.
> 

Most scripting languages that have operations like Eval(), Execute(),
and Include() also have a similar dynamic feel. (Some argue that
these lack closures and thus certain scoping power, but I don't want to 
get into that argument yet again.)

-T-
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6csn2bzfcz.fsf@octagon.mrl.nyu.edu>
·······@technologist.com (Topmind) writes:

> > "Shayne Wissler" <·········@yahoo.com> writes:
> > 
> > > A few questions:
> > >  - Have I got it right?
> > >  - What are the limitations?
> > >  - What other languages have at least one of the above?
> > >  - Is it really true that no other language has all of the above?
> > 
> > Smalltalk, self, and other such systems have also always had those
> > capabilities.
> > 
> 
> Most scripting languages that have operations like Eval(), Execute(),
> and Include() also have a similar dynamic feel. (Some argue that
> these lack closures and thus certain scoping power, but I don't want to 
> get into that argument yet again.)

Yes.  But the main difference with (Common) Lisp is that it has all
the mentioned features available all the time.  AFAIK only Prolog and
Smalltalk come close.  The so-called scripting languages cannot really
deal with their own code directly and as nicely as CL does, since
their basic data structure for manipulating this sort of thing is a
string.  Besides, the function `compile' in many CL implementations
produces x86, Sparc, MIPS, HPPA, PPC and other assembly du jour.

... of course, you can always take your favourite scripting language
and apply for the nth time Greenspun's Tenth Rule of Programming to
it. :)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Topmind
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <MPG.17a619bc32aaa1a098a6cb@news.earthlink.net>
> 
> ·······@technologist.com (Topmind) writes:
> 
> > > "Shayne Wissler" <·········@yahoo.com> writes:
> > > 
> > > > A few questions:
> > > >  - Have I got it right?
> > > >  - What are the limitations?
> > > >  - What other languages have at least one of the above?
> > > >  - Is it really true that no other language has all of the above?
> > > 
> > > Smalltalk, self, and other such systems have also always had those
> > > capabilities.
> > > 
> > 
> > Most scripting languages that have operations like Eval(), Execute(),
> > and Include() also have a similar dynamic feel. (Some argue that
> > these lack closures and thus certain scoping power, but I don't want to 
> > get into that argument yet again.)
> 
> Yes.  But the main difference with (Common) Lisp is that this has all
> the mentioned features available all the time.  AFAIK only Prolog and
> Smalltalk come close.  The so-called scripting languages cannot really
> deal with their own code directly and as nicely as CL does, since
> their basic data structure to manipulate this sort of things is a
> string.  Besides, the function `compile' in many CL implementations
> produces x86, Sparc, MIPS, HPPA, PPC and other assembly du jour.

Are you saying that it runs faster, or that it makes
for 'nicer code'?

> 
> ... of course, you can always take your favourite scripting language
> and apply for the nth time Greenspun's Tenth Rule of Programming to
> it. :)
> 
> Cheers
> 
> -- 

-T-
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cn0sjz8zv.fsf@octagon.mrl.nyu.edu>
·······@technologist.com (Topmind) writes:

        ...

> > > Most scripting languages that have operations like Eval(), Execute(),
> > > and Include() also have a similar dynamic feel. (Some argue that
> > > these lack closures and thus certain scoping power, but I don't want to 
> > > get into that argument yet again.)
> > 
> > Yes.  But the main difference with (Common) Lisp is that this has all
> > the mentioned features available all the time.  AFAIK only Prolog and
> > Smalltalk come close.  The so-called scripting languages cannot really
> > deal with their own code directly and as nicely as CL does, since
> > their basic data structure to manipulate this sort of things is a
> > string.  Besides, the function `compile' in many CL implementations
> > produces x86, Sparc, MIPS, HPPA, PPC and other assembly du jour.
> 
> Are you saying that it runs faster, or that it makes
> for 'nicer code'?

Both. First of all it runs as fast as any assembly code produced by a
native compiler (i.e. not a byte compiler, given all the necessary
qualifications).  Second, (Common) Lisp machinery to deal with
Lisp-like languages (hence itself) is simply unparalleled in other
environments (save maybe, AFAIK, Smalltalk and Prolog environments).

Current "scripting languages" simply do not come close both in
manipulation easiness and expressiveness *and* speed of the generated
code.

> > ... of course, you can always take your favourite scripting language
> > and apply for the nth time Greenspun's Tenth Rule of Programming to
> > it. :)

... and the above still stands :)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Topmind
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <MPG.17a6431c1f00e09898a6cd@news.earthlink.net>
> 
> ·······@technologist.com (Topmind) writes:
> 
>         ...
> 
> > > > Most scripting languages that have operations like Eval(), Execute(),
> > > > and Include() also have a similar dynamic feel. (Some argue that
> > > > these lack closures and thus certain scoping power, but I don't want to 
> > > > get into that argument yet again.)
> > > 
> > > Yes.  But the main difference with (Common) Lisp is that this has all
> > > the mentioned features available all the time.  AFAIK only Prolog and
> > > Smalltalk come close.  The so-called scripting languages cannot really
> > > deal with their own code directly and as nicely as CL does, since
> > > their basic data structure to manipulate this sort of things is a
> > > string.  Besides, the function `compile' in many CL implementations
> > > produces x86, Sparc, MIPS, HPPA, PPC and other assembly du jour.
> > 
> > Are you saying that it runs faster, or that it makes
> > for 'nicer code'?
> 
> Both. First of all it runs as fast as any assembly code produced by a
> native compiler (i.e. not a byte compiler, given all the necessary
> qualifications).  Second, (Common) Lisp machinery to deal with
> Lisp-like languages (hence itself) is simply unparalled in other
> environments (save maybe, AFAIK, Smalltalk and Prolog environments).


Do you have a specific semi-realistic example of
this by chance?


> 
> Current "scripting languages" simply do not come close both in
> manipulation easiness and expressiveness *and* speed of the generated
> code.

I am not questioning the machine speed issue here. 
I'll leave that to others who like to play with
stopwatches.

> 
> > > ... of course, you can always take your favourite scripting language
> > > and apply for the nth time Greenspun's Tenth Rule of Programming to
> > > it. :)
> 
> ... and the above still stands :)
> 
> Cheers
> 
> -- 
> Marco Antoniotti ========================================================
> NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
> 719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
> New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
>                     "Hello New York! We'll do what we can!"
>                            Bill Murray in `Ghostbusters'.
> 

-T-
From: Bulent Murtezaoglu
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87ofcznu2j.fsf@nkapi.internal>
>>>>> "Topmind" == Topmind  <·······@technologist.com> writes:

[... Attribution lost here by topmind, but probably Marco A. ]
    >> Both. First of all it runs as fast as any assembly code
    >> produced by a native compiler (i.e. not a byte compiler, given
    >> all the necessary qualifications).  Second, (Common) Lisp
    >> machinery to deal with Lisp-like languages (hence itself) is
    >> simply unparalled in other environments (save maybe, AFAIK,
    >> Smalltalk and Prolog environments).


    Topmind> Do you have a specific semi-realistic example of this by
    Topmind> chance?


Several come to mind; I hope others can come up with more:

Check out Paul Graham's On Lisp (free download)
http://www.paulgraham.com/onlisp.html for macrology.  Chapter 25 builds a
toy OO-Lisp with a few pages of code in Lisp.

For writing mini lispy languages which you can then compile, Peter Norvig's
Paradigms of Artificial Intelligence Programming has extensively worked-out
examples.  http://www.norvig.com/paip.html

As for calling the compiler at run time for efficiency, there should be
examples posted in the group, but I could only find what I posted a while
back:

http://groups.google.com/groups?oi=djq&selm=an_269334305

cheers,

BM
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6c3cua7c25.fsf@octagon.mrl.nyu.edu>
·······@technologist.com (Topmind) writes:

        ...

> > Both. First of all it runs as fast as any assembly code produced by a
> > native compiler (i.e. not a byte compiler, given all the necessary
> > qualifications).  Second, (Common) Lisp machinery to deal with
> > Lisp-like languages (hence itself) is simply unparalled in other
> > environments (save maybe, AFAIK, Smalltalk and Prolog environments).
> 
> 
> Do you have a specific semi-realistic example of
> this by chance?

CLOS? :)  Is that realistic enough?

> > Current "scripting languages" simply do not come close both in
> > manipulation easiness and expressiveness *and* speed of the generated
> > code.
> 
> I am not questioning the machine speed issue here. 
> I'll leave that to others who like to play with
> stopwatches.

To extend Python in any significant way, you need to hack in the
non-Python portions of the release.  To extend (Common) Lisp you hack
(Common) Lisp itself.

A case in point.  Not many people know Setl, but it is a very
interesting high level language that deals with "sets" and that gives
you a "set-based" notation to express your algorithms.  E.g. you can
write things like

        {x in [1 .. 100] | oddp(x) }

This is similar to the "list comprehensions" available in several older
functional languages (Haskell comes to mind), and recently adopted by
Python and (I believe) Ruby.  (Setl's machinery is more powerful, but
that is beside the point.)  Now, to have this notation in your
scripting language du jour, you have to jump through a lot of hoops and
eventually hack the parser and the C/C++ code underneath.

The above notation is rendered in Common Lisp as

        {x in (range 1 100) / (oddp x) }

A better example is a generator of the tuple (denoted by '[' and ']')
of prime numbers from 1 to `max'.

==============================================================================
(in-package "SETL-USER")

(defun primes (max)
  [n in (range 3 max 2)
      / (not (exist m in (range 3 (min (1- n) (+ 2 (sqrt n))) 2)
                 / (= (mod n m) 0)))])
==============================================================================

The above program is perfectly legal CL and it compiles (I repeat: it
compiles! And for people who did not know this: it compiles :) ) down
to x86, Sparc, MIPS and whatever CPU is currently supported by one of
the CL implementations.

The code to support this sort of extension is pure Common Lisp, and it
is (to my regret :} ) slightly more than 300 lines including comments.
Adding a few more lines will allow you to change the 'range'
expressions and to reuse the INFIX package.
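
Just to give the flavor of how such a thing can be done (this is *not*
the actual SETL-USER code, and it only builds lists, but it shows where
the syntax comes from): a reader macro on the open brace is enough to
turn the set former into ordinary Lisp.

        (defun range (lo hi &optional (step 1))
          (loop for i from lo to hi by step collect i))

        (set-macro-character #\} (get-macro-character #\)))

        (set-macro-character #\{
          (lambda (stream char)
            (declare (ignore char))
            ;; {X IN <range-form> / <test-form>} becomes a LOOP.
            (destructuring-bind (var in-word range-form slash test-form)
                (read-delimited-list #\} stream t)
              (declare (ignore in-word slash))
              `(loop for ,var in ,range-form
                     when ,test-form collect ,var))))

        ;; After that, {x in (range 1 100) / (oddp x)} reads as
        ;; (LOOP FOR X IN (RANGE 1 100) WHEN (ODDP X) COLLECT X)
        ;; and compiles like any other Lisp.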

So the bottom line is: to the best of my knowledge, only CL (and
probably Prolog and Smalltalk) gives you all you need all the time.
No scripting language du jour gets you that much.

Having said that, do I program in CL all the time?  No. I use what I have
to use given the task at hand.  It's just that there are very few tasks
unsuited to CL :)

... and Greenspun's Tenth Rules! :)

Cheers


-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Dave Bakhash
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <c29it37pq1v.fsf@nerd-xing.mit.edu>
"Shayne Wissler" <·········@yahoo.com> writes:

> With lisp, he iterates that you can:
>  3) Read and compile code at run-time
           ^^^^^^^^^^^
>  - Have I got it right?

Not for the commercial implementations.

>  - What are the limitations?

Some implementations (e.g. ACL, LispWorks) limit the use of the compiler
at run-time in delivered applications.  You can compile a form, but not a
file.

dave
From: Carl Shapiro
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ouyptxf8chr.fsf@panix3.panix.com>
Dave Bakhash <·····@alum.mit.edu> writes:

> "Shayne Wissler" <·········@yahoo.com> writes:

> >  - What are the limitations?
> 
> Some implementations (e.g. ACL, LispWorks) limit the use of the compiler
> at run-time in delivered applications.  You can compile a form, but not
> file.  

This is merely an economic issue, not a practical one.
From: Thaddeus L Olczyk
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <u40pjugo0cvpnhhrnn7i74nu01u6nvf01s@4ax.com>
On 22 Jul 2002 14:37:16 -0400, Dave Bakhash <·····@alum.mit.edu>
wrote:

>"Shayne Wissler" <·········@yahoo.com> writes:
>
>> With lisp, he iterates that you can:
>>  3) Read and compile code at run-time
>           ^^^^^^^^^^^
>>  - Have I got it right?
>
>Not for the commercial implementations.
>
>>  - What are the limitations?
>
>Some implementations (e.g. ACL, LispWorks) limit the use of the compiler
>at run-time in delivered applications.  You can compile a form, but not
>file.  
>
Actually, I believe that they require a different license (to be read
as "more expensive") if you want to deliver the full compiler.


As to the original question, this feature is not unique. Read Eric
Raymond's article in support of Python, where he points this out as one
of Python's better features (and gives the example of fetchmailrc).
From: Topmind
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <MPG.17a6426689651bcd98a6cc@news.earthlink.net>
> On 22 Jul 2002 14:37:16 -0400, Dave Bakhash <·····@alum.mit.edu>
> wrote:
> 
> >"Shayne Wissler" <·········@yahoo.com> writes:
> >
> >> With lisp, he iterates that you can:
> >>  3) Read and compile code at run-time
> >           ^^^^^^^^^^^
> >>  - Have I got it right?
> >
> >Not for the commercial implementations.
> >
> >>  - What are the limitations?
> >
> >Some implementations (e.g. ACL, LispWorks) limit the use of the compiler
> >at run-time in delivered applications.  You can compile a form, but not
> >file.  
> >
> Actually, I believe that they require a different license ( to be read
> as "more expensive" ) if you want to deliver the full compiler.
> 
> 
> As to the original question, this feature is not unique. Read Eric
> Raymonds support of Python, where he points this out as one of Pythons
> better feature ( and gives the example of fetchmailrc ).
> 

Is this the article?

http://www.linuxjournal.com/article.php?sid=3882

I am not an email expert, but it looks like he is bragging about how easily
Python let him reinvent a database for user configurations.  His classes
would be relational tables if I had my way (if anybody by chance cares).

Quote: "An important measure of effort in coding is the frequency with 
which you write something that doesn't actually match your mental 
representation of the problem, and have to backtrack...."

He is assuming everybody thinks like him. Common fallacy among 
many programmers.

WRT LISP, it appears that he is complaining about the library
docs and user fragmentation, and not really the language itself.

-T-
From: Johan Kullstam
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3d6tfmwio.fsf@sysengr.res.ray.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> At http://www.paulgraham.com/diff.html, Paul Graham says that the feature of
> having the "whole language always available" is unique to lisp.
> 
> With lisp, he iterates that you can:
>  1) Compile and run code while reading
>  2) Read and run code while compiling
>  3) Read and compile code at run-time
> 
> Although I'm clear on what is technologically possible with respect to
> having a compiler available at run-time, I'm not precisely clear on what
> Paul means in the above. I *think* he means:
> 
> 1. While parsing a lisp program, you can compile and invoke some of the code
> you already parsed to help parse the subsequent code, allowing you to modify
> the syntax on the fly. If this is correct, what are the limitations? Can I
> write a lisp program that starts out as lisp and starts looking like
> C++?

I am not sure that you could easily warp Lisp's reader into a C++
parser.  Lisp has a very simple and regular syntax which makes reading
and parsing the Lisp source much easier than most languages.

> 2. While compiling a lisp program, you can execute special-purpose code that
> makes semantic transformations of the lisp executable. When you see a given
> input program, you can generate whatever code you want as a function of it.
> This is called "macros" in lisp, but it is nothing like textual macros;
> really it's a program generating another program at the binary or byte-code
> level.

Macros are text engines.  Basically (someone correct me if I make a
technical error) a macro is a program (or function) which is called
right before compile time.  Macros look like regular Lisp functions
(they appear at the front of a list) but are run as the source is
processed.  C has a preprocessor which does much the same thing, but
Lisp has 1) a very regular syntax, 2) reasonably good built-in
functionality for handling Lisp code (which looks like nested lists),
and 3) the ability to be extended by user functions and macros
available at compile time.
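
A tiny example (a sketch; I hope I have the details right):

(defmacro unless* (test &body body)
  ;; Runs as the source is processed: it receives the source forms
  ;; as data and returns the code that actually gets compiled.
  `(if ,test
       nil
       (progn ,@body)))

;; (unless* (> x 3) (print "small"))
;; expands to (IF (> X 3) NIL (PROGN (PRINT "small")))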

> 3. A running lisp program can read (or write), compile, and execute any
> program, in the current program's execution context. I.e., if the newly
> loaded program makes references to variables and functions that are only
> defined in the original program, they are dynamically linked and accessible
> as the program is compiled.

There isn't really any "linking" per se.  That's a static-language
concept.  What Lisp will let you do is write a function and compile it
right there.  From what I've encountered in my limited experience, there
are two kinds of this:

1) closure

In this style you have most of the structure of the function but at
compile time do not know some variables.  Later on, you can fill in
these parameters and compile a function with these hard-coded in.  You
could have many functions with various parameter settings based off of
one general function form.

(defun make-adder (s)
  (compile nil (lambda (x) (+ x s))))

calling make-adder will return a function which adds s.

this function creates and compiles another function at _run_ time
[4]> (defun make-adder (s)
  (compile nil (lambda (x) (+ x s))))
MAKE-ADDER

now bind the resulting functions to variables
[5]> (defvar foo (make-adder 3))
FOO
[6]> (defvar bar (make-adder 14))
BAR

call the functions we just created in steps 5 and 6
[7]> (funcall foo 39)
42
[8]> (funcall bar 12)
26

see what foo and bar are: they are compiled functions
[9]> foo
#<COMPILED-CLOSURE NIL>
[10]> bar
#<COMPILED-CLOSURE NIL>


2) eval

In C and other languages it is easy to read, e.g., a number.  C has
trouble with strings because it has a hard time with dynamically sized
objects, but these are still very doable.  Reading a _function_ at
run-time is hard.  How do you supply
   double func(double x) { return x * x; }
at _run-time_ in C or C++?  You don't.

Lisp can read, evaluate and compile
   (lambda (x) (* x x))
as easily as it can do 3.14.
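
For example (a quick sketch at the listener):

(funcall (eval (read-from-string "(lambda (x) (* x x))")) 7)
=> 49

and you can hand the same lambda expression to COMPILE instead of EVAL
if you want native code.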


> A few questions:
>  - Have I got it right?
>  - What are the limitations?
>  - What other languages have at least one of the above?
>  - Is it really true that no other language has all of the above?

I believe other languages have all the above.  They are just not
called C or C++.  Hope this helps.

-- 
Johan KULLSTAM <··········@attbi.com> sysengr
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7celdvgvhb.fsf@sindri.juniper.net>
>> 1. While parsing a lisp program, you can compile and invoke some of the code
>> you already parsed to help parse the subsequent code, allowing you to modify
>> the syntax on the fly. If this is correct, what are the limitations? Can I
>> write a lisp program that starts out as lisp and starts looking like
>> C++?
> I am not sure that you could easily warp Lisp's reader into a C++
> parser.  Lisp has a very simple and regular syntax which makes reading
> and parsing the Lisp source much easier than most languages.

Yes, but the reader is completely modifiable in any arbitrary manner.
Some changes are easier than others, but there have been a number of
posts to comp.lang.lisp creating C-like syntaxes.  C++ is not going to
be a trivial change, since it has a complicated syntax, but it's still
possible.

>> 2. While compiling a lisp program, you can execute special-purpose code that
>> makes semantic transformations of the lisp executable. When you see a given
>> input program, you can generate whatever code you want as a function of it.
>> This is called "macros" in lisp, but it is nothing like textual macros;
>> really it's a program generating another program at the binary or byte-code
>> level.
> Macros are text engines.

This is true in most languages, but not in Lisp.

Lisp macros operate on the structure, not on the text.  The reader
will read the program fragment, then pass the already-parsed structure
into the macro handling function.

This has a few significant benefits.  First, macros are much easier to
write, because you don't need to do any parsing at all: everything's
been done by the reader.  You can make powerful macros with just a few
lines of code.

Second, if the reader is extended, macros still work.  Since they are
called after the reader is finished, then syntactic extensions have
already been applied before the macro is called.  The macro can be
blissfully unaware that it's operating on something that was created
with a syntactic extension.
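
A tiny illustration (just a sketch):

    (defmacro swap (place1 place2)
      ;; PLACE1 and PLACE2 arrive here as already-read Lisp objects
      ;; (symbols, lists), not as text, so no parsing is needed.
      (let ((tmp (gensym)))
        `(let ((,tmp ,place1))
           (setf ,place1 ,place2)
           (setf ,place2 ,tmp))))

    ;; (swap a b) expands to something like
    ;; (LET ((#:G42 A)) (SETF A B) (SETF B #:G42))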

>> 3. A running lisp program can read (or write), compile, and execute any
>> program, in the current program's execution context. I.e., if the newly
>> loaded program makes references to variables and functions that are only
>> defined in the original program, they are dynamically linked and accessible
>> as the program is compiled.
> There isn't really any "linking" per se.  That's a static language
> concept.  What Lisp will let you do is write a function and compile it
> right there.  From what I've encountered in a limited experience, two
> kinds of this
> 1) closure

Closures are possible in statically linked languages, too, but aren't
often used.  It's just like an indirected function call.  Admittedly,
that's not static linking, but it's something that most statically
linked languages like C already have.  You can actually find a paper
on how to add closures to C++ at
http://master.debian.org/~karlheg/Usenix88-lexic.pdf

(BTW: Your example called COMPILE on the closure.  That's not normally
necessary: the compiler will create the compiled code when you compile
the outer function, and the runtime code binds the variables, so you
get a compiled closure.)

> 2) eval

This is a powerful feature of Lisp, but is only (properly) used in a
few circumstances.

Neither one of these is necessarily an enemy of static linking.  The
basic complication is that functions can be redefined at runtime.

Here's a trivial example.  Consider:

    (export add-bias set-bias)
    (defvar *bias* 0)
    (defun set-bias (new-bias)
      (setq *bias* new-bias))
    (defun add-bias (sig)
      (+ *bias* sig))

Here's an example of it in use:

    * (add-bias 5)
    5
    * (set-bias 3)
    3
    * (add-bias 5)
    8

Now, for rhetorical purposes, I'm going to rewrite this, so that it
does the same thing, but using a different technique:

    (export '(add-bias set-bias))
    (defun set-bias (new-bias)
      (defun add-bias (sig)
        (+ new-bias sig)))
    (set-bias 0)

    * (add-bias 5)
    5
    * (set-bias 3)
    ADD-BIAS
    * (add-bias 5)
    8

Now, the point here isn't that the code created a closure, but that it
redefines the ADD-BIAS function.

So, why can ADD-BIAS not be statically linked?  Because it can be
redefined at runtime.

In Lisp, functions are generally attached to symbols.  The function
cell of a symbol can be changed at any time.  (That's what DEFUN
does.)  That's why static linking is not trivially feasible.
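
A sketch of what that amounts to:

    ;; Roughly what DEFUN boils down to: stuff a function into the
    ;; symbol's function cell, at run time if you like.
    (setf (symbol-function 'add-bias)
          (lambda (sig) (+ 42 sig)))

Every existing caller of ADD-BIAS now sees the new definition, which is
exactly why it can't simply be linked away at build time.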

Now, this is a trivial example, that I wrote for rhetorical purposes.
A much more complex example may be a program that figures out
improvements to its own functions as it runs.

Hope this helps,
joelh
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cy9c25wyi.fsf@octagon.mrl.nyu.edu>
Nitpicking....

Joel Ray Holveck <·····@juniper.net> writes:

> Here's a trivial example.  Consider:
> 
>     (export add-bias set-bias)

The above is incorrect.  `export' is a function (so its arguments are
evaluated), and it takes a list of symbols.

      (in-package "BIAS-PACKAGE")

      (export (list 'add-bias 'set-bias))

is more correct.

Of course you could have written

      (export* add-bias set-bias)

had you defined

        (defmacro export* (&rest symbols)
           (assert (every #'symbolp symbols))
           `(export ',symbols))



>     (defvar *bias* 0)
>     (defun set-bias (new-bias)
>       (setq *bias* new-bias))
>     (defun add-bias (sig)
>       (+ *bias* sig))
> 

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Hannah Schroeter
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahmd1h$8eg$1@c3po.schlund.de>
Hello!

Joel Ray Holveck  <·····@juniper.net> wrote:
>[...]

>Closures are possible in statically linked languages, too, but aren't
>often used.  It's just like an indirected function call.  Admittedly,
>that's not static linking, but it's something that most statically
>linked languages like C already have.  You can actually find a paper
>on how to add closures to C++ at
>http://master.debian.org/~karlheg/Usenix88-lexic.pdf

What about SML, OCaml, Haskell? All of them have and use closures quite
heavily, and you're able to compile and link them statically.

Or current Java's (anonymous) inner classes?

Not all of the static world is C or C++ :-)

>[...]

Kind regards,

Hannah.
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7cptxcg0na.fsf@sindri.juniper.net>
>> Closures are possible in statically linked languages, too, but aren't
>> often used.  It's just like an indirected function call.  Admittedly,
>> that's not static linking, but it's something that most statically
>> linked languages like C already have.  You can actually find a paper
>> on how to add closures to C++ at
>> http://master.debian.org/~karlheg/Usenix88-lexic.pdf
> What about SML, ocaml, Haskell? All have and use closures quite much
> and you're able to compile and link them statically.
> Or current Java's (anonymous) inner classes?
> Not all of the static world is C or C++ :-)

Indeed.  I just was using C as an example of a statically-linked
language.  I didn't mean to imply that it was the only one.  My point
was that closures are possible in statically-linked languages.

Thanks for the examples.

joelh
From: Bob Bane
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D404D01.6080105@removeme.gst.com>
Hannah Schroeter wrote:


> 
> Or current Java's (anonymous) inner classes?
> 

For me, Java's closure-like anonymous classes have been more of an 
annoyance than either no closures at all, or better closures like CL. 
The annoyance is that Java's inner classes will only close over local 
variables that are declared "final" - this is apparently to make life 
easier for the JVM implementors, who thus don't have to worry about 
cross-thread local variable access.

A typical worse-is-better decision, trading implementor effort for user 
effort.  Bleah.
From: Hannah Schroeter
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahq0er$jjc$1@c3po.schlund.de>
Hello!


I wrote:

>> Or current Java's (anonymous) inner classes?

Bob Bane  <····@removeme.gst.com> wrote:

>For me, Java's closure-like anonymous classes have been more of an 
>annoyance than either no closures at all, or better closures like CL. 
>The annoyance is that Java's inner classes will only close over local 
>variables that are declared "final" - this is apparently to make life 
>easier for the JVM implementors, who thus don't have to worry about 
>cross-thread local variable access.

That I didn't know. If you just want a normal function closure,
those inner classes are annoying anyway.

>A typical worse-is-better decision, trading implementor effort for user 
>effort.  Bleah.

Seems so.

Kind regards,

Hannah.
From: Software Scavenger
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <a6789134.0207222254.375e3662@posting.google.com>
Johan Kullstam <··········@attbi.com> wrote in message news:<··············@sysengr.res.ray.com>...

> Macros are text engines.  Basically (someone correct me if I make a

In CL, macros are higher level than ordinary "text engine" macros.  CL
code is represented as lists of lists, symbols, etc.  You basically
have complete freedom and power to quickly and easily change the
programming language any way you want to.  The same kinds of macros
you can write in CL easily are often very difficult or impossible in
C++.

Making good use of CL macros can make a big improvement in your
programming productivity, because you get so much bang for your buck,
so to speak.

Some people object to this on the grounds that changing the
programming language makes it harder for other programmers to
understand.  But they're way off base.  The hardest thing to
understand is the application.  The technical details of the
application, and how those details are represented in the program, are
a much bigger factor in being able to understand the program than the
exact syntax is.  By making good use of macros, you can make the
program much clearer, and make it a much closer match to the existing
technical jargon of the domain of the application, so you can
communicate fluently about the program and how it relates to the
application domain.
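
For instance (a sketch; the domain and the FETCH-QUOTE function are
made up):

(defmacro with-retries ((n) &body body)
  ;; "Try this up to N times before giving up" is the domain jargon;
  ;; the macro lets the code read the same way.
  (let ((done (gensym)))
    `(block ,done
       (loop repeat ,n
             do (ignore-errors (return-from ,done (progn ,@body))))
       (error "Giving up after ~D attempts." ,n))))

;; (with-retries (3) (fetch-quote "IBM"))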
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <DzC%8.118237$Wt3.100093@rwcrnsc53>
"Johan Kullstam" <··········@attbi.com> wrote in message
···················@sysengr.res.ray.com...

> > 1. While parsing a lisp program, you can compile and invoke some of the
> > code you already parsed to help parse the subsequent code, allowing you
> > to modify the syntax on the fly. If this is correct, what are the
> > limitations? Can I write a lisp program that starts out as lisp and
> > starts looking like C++?
>
> I am not sure that you could easily warp Lisp's reader into a C++
> parser.  Lisp has a very simple and regular syntax which makes reading
> and parsing the Lisp source much easier than most languages.

Suppose I'm writing a lisp program, and then hit a spot in the code where
I'm going to do a lot of arithmetic computations, and so I want to use the
standard mathematical syntax. In these expressions, I'm going to refer to
and define variables that are also defined/referred to in the lispy sections
of code.

Can I easily do that? What would the code look like?


Shayne Wissler
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahmu9f$8q4$1@rznews2.rrze.uni-erlangen.de>
Shayne Wissler wrote:

> 
> "Johan Kullstam" <··········@attbi.com> wrote in message
> ···················@sysengr.res.ray.com...
> 
>> > 1. While parsing a lisp program, you can compile and invoke some of the
>> > code you already parsed to help parse the subsequent code, allowing you
>> > to modify the syntax on the fly. If this is correct, what are the
>> > limitations? Can I write a lisp program that starts out as lisp and
>> > starts looking like C++?
>>
>> I am not sure that you could easily warp Lisp's reader into a C++
>> parser.  Lisp has a very simple and regular syntax which makes reading
>> and parsing the Lisp source much easier than most languages.
> 
> Suppose I'm writing a lisp program, and then hit a spot in the code where
> I'm going to do a lot of arithmetic computations, and so I want to use the
> standard mathematical syntax. In these expressions, I'm going to refer to
> and define variables that are also defined/referred to in the lispy
> sections of code.
> 
> Can I easily do that? What would the code look like?

I never needed it, but you can use the free "infix.cl" to do that.

(defun some-computation (a b)
 (let ((c 2))
   #I(
         c += a[0,1]^^b,
         a = floor(3.14),
         c = c / a
      )
   c))

Where the part in the #I(...) gets translated to:

(PROGN 
  (INCF C (EXPT (AREF A 0 1) B))
  (SETQ A (FLOOR 3.14))
  (SETQ C (/ C A)))

ciao,
Jochen

--
http://www.dataheaven.de
From: Rainer Joswig
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <joswig-F868AD.11354928072002@news.fu-berlin.de>
In article <·······················@rwcrnsc53>,
 "Shayne Wissler" <·········@yahoo.com> wrote:

> "Johan Kullstam" <··········@attbi.com> wrote in message
> ···················@sysengr.res.ray.com...
> 
> > > 1. While parsing a lisp program, you can compile and invoke some of the
> > > code you already parsed to help parse the subsequent code, allowing you
> > > to modify the syntax on the fly. If this is correct, what are the
> > > limitations? Can I write a lisp program that starts out as lisp and
> > > starts looking like C++?
> >
> > I am not sure that you could easily warp Lisp's reader into a C++
> > parser.  Lisp has a very simple and regular syntax which makes reading
> > and parsing the Lisp source much easier than most languages.
> 
> Suppose I'm writing a lisp program, and then hit a spot in the code where
> I'm going to do a lot of arithmetic computations, and so I want to use the
> standard mathematical syntax. In these expressions, I'm going to refer to
> and define variables that are also defined/referred to in the lispy sections
> of code.
> 
> Can I easily do that? What would the code look like?
> 
> 
> Shayne Wissler
> 
> 
> 

Using some mixfix reader code:

(defun foo (array x y)
  (+ (* 3 x)
     #I(if x > y
        then  3* x^^2 + y^^2 + array[x]
        else x * y)))
From: Seth Gordon
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D4029C8.47A561EA@genome.wi.mit.edu>
Johan Kullstam wrote:
> 
> "Shayne Wissler" <·········@yahoo.com> writes:
> 
> > 1. While parsing a lisp program, you can compile and invoke some of the code
> > you already parsed to help parse the subsequent code, allowing you to modify
> > the syntax on the fly. If this is correct, what are the limitations? Can I
> > write a lisp program that starts out as lisp and starts looking like
> > C++?
> 
> I am not sure that you could easily warp Lisp's reader into a C++
> parser.  Lisp has a very simple and regular syntax which makes reading
> and parsing the Lisp source much easier than most languages.

See Henry Baker's "Pragmatic Parsing in Common Lisp":
http://home.pipeline.com/~hbaker1/Prag-Parse.html

-- 
"I haven't convinced myself that it is in the best
 interest of our shareholders."  --Scott McNealy,
 CEO of Sun, on a requirement that CEOs attest to
 the accuracy of their company's financial statements
// seth gordon // wi/mit ctr for genome research //
// ····@genome.wi.mit.edu // standard disclaimer //
From: Knut Arild Erstad
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <slrnak00ai.h7f.knute+news@apal.ii.uib.no>
[Johan Kullstam]
: 
: (defun make-adder (s)
:   (compile nil (lambda (x) (+ x s))))
: 
: calling make-adder will return a function which adds s.

Or you can simply do

(defun make-adder (s)
  (lambda (x) (+ x s)))

and if make-adder is compiled, the returned function will also be 
compiled.  In fact, there is hardly ever any reason to call 'compile' 
explicitly except during development (and with a proper IDE, not even 
then).

Actually, I have used 'compile' in my code, *once*.  It involved
generating optimized code for spline functions and compiling them 
on-the-fly.  It was kind of neat. :)
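
The general trick looks something like this (a simplified sketch with
polynomials instead of splines, not the actual code):

(defun make-poly-evaluator (coeffs)
  ;; Build the source for a specialized evaluator and compile it on the
  ;; fly; the coefficients end up as literal constants in the code.
  (compile nil
           `(lambda (x)
              (+ ,@(loop for c in coeffs
                         for i from 0
                         collect `(* ,c (expt x ,i)))))))

;; (funcall (make-poly-evaluator '(1.0 -2.0 0.5)) 3.0) => -0.5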

-- 
Knut Arild Erstad

But if less is more, then just think how much more more will be.
    -- from "Frasier"
From: Johan Kullstam
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m31y9rn9cy.fsf@sysengr.res.ray.com>
··········@ii.uib.no (Knut Arild Erstad) writes:

> [Johan Kullstam]
> : 
> : (defun make-adder (s)
> :   (compile nil (lambda (x) (+ x s))))
> : 
> : calling make-adder will return a function which adds s.
> 
> Or you can simply do
> 
> (defun make-adder (s)
>   (lambda (x) (+ x s)))
> 
> and if make-adder is compiled, the returned function will also be 
> compiled.  In fact, there is hardly ever any reason to call 'compile' 
> explicitly except during development (and with a proper IDE, not even 
> then).

Thanks.  I was unaware of this.

> Actually, I have used 'compile' in my code, *once*.  It involved
> generating optimized code for spline functions and compiling them 
> on-the-fly.  It was kind of neat. :)

Nod.  I am currently working with arbitrary trellis code.  I want to
generate a function (at run time) from a trellis code table.  Then,
rather than using a bunch of loops, I will unroll them.  I guess this
is a kind of run-time rather than compile-time macro.  The result, I
want to eventually compile for speed.

Closing over

(let ((s 4))
   (lambda (x) (* x s)))

is different from closing over

   (lambda (x) (.....))

where I fill ..... with + and 4 x-es.

   (lambda (x) (+ x x x x))

The 4 will be constant for a particular code, but unknown in the
general case.  I am still early in this effort, but I suspect I will
learn something from it.  What that something is, I do not yet know. ;-)
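
Boiled down to this toy case, I mean something like (a sketch):

(defun make-unrolled-adder (n)
  ;; Build (lambda (x) (+ x x ... x)) with N copies of X, then compile it.
  (compile nil `(lambda (x) (+ ,@(make-list n :initial-element 'x)))))

;; (funcall (make-unrolled-adder 4) 10) => 40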

-- 
Johan KULLSTAM <··········@attbi.com> sysengr
From: Christopher Barber
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <psoznwjqnjf.fsf@boris.curl.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> At http://www.paulgraham.com/diff.html, Paul Graham says that the feature of
> having the "whole language always available" is unique to lisp.
> 
> With lisp, he iterates that you can:
>  1) Compile and run code while reading
>  2) Read and run code while compiling
>  3) Read and compile code at run-time
> 
> ...
>
>  - Have I got it right?
>  - What are the limitations?
>  - What other languages have at least one of the above?

Curl is one.  Since Curl is a client-side language, there are some security
restrictions on what you can do while expanding a macro in Curl.

>  - Is it really true that no other language has all of the above?

No, it's just hyperbole.

- Christopher
From: David Golden
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Xj%_8.1804$lC5.13791@news.iol.ie>
Shayne Wissler wrote:

> At http://www.paulgraham.com/diff.html, Paul Graham says that the feature
> of having the "whole language always available" is unique to lisp.
> 

Hyperbole at best.   Forth, for example, also has the whole language
available, including "parser", such as it is.  Through CREATE DOES>
one can do a whole lot of funny business reminiscent of lisp, only
it will happily run on an 8-bit embedded microcontroller :-)

That said, I'd rather use Lisp for most things, but Forth is something
that programmers should be aware of, if only for the different 
perspective...

See http://www.forth.org/


-- 
Don't eat yellow snow.
From: Thaddeus L Olczyk
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m01pjusnss9p9mc4fq50vjjumamc615eep@4ax.com>
On Mon, 22 Jul 2002 22:04:39 GMT, David Golden
<············@oceanfree.net> wrote:

>Shayne Wissler wrote:
>
>> At http://www.paulgraham.com/diff.html, Paul Graham says that the feature
>> of having the "whole language always available" is unique to lisp.
>> 
>
>Hyperbole at best.   Forth, for example, also has the whole language
>available, including "parser", such as it is.  Through CREATE DOES>
>one can do a whole lot of funny business reminiscent of lisp, only
>it will happily run on an 8-bit embedded microcontroller :-)
>
>That said, I'd rather use Lisp for most things, but Forth is something
>that programmers should be aware of, if only for the different 
>perspective...
>
>See http://www.forth.org/
And to learn to think sideways.
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3c8f1c.28907536@news.verizon.net>
On Mon, 22 Jul 2002 18:03:09 GMT, "Shayne Wissler"
<·········@yahoo.com> wrote:
>At http://www.paulgraham.com/diff.html, Paul Graham says that the feature of
>having the "whole language always available" is unique to lisp.
>
>With lisp, he iterates that you can:
> 1) Compile and run code while reading
> 2) Read and run code while compiling
> 3) Read and compile code at run-time
>
>Although I'm clear on what is technologically possible with respect to
>having a compiler available at run-time, I'm not precisely clear on what
>Paul means in the above. I *think* he means:
>
>1. While parsing a lisp program, you can compile and invoke some of the code
>you already parsed to help parse the subsequent code, allowing you to modify
>the syntax on the fly. If this is correct, what are the limitations? Can I
>write a lisp program that starts out as lisp and starts looking like C++?

No, it's not meant to modify the language on the fly.

As others have said, you just build up source text in a variable, then
Eval() the variable.

>2. While compiling a lisp program, you can execute special-purpose code that
>makes semantic transformations of the lisp executable. When you see a given
>input program, you can generate whatever code you want as a function of it.
>This is called "macros" in lisp, but it is nothing like textual macros;
>really it's a program generating another program at the binary or byte-code
>level.

Lisp has macros, too, but that's another subject.

>3. A running lisp program can read (or write), compile, and execute any
>program, in the current program's execution context. I.e., if the newly
>loaded program makes references to variables and functions that are only
>defined in the original program, they are dynamically linked and accessible
>as the program is compiled.

Some Lisps had that kind of dynamic scoping, others like Common Lisp
did not.

>A few questions:
> - Have I got it right?

Not especially.

> - What are the limitations?

As in any system, it's cheaper to compile once and run many times.  In
the early 1960s it seemed like it might be useful to have
self-modifying code.  However, this, IMHO, was an immature opinion
based on an insufficient understanding of how the von Neumann computer
architecture works best, with separate code and data.  It's hard for
us twenty-first century types to get the feel of what it was like to
do computers in the early days, not that we've come all that far,
maybe, but over the decades we've learned a few things!

> - What other languages have at least one of the above?

GWBasic could dynamically load source into a running program.  Heck,
most scripting languages can write text to a disk file then invoke the
file, and it's not all that different.

> - Is it really true that no other language has all of the above?

No.  Even most SQL implementations let you build and Eval() text at
runtime.  The only thing Lisp has that other languages lack is a high
proportion of parentheses.

Joshua Stern
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cd6tfz4v1.fsf@octagon.mrl.nyu.edu>
················@gte.net (JRStern) writes:

> On Mon, 22 Jul 2002 18:03:09 GMT, "Shayne Wissler"

        ...

> >1. While parsing a lisp program, you can compile and invoke some of the code
> >you already parsed to help parse the subsequent code, allowing you to modify
> >the syntax on the fly. If this is correct, what are the limitations? Can I
> >write a lisp program that starts out as lisp and starts looking like C++?
> 
> No, it's not meant to modify the language on the fly.

Actually, you can do that.  You *can* modify the language on
the fly.

> As others have said, you just build up source text in a variable, then
> Eval() the variable.

Yes. But the source text is totally unstructured (i.e. it is a string)
and not something that you can manipulate easily and directly.
Moreover, the equivalent of the Compile() functionality in (Common)
Lisp gets you native code.

        ...

> Lisp has macros, too, but that's another subject.

Yes and no.  (Common) Lisp macros are more flexible and useful than
the corresponding constructs in other languages.

> >3. A running lisp program can read (or write), compile, and execute any
> >program, in the current program's execution context. I.e., if the newly
> >loaded program makes references to variables and functions that are only
> >defined in the original program, they are dynamically linked and accessible
> >as the program is compiled.
> 
> Some Lisps had that kind of dynamic scoping, others like Common Lisp
> did not.

Common Lisp offers you both.  You can mark variables to be looked up
in the dynamic environment via the `special' declaration.

> 
> > - What are the limitations?
> 
> As in any system, it's cheaper to compile once and run many times. 

Let's define "run many times".  Suppose I have a "small language
definition" that allows me to run a "function" over the elements of some
large data structure (say a graph).  Suppose that such functions can
be built up in a web form.

Now, in a "regular" language, you have to (1) parse the string that
makes up the "function", (2) embed it in the "driver" (which may be
large) as text (the "driver" is text as well), (3) compile the
resulting text as a file, (4) run the newly compiled program.

In Common Lisp, you have your driver compiled and available all the
time. You just have to (1) compile the function after minimal
manipulation and (2) run the driver, which calls the compiled function
automagically.

Since the graph is large, in this case, you win by saving compilation
time.
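
In code, the CL side is roughly this (a sketch; WALK-GRAPH stands for
the precompiled driver, and FUNCTION-TEXT is whatever came in from the
web form):

        (defun run-user-function (function-text graph)
          ;; FUNCTION-TEXT is e.g. "(lambda (node) (> (weight node) 3))".
          ;; Compile it once, then let the already-compiled driver call
          ;; it on every node of the (large) graph.
          (let ((fn (compile nil (read-from-string function-text))))
            (walk-graph graph fn)))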

> > - What other languages have at least one of the above?
> 
> GWBasic could dynamically load source into a running program.  Heck,
> most scripting languages can write text to a disk file then invoke the
> file, and it's not all that different.

I think the above example shows how different it is.  Moreover, you
lose the compiler.

> > - Is it really true that no other language has all of the above?
> 
> No.  Even most SQL implementations let you build and Eval() text at
> runtime.  The only thing Lisp has that other languages lack is a high
> proportion of parenthesis.

Plus the compiler, multiple method dispatch, plus read-write
equivalence, plus Greenspun's Tenth Rule. :)

Cheers


-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <6w0%8.130690$uw.73405@rwcrnsc51.ops.asp.att.net>
JRStern wrote:

>> - What other languages have at least one of the above?
> 
> GWBasic could dynamically load source into a running program.  Heck,
> most scripting languages can write text to a disk file then invoke the
> file, and it's not all that different.

Hold on--the requirement is that the generated program can access the 
variables and functions of the generating program. Otherwise C would count 
as having this feature because you can generate C source, compile it, and 
execute it from a C program.


Shayne Wissler
From: Greg Menke
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3sn2bwc6w.fsf@europa.pienet>
> >> - What other languages have at least one of the above?
> > 
> > GWBasic could dynamically load source into a running program.  Heck,
> > most scripting languages can write text to a disk file then invoke the
> > file, and it's not all that different.
> 

Atari Basic did it a bit better: as I recall, you could print new lines
of Basic to the screen, then tweak the screen device driver to cause
the Basic listener to read and accept them.  The result was that you
could have your program rewrite parts of itself as desired without
terminating the running program; IIRC, all that was needed was a STOP
and then a CONT after the new code to get back in.  Debugging it was a
different story.  Naturally all the syntax errors occurred while the
tweaked driver was doing its thing, rendering the console unusable and
more or less requiring a reboot.

Greg Menke
From: Topmind
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <MPG.17a644eb1ec8eb0598a6ce@news.earthlink.net>
> JRStern wrote:
> 
> >> - What other languages have at least one of the above?
> > 
> > GWBasic could dynamically load source into a running program.  Heck,
> > most scripting languages can write text to a disk file then invoke the
> > file, and it's not all that different.
> 
> Hold on--the requirement is that the generated program can access the 
> variables and functions of the generating program. Otherwise C would count 
> as having this feature because you can generate C source, compile it, and 
> execute it from a C program.

I know a few scripting languages that can execute code in
context.

if foo
   include("myLines.txt")
end if

// Or:

myCode = "delete(scope='everything');listThem()...."
if foo
   execute(myCode)
end if

As if the included code was already in there.
It can reference any vars in that existing
scope.

(I won't vouch for the speed, however. It is not
an issue most of the time.)

> 
> 
> Shayne Wissler
> 
> 

-T-
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cptxe5ut3.fsf@octagon.mrl.nyu.edu>
·······@technologist.com (Topmind) writes:

> > JRStern wrote:
> > 
> > >> - What other languages have at least one of the above?
> > > 
> > > GWBasic could dynamically load source into a running program.  Heck,
> > > most scripting languages can write text to a disk file then invoke the
> > > file, and it's not all that different.
> > 
> > Hold on--the requirement is that the generated program can access the 
> > variables and functions of the generating program. Otherwise C would count 
> > as having this feature because you can generate C source, compile it, and 
> > execute it from a C program.
> 
> I know a few scripting languages that can execute code in
> context.
> 
> if foo
>    include("myLines.txt")
> end if

The above is equivalent in CL to

        (when foo (load "mySexprs.txt"))

However, it is not equivalent to CL's ability to extend itself.

> 
> // Or:
> 
> myCode = "delete(scope='everything');listThem()...."
> if foo
>    execute(myCode)
> end if

The above is equivalent to 

        (let ((my-code '(delete-file "/usr/bin/vi")))
           (eval my-code))

So the "scripting language" code above code is an application of
Greenspun's Tenth. (I am a Lisp zealot! What editor do you think I
use?!?)

Of course you can do things in CL in an easier way than you can in the
scripting language du jour.

        (let ((my-code '(delete-file "/usr/bin/vi")))
           (if (string= (second my-code) "/usr/bin/vi")
               (error "Come on! Be nice! You are trying to delete ~S."
                      (second my-code))
               (eval my-code)))

It is the `(second my-code)' form that gives away the power of CL.
You do not handle strings.  You handle CL data here (i.e. `cons'
cells: programs are data).  Hence this kind of "extension programming"
does not mix text and regular code in CL, whereas you must do exactly
that in any other environment (except Prolog and Smalltalk, AFAIK).
(And the above gets compiled down to machine language.)

The above would be equivalent in your SLDJ (scripting language du jour)
to

        myCode = AST.function_call("delete_file",
                                    [AST.string("/usr/bin/vi")]);
        myStringCode = myCode.operator()
                       + "("
                       + myCode.arguments().first()
                       + ")"
        if myCode.arguments().first().equal("/usr/bin/vi")
           error("Are you mad? Why would you delete "
                 + myCode.arguments().first()
                 + "? Are you a Lisper?");
        else
           execute(myStringCode);
        endif

of course I am assuming that you have an AST package available (as you
do in some of the SLDJs out there).

.... get the message? :)

> As if the included code was already in there.
> It can reference any vars in that existing
> scope.

The notion of `scope' you seem to refer to here seems to reflect some
of the bizarre (thankfully semi-fixed now) scoping rules of Python.
I am not sure what you mean here.

> (I won't vouch for the speed, however. It is not
> an issue most of the time.)

Exactly. So why program in C/C++ when you can program in CL and get much
more speed out of your program than in SLDJ? :)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Nicolas Neuss
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87ptxd4jhn.fsf@ortler.iwr.uni-heidelberg.de>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> It is the `(second my-code)' form that gives away the power of CL.
> You do not handle strings.  You handle CL data here (i.e. `cons'
> cells: programs are data).  Hence this kind of "extension programming"
> does not mix text and regular code in CL. Instead, you must do that in
> any other environment (except Prolog and Smalltalk AFAIK).  (And the
> above gets compiled down to machine language).

I know that Prolog is as flexible or even more flexible than Lisp in
extending its Syntax.  But could you point me to some link where this
is described for Smalltalk?

Thanks, Nicolas,
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cptxd9o89.fsf@octagon.mrl.nyu.edu>
Nicolas Neuss <·············@iwr.uni-heidelberg.de> writes:

> Marco Antoniotti <·······@cs.nyu.edu> writes:
> 
> > It is the `(second my-code)' form that gives away the power of CL.
> > You do not handle strings.  You handle CL data here (i.e. `cons'
> > cells: programs are data).  Hence this kind of "extension programming"
> > does not mix text and regular code in CL. Instead, you must do that in
> > any other environment (except Prolog and Smalltalk AFAIK).  (And the
> > above gets compiled down to machine language).
> 
> I know that Prolog is as flexible or even more flexible than Lisp in
> extending its Syntax.

The Prolog 'reader' essentially has provisions to extend an infix
reader with operators.  I.e. it gives you hooks to manipulate an
expression grammar.  The CL reader coupled with the macro system gives
you more flexibility at a lower level but does not give you "high
level" control in the form of a "grammar parser" (please extrapolate
as necessary from my description).
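
For instance, at the character level you can do things like this (a
throwaway example of mine, nothing standard beyond the reader protocol
itself):

        (defun read-bracket-list (stream char)
          (declare (ignore char))
          ;; Read forms up to the matching #\] and turn them into a LIST call.
          (cons 'list (read-delimited-list #\] stream t)))

        ;; Install #\[ as a macro character and make #\] behave like #\).
        (set-macro-character #\[ #'read-bracket-list)
        (set-macro-character #\] (get-macro-character #\)))

        ;; Now  [1 2 (+ 1 2)]  reads as  (LIST 1 2 (+ 1 2))  => (1 2 3)

This is character-level hooking, not a grammar description in the style
of Prolog operator declarations.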

> But could you point me to some link where this
> is described for Smalltalk?

I have no direct experience with this, but I have seen Smalltalk code
handling "blocks" of code in a rather straightforward way.  A search
from the Squeak pages will help.  All in all, this may be akin to
having an AST manipulation library, like the one (I hear) available in
Python.  (Of course, I bet several minds in the Python/Ruby/SLDJ camp
are busy re-implementing these features, thus reconfirming
Greenspun's :) ).

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3ce837.51718847@news.verizon.net>
On Mon, 22 Jul 2002 23:25:54 GMT, Shayne Wissler <·········@yahoo.com>
wrote:
>> GWBasic could dynamically load source into a running program.  Heck,
>> most scripting languages can write text to a disk file then invoke the
>> file, and it's not all that different.
>
>Hold on--the requirement is that the generated program can access the 
>variables and functions of the generating program. Otherwise C would count 
>as having this feature because you can generate C source, compile it, and 
>execute it from a C program.

Well, if that's *the* requirement, ...

... maybe C does qualify, if crudely, since you can always persist the
state, gen the new program, initiate it, propagate the state, yada
yada.  In SQL systems, most or all of the state resides in the
database, so generating and eval'ing SQL source on the fly is quite
useful.

Actually, I was confusing static scoping with the context for run-time
evaluation in my previous message.  I don't recall just what Common
Lisp allowed, but I'm pretty sure that Interlisp allowed all the
dynamism you wanted.

Anyhow, GWBasic still seems to qualify.

Is there a practical or a principled motivation for these questions?
Pretty much whatever you can do by generating code and dynamically
loading and evaluating it, can be done pretty nearly the same way by
interpreting within a fixed body of code.

J.
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Of6%8.82643$_51.76753@rwcrnsc52.ops.asp.att.net>
"JRStern" <················@gte.net> wrote in message
······················@news.verizon.net...
> On Mon, 22 Jul 2002 23:25:54 GMT, Shayne Wissler <·········@yahoo.com>
> wrote:
> >> GWBasic could dynamically load source into a running program.  Heck,
> >> most scripting languages can write text to a disk file then invoke the
> >> file, and it's not all that different.
> >
> >Hold on--the requirement is that the generated program can access the
> >variables and functions of the generating program. Otherwise C would
count
> >as having this feature because you can generate C source, compile it, and
> >execute it from a C program.
>
> Well, if that's *the* requirement, ...
>
> ... maybe C does qualify, if crudely, since you can always persist the
> state, gen the new program, initiate it, propagate the state, yada
> yada.

C doesn't qualify.

> Is there a practical or a principled motivation for these questions?

Yes.

> Pretty much whatever you can do by generating code and dynamically
> loading and evaluating it, can be done pretty nearly the same way by
> interpreting within a fixed body of code.

Sure, and anything you can do in Lisp you can do in C.


Shayne Wissler
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cu1mq5wu8.fsf@octagon.mrl.nyu.edu>
"Shayne Wissler" <·········@yahoo.com> writes:

> 
> Sure, and anything you can do in Lisp you can do in C.

Tape and read/write heads also work. :)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3d7109.3418265@news.verizon.net>
On Tue, 23 Jul 2002 05:58:06 GMT, "Shayne Wissler"
<·········@yahoo.com> wrote:
>> ... maybe C does qualify, if crudely, since you can always persist the
>> state, gen the new program, initiate it, propagate the state, yada
>> yada.
>
>C doesn't qualify.

Party-pooper.

>> Is there a practical or a principled motivation for these questions?
>
>Yes.

Then just for the record, what is it?

>> Pretty much whatever you can do by generating code and dynamically
>> loading and evaluating it, can be done pretty nearly the same way by
>> interpreting within a fixed body of code.
>
>Sure, and anything you can do in Lisp you can do in C.

Anything along the lines of these interpretations.  Anyway, it ain't
me telling anybody to use Lisp for anything.

J.
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <UQe%8.87767$_51.76688@rwcrnsc52.ops.asp.att.net>
"JRStern" <················@gte.net> wrote in message
·····················@news.verizon.net...

> >> Is there a practical or a principled motivation for these questions?
> >
> >Yes.
>
> Then just for the record, what is it?

I had come to the conclusion that the ideal language ought to be able to
seamlessly extend itself, dynamically. I read about Lisp, which sounded like
it had that feature, but it wasn't clear from my reading that it worked the
way I'd expect.


Shayne Wissler
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3decb1.35074344@news.verizon.net>
On Tue, 23 Jul 2002 15:43:48 GMT, "Shayne Wissler"
<·········@yahoo.com> wrote:
>I had come to the conclusion that the ideal language ought to be able to
>seamlessly extend itself, dynamically. I read about Lisp, which sounded like
>it had that feature, but it wasn't clear from my reading that it worked the
>way I'd expect.

Pretty much any language that supports functions or methods can be
looked at as extensible.  C++'s overloaded operators are very powerful
that way.  I keep wishing someone had a SQL that allowed overloaded
operators (does anybody?) for user-defined types, but I guess the
solution is to use Java for database procedures.

There was a book about a zillion years ago, "The Art of the Metaobject
Protocol", about using CLOS and the art and science of modifying its
basic syntax and semantics, whether on the fly or offline.  I reviewed
the book for SIGArt and took the same line then, that the authors
never quite explained why it was a good idea to do that -- not just
extend the language, but warp even the basic features.

So, I guess I agree with you, but it's unclear to me that the language
has to be able to extend itself on the fly.  More important to me is
the ability to change class definitions on the fly, which some
scripting languages now support.

Joshua Stern
From: Doug McNaught
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3lm81ucxt.fsf@abbadon.mcnaught.org>
················@gte.net (JRStern) writes:

> On Tue, 23 Jul 2002 15:43:48 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
> >I had come to the conclusion that the ideal language ought to be able to
> >seamlessly extend itself, dynamically. I read about Lisp, which sounded like
> >it had that feature, but it wasn't clear from my reading that it worked the
> >way I'd expect.
> 
> Pretty much any language that supports functions or methods can be
> looked at as extensible.  C++'s overloaded operators are very powerful
> that way.  I keep wishing someone had a SQL that allowed overloaded
> operators (does anybody?) for user-defined types, but I guess the
> solution is to use Java for database procedures.

PostgreSQL has this:

http://www3.us.postgresql.org/users-lounge/docs/7.2/postgres/sql-createoperator.html

-Doug
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahl02c$i4s$1@rznews2.rrze.uni-erlangen.de>
JRStern wrote:

> On Tue, 23 Jul 2002 15:43:48 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
>>I had come to the conclusion that the ideal language ought to be able to
>>seamlessly extend itself, dynamically. I read about Lisp, which sounded
>>like it had that feature, but it wasn't clear from my reading that it
>>worked the way I'd expect.
> 
> Pretty much any language that supports functions or methods can be
> looked at as extensible.  C++'s overloaded operators are very powerful
> that way.  I keep wishing someone had a SQL that allowed overloaded
> operators (does anybody?) for user-defined types, but I guess the
> solution is to use Java for database procedures.

Or to use Common Lisp. OnShore's "UncommonSQL" (named after Xanalys' 
"CommonSQL", from which it took many ideas) is a really nice database 
binding. It enables you, for example, to access tables as CLOS classes and to 
create tables out of CLOS classes. It uses the MOP to do that.

> There was a book about a zillion years ago, "The Art of the Metaobject
> Protocol", about using CLOS and the art and science of modifying its
> basic syntax and semantics, whether on the fly or offline.  I reviewed
> the book for SIGArt and took the same line then, that the authors
> never quite explained why it was a good idea to do that -- not just
> extend the language, but warp even the basic features.

It's not a zillion years ;-) - mine (fourth printing) is from 1995. The first 
edition was printed in 1991. 
AMOP actually describes how CLOS can be implemented on itself and, following 
from that, offers a protocol to extend/modify its behaviour. The above-
mentioned CommonSQL/UncommonSQL are based on that facility (the MOP). 

> So, I guess I agree with you, but it's unclear to me that the language
> has to be able to extend itself on the fly.  More important to me is
> the ability to change class definitions on the fly, which some
> scripting languages now support.

Saying "The language is able to extend itself" seems to be  just to short to 
show the whole consequences.  It is not so much the "on the fly" part that 
is so interesting. That would imply that one wants to extend the language 
while some readily deployed application is running which _may_ be useful 
too (Emacs, AutoCAD...) but is not the whole point. One important thing is 
that Lisp development happens while what later will be the application is 
already running (you incrementally add functionality to the running 
system). The nice thing is, regardless of what you want to change - syntax, 
code-generation, I/O, object-system... you can do it in Lisp itself having 
the whole language available all the time.

ciao,
Jochen

--
http://www.dataheaven.de
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3e0eb4.43781905@news.verizon.net>
On Wed, 24 Jul 2002 03:34:59 +0200, Jochen Schmidt <···@dataheaven.de>
wrote:
>> There was a book about a zillion years ago, "The Art of the Metaobject
>> Protocol", about using CLOS and the art and science of modifying its
>> basic syntax and semantics, whether on the fly or offline....

>It's not zillion years ;-) - mine (fourth print) is from 1995. The first 
>edition was printed 1991. 

I think I was running a 25mhz system with 1mb RAM back then.  That's
about the same technology as Fred Flintstone used.

When I graduated from college, there was no life on earth other than
blue-green algae, and 2 MIP mainframes from IBM with two megabytes of
2 microsecond ferrite memory.

J.
From: cr88192
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ujs9v0eppef482@corp.supernews.com>
JRStern wrote:

> On Wed, 24 Jul 2002 03:34:59 +0200, Jochen Schmidt <···@dataheaven.de>
> wrote:
>>> There was a book about a zillion years ago, "The Art of the Metaobject
>>> Protocol", about using CLOS and the art and science of modifying its
>>> basic syntax and semantics, whether on the fly or offline....
> 
>>It's not zillion years ;-) - mine (fourth print) is from 1995. The first
>>edition was printed 1991.
> 
> I think I was running a 25mhz system with 1mb RAM back then.  That's
> about the same technology as Fred Flintstone used.
> 
And what would that be? CPUs at that speed tended to have a bit more RAM, 
and systems with that amount of RAM tended to run at slower speeds...

> When I graduated from college, there was no life on earth other than
> blue-green algae, and 2 MIP mainframes from IBM with two megabytes of
> 2 microsecond ferrite memory.
> 
I get quite tired of that "oh, from back in dinosaur times" type crap... 
computers are better hardware-wise but software now is crap, and if the 
software wasn't crap then we could do a whole lot more with the hardware.

software in general (usefulness and raw power) has not improved much. is 
"java" much better than anything that came before?

I have not even been to college yet, so really I should have more grounds 
for this argument. however, I don't make this argument as I can appreciate 
older technologies.
pc's were coming on strong by the time I was born, I spent much of the 
first part of my life with "dinosaur" technologies, thus I can appreciate 
the work that has gone into the hardware.
however the software industry is stagnant and backwater, just riding the 
wave of hardware improvement...

I think it is just pitiful that hobbyists can nearly defeat the companies, 
and even more pitiful how the companies respond, and this is from me, a 
hobbyist.

so please, don't use those arguments...

-- 
<·······@hotmail.spam.com>
<http://bgb1.hypermart.net/>
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6clm819mwl.fsf@octagon.mrl.nyu.edu>
················@gte.net (JRStern) writes:

> On Tue, 23 Jul 2002 15:43:48 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
> >I had come to the conclusion that the ideal language ought to be able to
> >seamlessly extend itself, dynamically. I read about Lisp, which sounded like
> >it had that feature, but it wasn't clear from my reading that it worked the
> >way I'd expect.
> 
> Pretty much any language that supports functions or methods can be
> looked at as extensible.  C++'s overloaded operators are very powerful
> that way.  I keep wishing someone had a SQL that allowed overloaded
> operators (does anybody?) for user-defined types, but I guess the
> solution is to use Java for database procedures.

There are degrees of extensibility.  C++ operator overloading (sans
multiple dispatching, something you have in CLOS :) ) goes a long way
to allow you to "extend" the language.  But, given that a Tape
and Read/Write heads essentially allow for "extensibility" too, CL is
still unparalleled in its ability to "extend itself".

Suppose you want to open a file and make sure you close it when you
are done, while handling errors.

The C++/Java/SLDJ (Scripting Language Du Jour) idiom to do this is

        FILE f = NULL;
        try
          {
                f = open("f.dat", "w");
                print(f, "something");
          }
        finally
          {
                close(f);
          }

Apart from the fact that this sort of control structure was present in
Lisp and CL way before it appeared in C/C++ (hence Java and your SLDJ),
the CL equivalent is

        (let ((f nil))
          (unwind-protect
             (progn
                (setf f (open "f.dat" :direction :output))
                (print "something" f))
             (close f)))

Now, the interesting thing about CL is that you can write a macro that
wraps the above piece of code in a nice way.

        (defmacro with-file-open ((fvar file &key (direction :output))
                                  &body forms)
          `(let ((,fvar nil))
             (unwind-protect
                 (progn
                    (setf ,fvar (open ,file :direction ,direction))
                    ,@forms)
                 (close ,fvar))))

after you have defined your macro you can say

        (with-file-open (f "f.dat")
           (print "something" f))

There is no way to do that as easily in any other language. Especially
your SLDJ!  Certainly, you do not get it compiled down to native
machine code.

Essentially you have added a completely new construct to CL.  To do so
in Java you need to write a complete parser (I know! I did it).  Busy
minds are surely working right now to get this sort of thing into your
SLDJ (thus reinforcing Greenspun's Tenth).

BTW. `with-open-file' is a standard ANSI CL macro.

Moreover! Do you know that you can write a `try' macro in CL to make
C++/Java/SLDJ programmers happy?

Since you are so interested in data base programming, can you see the
implications for a `with-transaction' macro?
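
For instance, a rough sketch (BEGIN-TRANSACTION, COMMIT and ROLLBACK here
are made up, stand-ins for whatever your database layer provides, and so
are the INSERT-ROW/UPDATE-ROW calls below):

        (defmacro with-transaction ((db) &body forms)
          (let ((ok (gensym "OK")))
            `(let ((,ok nil))
               (begin-transaction ,db)
               (unwind-protect
                    (multiple-value-prog1
                        (progn ,@forms)
                      (setf ,ok t))
                 (if ,ok
                     (commit ,db)
                     (rollback ,db))))))

so that client code just writes

        (with-transaction (db)
          (insert-row db ...)
          (update-row db ...))

and never forgets a rollback.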

> There was a book about a zillion years ago, "The Art of the Metaobject
> Protocol", about using CLOS and the art and science of modifying its
> basic syntax and semantics, whether on the fly or offline.  I reviewed
> the book for SIGArt and took the same line then, that the authors
> never quite explained why it was a good idea to do that -- not just
> extend the language, but warp even the basic features.

You are entitled to this opinion.  Not that I agree with you.
Especially when you do not explain what "the basic features" are.

> So, I guess I agree with you, but it's unclear to me that the language
> has to be able to extend itself on the fly.  More important to me is
> the ability to change class definitions on the fly, which some
> scripting languages now support.

Great! Another application of Greenspun's Tenth Rule of
Programming applied to SLDJ!

`change-class' has been part of Common Lisp for ages (since the
inception of CLOS).  *And* you still get the calls to `change-class'
compiled to native machine language in Common Lisp.
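
For instance (plain ANSI CL, nothing from any particular library):

        (defclass prospect ()
          ((name :initarg :name :accessor name)))

        (defclass customer (prospect)
          ((orders :initform '() :accessor orders)))

        (defparameter *p* (make-instance 'prospect :name "ACME"))

        ;; Upgrade the live object in place: shared slots keep their values,
        ;; new slots get their initforms.
        (change-class *p* 'customer)

        (name *p*)    ; => "ACME"
        (orders *p*)  ; => NIL

and UPDATE-INSTANCE-FOR-DIFFERENT-CLASS is there if you need to customize
the transition.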

Moreover, can you explain to us how you did not appreciate "The Art of
the MOP" and at the same time like one of the operations that fully
require all its power to be understood correctly?

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Steven E. Harris
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <877kjlnn5s.fsf@harris.sdo.us.ray.com>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> Suppose you want to open a file and make sure you close it when you
> are done, while handling errors.
>
> The C++/Java/SLDJ (Scripting Language Du Jour) idiom to do this is
>
>         FILE f = NULL;
>         try
>           {
>                 f = open("f.dat", "w");
>                 print(f, "something");
>           }
>         finally
>           {
>                 close(f);
>           }

Please don't include C++ in your pejorative set. Its destructors take
care of your criticism:

std::ifstream f( "f.dat", ios_base::out );
f << "something";

The file is automatically closed in f's destructor, provided the file
had been opened successfully in the constructor. Omitted are the
checks whether the file was opened successfully and whether the write
succeeded. Failure in the former will of course manifest in the
latter.

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Steven E. Harris
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <873cu9nm1d.fsf@harris.sdo.us.ray.com>
Steven E. Harris <········@raytheon.com> writes:

> std::ifstream f( "f.dat", ios_base::out );
       ^

And I should read my own writing. Correction:

std::ofstream f( "f.dat" );

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cadoh9ihz.fsf@octagon.mrl.nyu.edu>
Steven E. Harris <········@raytheon.com> writes:

> Marco Antoniotti <·······@cs.nyu.edu> writes:
> 
> > Suppose you want to open a file and make sure you close it when you
> > are done, while handling errors.
> >
> > The C++/Java/SLDJ (Scripting Language Du Jour) idiom to do this is
> >
> >         FILE f = NULL;
> >         try
> >           {
> >                 f = open("f.dat", "w");
> >                 print(f, "something");
> >           }
> >         finally
> >           {
> >                 close(f);
> >           }
> 
> Please don't include C++ in your pejorative set. Its destructors take
> care of your criticism:
> 
> std::ifstream f( "f.dat", ios_base::out );
> f << "something";
> 
> The file is automatically closed in f's destructor, provided the file
> had been opened successfully in the constructor. Omitted are the
> checks whether the file was opened successfully and whether the write
> succeeded. Failure in the former will of course manifest in the
> latter.

You are right and I of course should have mentioned that.  However, my
point was about the "overall extensibility" of the language, not about
file open/close semantics and how it is implemented in any language.
The fact that the file open/close idiom with errors is such a useful
one to explain the effect of "extensibility" motivated my example.

To further comment on your note, I must point out that (as you imply)
since you do not check for errors the example is not completely
equivalent.

To be "equivalent" you have to do at least the following (of corse you
can do it differently)

     {  // Note the block starting here to ensure the call of std::~ofstream()
        std::ofstream f("f.dat", ios_base::out);
        
        if (f && f.good())
          {
             f << "something" << endl;
          }
        else
          {
             ; // Some error signaling condition I suppose.
          }
      }

which may be better than the `try ... catch ...' idiom, unless you
want to actually `throw' some exception in the `else' branch.

Nevertheless, you cannot pack all of this in C++ into a

        with_open_file (f, "f.dat", std::ofstream)
          {
             f << "somthing" << endl;
          }

unless you rewrite your C++ parser.

You can play tricks with CPP... (untested).

==============================================================================

#define with_open_file(_f_, _file_, _class_, _iosbase_flag_)                \
   {  /* Note the block starting here to ensure the destructor call */      \
        _class_ _f_(_file_, _iosbase_flag_);                                 \
                                                                             \
        if (_f_ && _f_.good())

#define fow                                                                  \
        else                                                                 \
          {                                                                  \
             ; /* Some error signaling condition I suppose. */               \
          }                                                                  \
    }

==============================================================================

and then say

        with_open_file(f, "f.dat", std::ofstream, ios_base::out)
             f << "something" << endl;
        fow;

but you do that with fewer guarantees of correctness (since you
essentially split up the block where the scope of '_f_' is defined)
and without the full power of the language (compiler included) at your
fingertips; and it is this power that allows you to insert syntactic
and semantic checks at "macro expansion" time.  Finally, the above
does not look and feel like C++.

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <YoA%8.100031$_51.86268@rwcrnsc52.ops.asp.att.net>
"Steven E. Harris" <········@raytheon.com> wrote in message
···················@harris.sdo.us.ray.com...

> Please don't include C++ in your pejorative set. Its destructors take
> care of your criticism:
>
> std::ifstream f( "f.dat", ios_base::out );
> f << "something";

I would say "touche", except for the fact that you have ignored the broader
point: with lisp, if one needs a new construct, one has the liberty to write
it.


Shayne Wissler
From: Steven E. Harris
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87u1mpm538.fsf@harris.sdo.us.ray.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> with lisp, if one needs a new construct, one has the liberty to
> write it.

Agreed. The closest we come in C++ is to emulate some "around"
behavior with paired ctor-dtor RAII types. That is, the simple
creation of an instance of a type with a nontrivial constructor and/or
destructor guarantees that some instructions will run.

-- 
Steven E. Harris        :: ········@raytheon.com
Raytheon                :: http://www.raytheon.com
From: Christopher Barber
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <psoy9c1c9pp.fsf@boris.curl.com>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> Now, the interesting thing about CL is that you can write a macro that
> wraps the above piece of code in a nice way.
> 
>         (defmacro with-file-open ((fvar file &key (direction :output))
>                                   &body forms)
>           `(let ((,fvar nil))
>              (unwind-protect
>                  (progn
>                     (setf ,fvar (open ,file :direction ,direction))
>                     ,@forms)
>                  (close ,fvar))))
> 
> after you have defined your macro you can say
> 
>         (with-file-open (f "f.dat")
>            (print "something" f))
> 
> There is no way to do that as easily in any other language. 

You can do this in any language with a sufficiently powerful macro
system, not just CL.

- Christopher
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3PB%8.117430$Wt3.99978@rwcrnsc53>
"Christopher Barber" <·······@curl.com> wrote in message
····················@boris.curl.com...

> > There is no way to do that as easily in any other language.
>
> You can do this this in any language with a sufficiently powerful macro
> system, not just CL.

What other languages have such a system (Curl, I presume?)?


Shayne Wissler
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3eec2b.11959336@news.verizon.net>
On Wed, 24 Jul 2002 17:51:59 GMT, "Shayne Wissler"
<·········@yahoo.com> wrote:
>> > There is no way to do that as easily in any other language.
>>
>> You can do this this in any language with a sufficiently powerful macro
>> system, not just CL.
>
>What other languages have such a system (Curl, I presume?)?

Who cares?

You can do it in a method/function/procedure, instead of a macro in
virtually any language, or wrap it in a class, which is probably
neater still.

J.
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6c8z41nfe8.fsf@octagon.mrl.nyu.edu>
················@gte.net (JRStern) writes:

> On Wed, 24 Jul 2002 17:51:59 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
> >> > There is no way to do that as easily in any other language.
> >>
> >> You can do this this in any language with a sufficiently powerful macro
> >> system, not just CL.
> >
> >What other languages have such a system (Curl, I presume?)?
> 
> Who cares?

Some of us do.

> 
> You can do it in a method/function/procedure, instead of a macro in
> virtually any language, or wrap it in a class, which is probably
> neater still.

Sure.  It even becomes neater when you have multiple dispatching in
your Object System.

Cheers


-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
715 Broadway 10th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <xaC%8.102153$_51.86478@rwcrnsc52.ops.asp.att.net>
"JRStern" <················@gte.net> wrote in message
······················@news.verizon.net...
> On Wed, 24 Jul 2002 17:51:59 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
> >> > There is no way to do that as easily in any other language.
> >>
> >> You can do this this in any language with a sufficiently powerful macro
> >> system, not just CL.
> >
> >What other languages have such a system (Curl, I presume?)?
>
> Who cares?
>
> You can do it in a method/function/procedure, instead of a macro in
> virtually any language, or wrap it in a class, which is probably
> neater still.

It takes some vision to see why one might want to tailor the semantics the
language designer happened to provide. Not much, but some.


Shayne Wissler
From: Tim Bradshaw
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ey3k7nk7p9e.fsf@cley.com>
* JXSternChangeX2R  wrote:

> You can do it in a method/function/procedure, instead of a macro in
> virtually any language, or wrap it in a class, which is probably
> neater still.

Yes, but can you write an iteration construct using this technique, or
at least one which is concise in use?  WITH-x macros are kind of a
classic case of something which, in most cases you *can* implement
with a CALL-WITH-x function and first-class functions (which kind of
gives the lie to the `virtually any language' claim), without
introducing too much extra syntax.  I don't think you could implement
LOOP that way.
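
To make the comparison concrete, here's a sketch (the names are made up)
of the CALL-WITH-x version and the thin macro on top of it:

        (defun call-with-open-log (file thunk)
          (let ((stream nil))
            (unwind-protect
                 (progn
                   (setf stream (open file :direction :output
                                           :if-exists :append
                                           :if-does-not-exist :create))
                   (funcall thunk stream))
              (when stream (close stream)))))

        (defmacro with-open-log ((var file) &body body)
          `(call-with-open-log ,file (lambda (,var) ,@body)))

        ;; (with-open-log (log "app.log") (print "hello" log))

The function does all the work; the macro only buys syntax.  An
iteration construct like LOOP is a different story.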

Note the `too much extra syntax': arguments from Turing equivalence
are not very interesting.

--tim
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3it343eu2.fsf@dino.dnsalias.com>
Tim Bradshaw <···@cley.com> writes:
> Yes, but can you write an iteration construct using this technique, or
> at least one which is concise in use?  WITH-x macros are kind of a
> classic case of something which, in most cases you *can* implement
> with a CALL-WITH-x function and first-class functions (which kind of
> gives the lie to the `virtually any language' claim), without
> introducing too much extra syntax.  I don't think you could implement
> LOOP that way.

I don't either.  However, whether that is good or bad depends on
whether you think LOOP, or something similar, is a good way to do
iteration or not.  Smalltalk uses blocks (closures) to do its
iteration, with messages like collect: and reject: doing some of the
things that LOOP does.  Consequently, the inability to write an
equivalent of the LOOP macro in Smalltalk might not be a particularly
convincing case for macros to a Smalltalker.
From: Hannah Schroeter
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahorgu$ana$1@c3po.schlund.de>
Hello!

Stephen J. Bevan <·······@dino.dnsalias.com> wrote:
>Tim Bradshaw <···@cley.com> writes:
>> Yes, but can you write an iteration construct using this technique, or
>> at least one which is concise in use?  WITH-x macros are kind of a
>> classic case of something which, in most cases you *can* implement
>> with a CALL-WITH-x function and first-class functions (which kind of
>> gives the lie to the `virtually any language' claim), without
>> introducing too much extra syntax.  I don't think you could implement
>> LOOP that way.

>I don't either.  However, whether that is good or bad depends on
>whether you think LOOP, or something similar, is a good way to do
>iteration or not.  Smalltalk uses blocks (closures) to do its
>iteration with messages like collect: and reject: to some of the
>things that LOOP does.  Consequently, the inability to write an
>equivalent of the LOOP macro in Smalltalk might not be particularly
>convincing case for macros to a Smalltalker.

Loop is one thing. And in Lisp there's map and reduce and remove-if
and so on, too, which roughly match Collection do: aBlock, etc.
in Smalltalk.
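
For instance, just with the standard sequence functions:

        (remove-if #'oddp '(1 2 3 4 5 6))        ; => (2 4 6)
        (reduce #'+ (mapcar #'1+ '(1 2 3)))      ; => 9
        (map 'vector #'char-upcase "lisp")       ; => #(#\L #\I #\S #\P)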

How about seamless integration of e.g. scanner or parser generators
into the programming language you are using?

I could in fact envision e.g. an LALR(1) generator, written as functions,
loaded in at compile time, interfaced in with a frontend macro,
and thus seamlessly integrated into the rest of CL compilation.
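
A stripped-down sketch of that shape (mine; the "generator" here is
deliberately trivial, just to show the plumbing, but a real LALR(1)
table builder would slot into its place the same way):

        ;; The generator is an ordinary function, made available at compile time.
        (eval-when (:compile-toplevel :load-toplevel :execute)
          (defun generate-dispatcher (rules)
            ;; RULES is a list of (token . handler-forms); build a CASE-based closure.
            `(lambda (token)
               (case token
                 ,@(loop for (tok . forms) in rules
                         collect `(,tok ,@forms))
                 (t (error "Unexpected token: ~S" token))))))

        ;; The frontend macro runs the generator at macro-expansion time and
        ;; splices its output into the program being compiled.
        (defmacro define-dispatcher (name &rest rules)
          `(setf (symbol-function ',name) ,(generate-dispatcher rules)))

        (define-dispatcher next-step
          (:number (format t "got a number~%"))
          (:ident  (format t "got an identifier~%")))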

Kind regards,

Hannah.
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m31y9ryj01.fsf@dino.dnsalias.com>
······@schlund.de (Hannah Schroeter) writes:
> How about seamless integration of e.g. scanner or parser generators
> into the used programming language?
> 
> I could in fact envision e.g. an LALR(1) generator, written as functions,
> loaded in at compile time, interfaced in with a frontend macro,
> and thus seamlessly integrated into the rest of CL compilation.

Agreed.  A (very common) alternative in macro-less languages is to
define the parser generator language and write a compiler for it
(either from scratch, using an existing parser generator, or building
it around some "generic" format such as XML).  Of course, this doesn't
have the seamless integration that macros can provide.

A less common alternative is not to use a parser generator at all and
to use parser combinators, which are just higher-order functions.  This
gives you all the seamless integration of the macro approach, though at
the cost of restricting yourself to top-down parsing.

So while both approaches solve many problems, neither quite gives you
what macros in Common Lisp can provide.  Therefore this shows that
macros are a more powerful, general solution.  Where it becomes
subjective is whether each person thinks that macros provide them with
something that is significantly better than they've already got.
The key words being "significantly better".  Like it or not, people
are going to disagree on that.
From: Nils Goesche
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <lk7kjjnb6m.fsf@pc022.bln.elmeg.de>
······@schlund.de (Hannah Schroeter) writes:

> I could in fact envision e.g. an LALR(1) generator, written as
> functions, loaded in at compile time, interfaced in with a frontend
> macro, and thus seamlessly integrated into the rest of CL
> compilation.

That's just what Lispworks' DEFPARSER macro does, BTW.

Regards,
-- 
Nils Goesche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID 0x42B32FC9
From: Tim Bradshaw
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ey3bs8uztrt.fsf@cley.com>
* Stephen J Bevan wrote:

> I don't either.  However, whether that is good or bad depends on
> whether you think LOOP, or something similar, is a good way to do
> iteration or not.  Smalltalk uses blocks (closures) to do its
> iteration with messages like collect: and reject: to some of the
> things that LOOP does.  Consequently, the inability to write an
> equivalent of the LOOP macro in Smalltalk might not be particularly
> convincing case for macros to a Smalltalker.

A stronger case, I think would be this:  Take a dialect of Lisp which
has *no* iteration constructs, but does have GO (and BLOCK,
RETURN-FROM, TAGBODY), and write some set of iteration constructs of
your choice.  That's essentially what CL does, of course...
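
For instance, a quick sketch of the exercise:

        ;; A WHILE loop built on BLOCK/TAGBODY/GO, with no other
        ;; iteration construct used.
        (defmacro while (test &body body)
          (let ((top (gensym "TOP")))
            `(block nil
               (tagbody
                  ,top
                  (unless ,test (return))
                  ,@body
                  (go ,top)))))

        ;; (let ((i 0)) (while (< i 3) (print i) (incf i)))  prints 0 1 2

DO, DOTIMES, DOLIST and LOOP are all, in the end, this trick with more
frills.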

--tim
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3r8ho683n.fsf@dino.dnsalias.com>
Tim Bradshaw <···@cley.com> writes:
> A stronger case, I think would be this:  Take a dialect of Lisp which
> has *no* iteration constructs, but does have GO (and BLOCK,
> RETURN-FROM, TAGBODY), and write some set of iteration constructs of
> your choice.  That's essentially what CL does, of course...

In what way is that a stronger case?  Smalltalk already builds up
methods like select: and reject: based on simpler iteration
constructs.  If you need something that doesn't fit into the existing
methods that exist, then you are free to define your own, just as you
are in Lisp if you find that LOOP is not satisfactory (is that
possible :-).  The critical difference being that no macros are used
in Smalltalk.
From: Tim Bradshaw
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ey3bs8sxjfg.fsf@cley.com>
* Stephen J Bevan wrote:

> In what way is that a stronger case?  Smalltalk already builds up
> methods like select: and reject: based on simpler iteration
> constructs.  If you need something that doesn't fit into the existing
> methods that exist, then you are free to define your own, just as you
> are in Lisp if you find that LOOP is not satisfactory (is that
> possible :-).  The critical difference being that no macros are used
> in Smalltalk.

I'd like to cheat at this point by not counting languages which
already have weird syntax.  SmallTalk and Forth-family languages can
do iteration constructs without macros, but they do it by making
everything look funny (and Lisp makes everything look a bit funny too,
of course, in order to support macros).

Really, Lisp macros are about syntactic transformation, and therefore
the ability to create a programming language to your own taste.  If
you don't think that's important, then you won't think Lisp macros are
important.

--tim
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3wurf4q5h.fsf@dino.dnsalias.com>
Tim Bradshaw <···@cley.com> writes:
> I'd like to cheat at this point by not counting languages which
> already have weird syntax.  SmallTalk and Forth-family languages can
> do iteration constructs without macros, but they do it by making
> everything look funny (and Lisp makes everything look a bit funny too,
> of course, in order to support macros).

In Forth you'd probably do it with macros (immediate words) but other
than that I take your point.

> Really, Lisp macros are about syntactic transformation, and therefore
> the ability to create a programming language to your own taste.  If
> you don't think that's important, then you won't think Lisp macros are
> important.

It is not that I don't think it is important.  It is rather that I
don't think the examples that were given in this thread did anything
that you couldn't do naturally without syntactic transformation in a
suitable language (e.g. Smalltalk).  I'm not claiming this is always
true, only trying to point out why someone might not think it is that
important if they don't see examples that they can't achieve some
other way (without having to invoke Turing completeness to do it).
From: Kent M Pitman
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <sfw4reosf6l.fsf@shell01.TheWorld.com>
[ replying to comp.lang.lisp only
  http://world.std.com/~pitman/pfaq/cross-posting.html ]

·······@dino.dnsalias.com (Stephen J. Bevan) writes:

> Tim Bradshaw <···@cley.com> writes:
> > Yes, but can you write an iteration construct using this technique, or
> > at least one which is concise in use?  WITH-x macros are kind of a
> > classic case of something which, in most cases you *can* implement
> > with a CALL-WITH-x function and first-class functions (which kind of
> > gives the lie to the `virtually any language' claim), without
> > introducing too much extra syntax.  I don't think you could implement
> > LOOP that way.
> 
> I don't either.  However, whether that is good or bad depends on
> whether you think LOOP, or something similar, is a good way to do
> iteration or not.  Smalltalk uses blocks (closures) to do its
> iteration with messages like collect: and reject: to some of the
> things that LOOP does.  Consequently, the inability to write an
> equivalent of the LOOP macro in Smalltalk might not be particularly
> convincing case for macros to a Smalltalker.

The paradox of LOOP is that if you confine yourself only to the loops
that look stylistically clean and simple (which are the ones that Smalltalk
probably does well), you get a subset that would never have caused LOOP
to come into existence in the first place.

Like it or not, LOOP was invented to allow you to write the complex stuff
that is hard to express in terms of simple functional iterators because
the control structure is a tangle.  LOOP's real power comes when you are
mixing multiple sources and collection styles at once, and I daresay that
for all these may look contorted in LOOP, they probably look worse in
Smalltalk.
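
A small made-up illustration of the sort of mixing I mean:

        (loop for name in '("ann" "bob" "cal" "dee")
              for score across #(71 48 90 65)
              when (>= score 60)
                collect name into passed
              else
                collect name into failed
              sum score into total
              count t into n
              finally (return (values passed failed (/ total n))))
        ;; => ("ann" "cal" "dee"), ("bob"), 137/2

Two sources stepped in parallel, two collections, a running sum and a
count, all in one pass.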

The world is not always pretty.  When it is not, one must not blame the
lack of prettiness on the language that tries to emulate it.  Sometimes
tangled webs of reality lead to tangled webs in programs.  Saying that
Smalltalk programmers might be content not to deal with such cases
resonates in my brain as if you're saying Smalltalk programmers might be
content not to deal with such cases of reality.  Your mileage will
doubtless vary.
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3bs8wxzbr.fsf@dino.dnsalias.com>
Kent M Pitman <······@world.std.com> writes:
> The world is not always pretty.  When it is, one must not blame the lack
> of prettiness on the language that tries to emulate it.  Sometimes tangled
> webs of reality lead to tangled webs in programs.  Saying that Smalltalk
> programmers might be content not to deal with such cases resonates in
> my brain as if you're saying Smalltalk programmers might content not to deal
> with such cases of reality.  Your mileage will doubtless vary.

I didn't say that Smalltalk programmers are content not to deal with
the cases.  I said that the block/closures approach gets you a long
way without having to create a language for iteration.  For the cases
that are left, it is a judgement call whether using LOOP or
defining your own special purpose iterator macro is better than
grinding it out using the facilities that you do have.  The more and
more times you have to deal with such complex loops the more
attractive LOOP (and the ability to define your own iteration macros)
becomes.  However, for those not familiar with LOOP and/or defining
special purpose iteration macros (which shouldn't be anyone now that
this has been restricted to comp.lang.lisp) then simply mentioning
LOOP or macros for iteration isn't necessarily going to mean much.
For them, concrete examples are needed, preferably ones that aren't
easy to do using a blocks/closure approach to iteration.
From: Ingvar Mattsson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87vg74qbf6.fsf@gruk.tech.ensign.ftech.net>
················@gte.net (JRStern) writes:

> On Wed, 24 Jul 2002 17:51:59 GMT, "Shayne Wissler"
> <·········@yahoo.com> wrote:
> >> > There is no way to do that as easily in any other language.
> >>
> >> You can do this this in any language with a sufficiently powerful macro
> >> system, not just CL.
> >
> >What other languages have such a system (Curl, I presume?)?
> 
> Who cares?
> 
> You can do it in a method/function/procedure, instead of a macro in
> virtually any language, or wrap it in a class, which is probably
> neater still.

One macro I wrote recently (as part of implementing an NNTP library
for Common Lisp; it's not ready yet, the previous version was *very*
much too icky to let anyone look at) is:

(defmacro prefixcase (var &rest body)
  (let ((lenvar (gensym))
        (gvar (gensym)))
    `(let* ((,gvar ,var)
            (,lenvar (length ,gvar)))
       (cond
         ,@(loop for expr in body
                 if (listp (car expr))
                   collect `((or ,@(loop for str in (car expr)
                                         collect `(string= ,gvar ,str
                                                           :end1 (min ,lenvar ,(length str))
                                                           :end2 (min ,lenvar ,(length str)))))
                             ,@(cdr expr))
                 else
                   collect `((string= ,gvar ,(car expr)
                                      :end1 (min ,lenvar ,(length (car expr)))
                                      :end2 (min ,lenvar ,(length (car expr))))
                             ,@(cdr expr)))))))

This is basically used to do something similar to:
(prefixcase header
  ("Subject: " ...)
  ("From: " ...)
  ...)

in order to simplify access to some (not all) headers in a news
article.

Now, I could have done one of several things: written the code out in
full, hunted down a more suitable pre-existing macro (CASE? Does it
take a specific test function? If so I could've just written a
string-prefix= and gone from there, I guess), or used a function taking
(say) a header and a list of prefix/lambda pairs.
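
For comparison, the function-based version would look something like
this (a sketch; STRING-PREFIX= and DISPATCH-ON-PREFIX are just names
I'm making up here):

(defun string-prefix= (prefix string)
  (let ((n (length prefix)))
    (and (>= (length string) n)
         (string= prefix string :end2 n))))

(defun dispatch-on-prefix (header handlers)
  ;; HANDLERS is a list of (prefix . function-of-header).
  (loop for (prefix . handler) in handlers
        when (string-prefix= prefix header)
          return (funcall handler header)))

;; (dispatch-on-prefix header
;;   (list (cons "Subject: " (lambda (h) ...))
;;         (cons "From: "    (lambda (h) ...))))

which works, but every branch now drags a LAMBDA around.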

I can't see how wrapping this in an object would be clearer, and I do
think this is easier to read than calling a separate function to do
the checking.

//Ingvar
-- 
When C++ is your hammer, everything looks like a thumb
	Latest seen from Steven M. Haflich, in c.l.l
From: Christopher Barber
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <psou1mocy2x.fsf@boris.curl.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> "Christopher Barber" <·······@curl.com> wrote in message
> ····················@boris.curl.com...
> 
> > > There is no way to do that as easily in any other language.
> >
> > You can do this this in any language with a sufficiently powerful macro
> > system, not just CL.
> 
> What other languages have such a system (Curl, I presume?)?

Yes, Curl is one, although macros are new with the latest release (macros have
actually been part of the internal implementation of Curl since the beginning
-- much of the standard syntax is actually implemented using macros -- but the
ability for user code to define macros was not supported until we had time to
clean the feature up for public use).

Dylan and Scheme are other languages with macros.  Of course, the C
preprocessor also provides a crude, inelegant and yet still useful macro
system.

Many scripting languages have macros or macro-like capabilities.  For
instance, it is fairly easy to implement arbitrary syntax extensions in Tcl,
although I don't think it has formal macros.

There are also extensions to various non-macro languages providing macro
capabilities.

Finally, aren't CL macros non-hygienic?

- Christopher
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d40138e.3654404@news.verizon.net>
On 24 Jul 2002 22:59:02 -0400, Christopher Barber <·······@curl.com>
wrote:
>Finally, aren't CL macros non-hygienic?

Could you expand on that?

J.
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahpd23$rcd$1@rznews2.rrze.uni-erlangen.de>
JRStern wrote:

> On 24 Jul 2002 22:59:02 -0400, Christopher Barber <·······@curl.com>
> wrote:
>>Finally, aren't CL macros non-hygienic?
> 
> Could you expand on that?

The (rather dumb) name "hygienic" in this context means that the macro 
system does not (not to say "is not able" to  ;-) capture variables in the 
context where the macro gets expanded.

Lets look at an example:

(defmacro foo (a)
  `(+ ,a b))

This macro called with (foo 3) would expand to (+ 3 b) and therefore would 
capture any variable "b" in the context of the expansion.

(let ((b 1))
  (foo 3))
 
==

(let ((b 1))
  (+ 3 b))

While such behaviour can sometimes be what one wants, there is the 
possibility of capturing a variable without wanting to.

(defmacro dovector ((item vector) &body body)
 `(let ((vector ,vector))
    (dotimes (index (length vector))
       (let ((,item (aref vector index)))
          ,@body)))) 

(dovector (i #(1 2 3)) (print i))

would expand to

(LET ((VECTOR #(1 2 3)))
  (DOTIMES (INDEX (LENGTH VECTOR))
    (LET ((I (AREF VECTOR INDEX))) 
      (PRINT I))))

As you can see there are three variables captured in the body of dovector - 
the symbol given as the argument "item", vector and index.  Only the one 
given through "item" is really wanted.
Now imagine you have a variable binding with one of those names around a 
dovector.

(let ((index 7))
  (dovector (i #(1 2 3))
     ...))

Now the programmer might be confused that within the dovector his index is 
shadowed.

We can solve this problem by creating fresh symbols with GENSYM here:

(defmacro dovector ((item vector) &body body)
  (let ((vector-sym (gensym))
         (index-sym (gensym)))
   `(let ((,vector-sym ,vector))
      (dotimes (,index-sym (length ,vector-sym))
         (let ((,item (aref ,vector-sym ,index-sym)))
            ,@body)))) 

Now neither index nor vector are captured any longer.

In "Hygienic" macrosystems such variable capture cannot occur but you  buy 
that opportunity with a _huge_ reduction in flexibility in the macrosystem.

The "Hygienic" macrosystem issue is actually a very old point of hot 
discussions between some Common Lisp and Scheme people. In Scheme being a 
Lisp dialect with one namespace for functions and values unintented 
variable capture is much more critical than in CL (which has distinct 
namespaces). Actually it seems as if the "Hygienic Macros" issue does not 
seem to be a real problem for Common Lisp as it is for Scheme...

ciao,
Jochen

--
http://www.dataheaven.de
From: Christopher Barber
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <psolm7zzls8.fsf@clock-king.curl.com>
Jochen Schmidt <···@dataheaven.de> writes:

<excellent description of macro hygiene ...>

> In "Hygienic" macrosystems such variable capture cannot occur but you  buy 
> that opportunity with a _huge_ reduction in flexibility in the macrosystem.

Not necessarily.  As long as the macro system lets you capture variables when
you want to, you lose no flexibility.  The question is really whether code
expanded in macros should be hygienic by default.
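
Right.  The CL side of that coin is that capture is sometimes exactly
what you want -- the classic example being an anaphoric IF that
deliberately binds IT for the caller:

(defmacro aif (test then &optional else)
  `(let ((it ,test))
     (if it ,then ,else)))

;; (aif (find 3 '(1 2 3 4)) (print it) (print "not found"))  prints 3

A hygienic system needs an explicit escape hatch to express that;
DEFMACRO gets it for free (and pays for it with the GENSYM ritual
everywhere else).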

- Christopher
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahphtp$bc6$1@rznews2.rrze.uni-erlangen.de>
Christopher Barber wrote:

> Jochen Schmidt <···@dataheaven.de> writes:
> 
> <excellent description of macro hygiene ...>
> 
>> In "Hygienic" macrosystems such variable capture cannot occur but you 
>> buy that opportunity with a _huge_ reduction in flexibility in the
>> macrosystem.
> 
> Not necessarily.  As long as the macro system lets you capture variables
> when
> you want to, you lose no flexibility.  The question is really whether code
> expanded in macros should be hygienic by default.

To me the question is more like - "Is this really a problem?". At least for 
CL I would say it is no real problem and there are means that make it easy 
to cope with such things.

ciao,
Jochen

--
http://www.dataheaven.de
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d40500f.19143817@news.verizon.net>
On Thu, 25 Jul 2002 19:41:35 +0200, Jochen Schmidt <···@dataheaven.de>
wrote:
>> On 24 Jul 2002 22:59:02 -0400, Christopher Barber <·······@curl.com>
>> wrote:
>>>Finally, aren't CL macros non-hygienic?
>> 
>> Could you expand on that?
>
>The (rather dumb) name "hygienic" in this context means that the macro 
>system does not (not to say "is not able" to  ;-) capture variables in the 
>context where the macro gets expanded.
...

Very clear explanation, thanks.

J.

PS - I'm reading this on comp.object, and your message follow-up seems
to be set only to comp.lang.lisp.  Haven't seen many recent
discussions cross those two groups, it's refreshing.
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahpdcs$rkv$1@rznews2.rrze.uni-erlangen.de>
Jochen Schmidt wrote:

> (defmacro dovector ((item vector) &body body)
>   (let ((vector-sym (gensym))
>          (index-sym (gensym)))
>    `(let ((,vector-sym ,vector))
>       (dotimes (,index-sym (length ,vector-sym))
>          (let ((,item (aref ,vector-sym ,index-sym)))
>             ,@body))))
     
I missed a closing paren at the very end here - the DEFMACRO form needs one more ")".

ciao,
Jochen

--
http://www.dataheaven.de
From: Christopher Browne
Subject: Washing Your Hands Befor Eating...
Date: 
Message-ID: <aif36b$13ljkp$13@ID-125932.news.dfncis.de>
A long time ago, in a galaxy far, far away, ················@gte.net (JRStern) wrote:
> On 24 Jul 2002 22:59:02 -0400, Christopher Barber <·······@curl.com>
> wrote:
>>Finally, aren't CL macros non-hygienic?
>
> Could you expand on that?

They don't wash their hands before they eat.

More specifically, they don't automagically intern their own symbols
so as to guarantee that you won't have clashes between macro-generated
symbols and your own.

For instance, if you use a FOR loop thus:
(defmacro for ((var start stop) &body body)
  `(do ((,var ,start (1+ ,var))
	(limit ,stop))
    ((> ,var limit))
    ,@body))

> (princ (macroexpand-1 
	'(for (foo 1 10)
	  (format t "Foo is: ~D~%" foo))))
(DO ((FOO 1 (1+ FOO)) (LIMIT 10)) ((> FOO LIMIT)) (FORMAT T Foo is: ~D~% FOO))

which is well and fine.  Not so well and fine is...

> (princ (macroexpand-1 
	'(for (limit 1 10)
	  (format t "Limit is: ~D~%" limit))))

(DO ((LIMIT 1 (1+ LIMIT)) (LIMIT 10)) ((> LIMIT LIMIT)) (FORMAT T Limit is: ~D~% LIMIT))

Not being "hygenic," the names generated inside the macro aren't
independent of "the rest of the world."

This macro didn't "wash its hands," and if you run the second FOR
loop, you'll find that it never terminates.
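
For comparison, a gensym-based rewrite of the same macro (just a sketch of
the usual fix) replaces the literal LIMIT with a fresh symbol:

(defmacro for ((var start stop) &body body)
  ;; The limit variable is a gensym, so user code may freely use a
  ;; variable named LIMIT without it being captured.
  (let ((limit (gensym "LIMIT")))
    `(do ((,var ,start (1+ ,var))
          (,limit ,stop))
         ((> ,var ,limit))
       ,@body)))

With that version the second expansion binds a fresh uninterned symbol
instead of LIMIT, and the loop terminates as expected.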

In Scheme, the "define-syntax" system would automagically do some
GENSYM calls behind the scenes so that you need not worry about
whether the names you use for things in the macro conflict with
anything.  Things are kept "hygienic."

There are legitimate arguments in favor of both approaches.  DEFMACRO
is likely to sometimes be nicer than define-syntax, and vice-versa.
-- 
(reverse (concatenate 'string ····················@" "454aa"))
http://cbbrowne.com/info/lisp.html
Early to rise,
And early to bed,
Makes a man healthy
But socially dead. 
From: Matthias Blume
Subject: Re: Washing Your Hands Befor Eating...
Date: 
Message-ID: <m3fzxw8yur.fsf@hana.shimizu-blume.com>
Christopher Browne <········@acm.org> writes:

> A long time ago, in a galaxy far, far away, ················@gte.net (JRStern) wrote:
> > On 24 Jul 2002 22:59:02 -0400, Christopher Barber <·······@curl.com>
> > wrote:
> >>Finally, aren't CL macros non-hygienic?
> >
> > Could you expand on that?
> 
> They don't wash their hands before they eat.
> 
> More specifically, they don't automagically intern their own symbols
> so as to guarantee that you won't have clashes between macro-generated
> symbols and your own.
> 
> For instance, if you use a FOR loop thus:
> (defmacro for ((var start stop) &body body)
>   `(do ((,var ,start (1+ ,var))
> 	(limit ,stop))
>     ((> ,var limit))
>     ,@body))
> 
> > (princ (macroexpand-1 
> 	'(for (foo 1 10)
> 	  (format t "Foo is: ~D~%" foo))))
> (DO ((FOO 1 (1+ FOO)) (LIMIT 10)) ((> FOO LIMIT)) (FORMAT T Foo is: ~D~% FOO))
> 
> which is well and fine.  Not so well and fine is...
> 
> > (princ (macroexpand-1 
> 	'(for (limit 1 10)
> 	  (format t "Limit is: ~D~%" limit))))
> 
> (DO ((LIMIT 1 (1+ LIMIT)) (LIMIT 10)) ((> LIMIT LIMIT)) (FORMAT T Limit is: ~D~% LIMIT))
> 
> Not being "hygenic," the names generated inside the macro aren't
> independent of "the rest of the world."
> 
> This macro didn't "wash its hands," and if you run the second FOR
> loop, you'll find that it never terminates.
> 
> In Scheme, the "define-syntax" system would automagically do some
> GENSYM calls behind the scenes so that you need not worry about
> whether the names you use for things in the macro conflict with
> anything.  Things are kept "hygienic."

This is not the whole (and, in fact, just the easy part of the) story.

Hygienic macros can insert identifiers into their expansion that refer
to bindings in the context of the macro's definition -- and they can
do that even if apart from the macro system there is (because of
scoping issues) no other way of referring to these bindings at the
source location where the macro gets instantiated.
From: Kaz Kylheku
Subject: Re: Washing Your Hands Befor Eating...
Date: 
Message-ID: <aihbjb$lnq$5@luna.vcn.bc.ca>
In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> Christopher Browne <········@acm.org> writes:
>> In Scheme, the "define-syntax" system would automagically do some
>> GENSYM calls behind the scenes so that you need not worry about
>> whether the names you use for things in the macro conflict with
>> anything.  Things are kept "hygienic."
> 
> This is not the whole (and, in fact, just the easy part of the) story.
> 
> Hygienic macros can insert identifiers into their expansion that refer
> to bindings in the context of the macro's definition -- and they can

Hygiene must be defined by what a given *instance* of a macro
*does not* do, not by what the *class* of all macros *can* potentially do.

If you write a macro that does not try to access lexical bindings in
its defining scope, it doesn't matter whether this is possible or not;
the macro is not *prevented* from being hygienic on the account of
something that it does not try to do.

In Lisp, a macro whose expansion utters an identifier in hopes of
reaching a binding in the macro definition's lexical scope, 
is simply an incorrect macro. The language provides alternate
ways for macros to safely refer to context data maintained near
the site of their definition.  These alternate ways lead to macros
that produce hygienic expansions, and yet do what is needed.

A hygienic macro expansion is one which doesn't produce any surprising
interactions with the surrounding program, or any embedded pieces of it.
It is correct in every situation of its use.  That's it.

The real question is this: in a given macro system, what, or how much, can the
programmer accomplish, without having to give up hygiene? 

When you have to give up hygiene, it doesn't mean that your macros are
incorrect, only that they are fragile. Their use requires cooperation from the
user, who must obey certain rules.   The lack of hygiene is understood and
documented.  Namespace rules may have to be introduced; for example the macro
module may claim all identifiers having a given prefix. The user agrees not to
use identifiers in this space, in exchange for the guarantee that the macros
won't produce surprises.

In the C language, for instance, the language standard defines certain prefixes
which a C program cannot use for identifiers. There are a few classes of
prefixes, with various rules outlining when you can and cannot get away with
them. Designers of additional library extensions do similar things, e.g POSIX.
An implementor of a standard C library can write macros that have internal
identifiers; these just have to start with two underscores, or an underscore
and a capital letter. The macros are hygienic, because a C program which uses
these identifiers invokes undefined behavior; the same people who defined
the namespace also decide what constitutes a correct C program, thereby
tying their macro hygiene to program correctness.

Lisp implementations do the same thing, incidentally.  For example,
some macros provided by CLISP make references to symbols
in a system package, rather than using gensyms.

  [1]> (macroexpand '(destructuring-bind (a) nil))
  (LET ((SYSTEM::<DESTRUCTURING-FORM> NIL))
   (IF (/= (LENGTH SYSTEM::<DESTRUCTURING-FORM>) 1)
     (SYSTEM::DESTRUCTURING-ERROR SYSTEM::<DESTRUCTURING-FORM> '(1 . 1))
       (LET* ((A (CAR SYSTEM::<DESTRUCTURING-FORM>)))))) ;

Can we call this unhygienic, on account of introducing a binding
for SYSTEM::<DESTRUCTURING-FORM>? No, because the user has no business using
symbols from that package. 

A Lisp macro which mistakenly refers to an identifier in its defining context
is not one of these macros whose correct use depends on the observation of
rules---it's simply an incorrect one.  The mistake is not one of failing to
provide hygiene, but one of requesting nonexistent semantics from the language.
From: Frederic Brunel
Subject: Re: Washing Your Hands Befor Eating...
Date: 
Message-ID: <la7kj56ycr.fsf@buzz.in-fusio.com>
Christopher Browne <········@acm.org> writes:

> More specifically, they don't automagically intern their own symbols
> so as to guarantee that you won't have clashes between macro-generated
> symbols and your own.

Sometimes, it may be interesting to have this behavior. Look at the
anaphoric macros defined by Paul Graham in On Lisp.
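
An anaphoric macro deliberately captures a fixed symbol so the body can
refer to the value just computed; a rough sketch in the style of On Lisp's
AIF (PROCESS below is just a placeholder):

;; IT is deliberately captured: inside THEN it names the value of TEST.
(defmacro aif (test then &optional else)
  `(let ((it ,test))
     (if it ,then ,else)))

;; e.g. (aif (gethash key table) (process it) nil)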

-- 
Frederic Brunel
Software Engineer
In-Fusio - Mobile Game Connections
From: Christopher Browne
Subject: Re: Washing Your Hands Befor Eating...
Date: 
Message-ID: <aimls3$15il1t$1@ID-125932.news.dfncis.de>
Frederic Brunel <···············@in-fusio.com> wrote:
> Christopher Browne <········@acm.org> writes:
>> More specifically, they don't automagically intern their own
>> symbols so as to guarantee that you won't have clashes between
>> macro-generated symbols and your own.

> Sometimes, it may be interesting to have this behavior. Look at the
> anaphoric macros defined by Paul Graham in OnLisp.

It's "interesting," yes.  I find that I hate the idea of actually
using those macros.
-- 
(reverse (concatenate 'string ··········@" "enworbbc"))
http://www.ntlug.org/~cbbrowne/nonrdbms.html
REALITY is an illusion that stays put.
From: Hannah Schroeter
Subject: Re: Washing Your Hands Befor Eating...
Date: 
Message-ID: <aimm74$rdo$1@c3po.schlund.de>
Hello!

Christopher Browne  <········@acm.org> wrote:
>[...]

>> Sometimes, it may be interesting to have this behavior. Look at the
>> anaphoric macros defined by Paul Graham in OnLisp.

>It's "interesting," yes.  I find that I hate the idea of actually
>using those macros.

And it would not be too difficult to transform them into "hygienic"
variants by requiring the bound symbol to be passed as an additional
macro parameter.
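
Roughly, such a variant might look like this (a sketch; AIF2 and the
argument names are made up):

;; The caller chooses the symbol to bind, so nothing is captured silently.
(defmacro aif2 ((var test) then &optional else)
  `(let ((,var ,test))
     (if ,var ,then ,else)))

;; e.g. (aif2 (result (gethash key table)) (process result) nil)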

However, there are other macros which define other functions
or macros locally in their expansion. Look for "Local Macro LOOP-FINISH"
in the CLHS for an example.

Kind regards,

Hannah.
From: Michael Hudson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <lk8z3zlu4x.fsf@pc150.maths.bris.ac.uk>
Christopher Barber <·······@curl.com> writes:

> Finally, aren't CL macros non-hygienic?

Yes, but I singularly fail to understand why this is considered a
problem so long as you aren't really crap at writing macros...

Cheers,
M.

-- 
  I've even been known to get Marmite *near* my mouth -- but never
  actually in it yet.  Vegamite is right out.
 UnicodeError: ASCII unpalatable error: vegamite found, ham expected
                                       -- Tim Peters, comp.lang.python
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87bs8vqik3.fsf@becket.becket.net>
Michael Hudson <···@python.net> writes:

> Christopher Barber <·······@curl.com> writes:
> 
> > Finally, aren't CL macros non-hygienic?
> 
> Yes, but I singularly fail to understand why this is considered a
> problem so long as you aren't really crap at writing macros...

Um, this is the argument that was used to explain why FORTRAN wasn't
an improvement on assembly, why Algol wasn't an improvement on
FORTRAN, etc., etc.

The whole problem is that people *do* make silly stupid mistakes all
the time with the most trivial of programming tasks, and that higher
level languages do a great service by preventing the need for us to
keep bothering with thinking about how to code bignum multiplication.

I'm surprised especially to see the argument on a Lisp newsgroup that
there is no real advantage to high-level facilities, since they are
only needed if you are "crap" at writing things yourself.
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahqltc$q46$1@rznews2.rrze.uni-erlangen.de>
Thomas Bushnell, BSG wrote:

> Michael Hudson <···@python.net> writes:
> 
>> Christopher Barber <·······@curl.com> writes:
>> 
>> > Finally, aren't CL macros non-hygienic?
>> 
>> Yes, but I singularly fail to understand why this is considered a
>> problem so long as you aren't really crap at writing macros...
> 
> Um, this is the argument that was used to explain why FORTRAN wasn't
> an improvement on assembly, why Algol wasn't an improvement on
> FORTRAN, etc., etc.
> 
> The whole problem is that people *do* make silly stupid mistakes all
> the time with the most trivial of programming tasks, and that higher
> level languages do a great service by preventing the need for us to
> keep bothering with thinking about how to code bignum multiplication.
> 
> I'm surprised especially to see the argument on a Lisp newsgroup that
> there is no real advantage to high-level facilities, since they are
> only needed if you are "crap" at writing things yourself.

I cannot speak for the original poster, but I don't think that was really 
the point.
I too have often wondered why hygienic macros, with all their cost, are 
necessary. There are mainly two problems to solve with macros: "multiple 
evaluation" and "variable capture". CL offers all you need to cope with 
these problems, and you can even use utility macros like WITH-UNIQUE-NAMES 
and REBINDING to make it a bit nicer.

Using these facilities, my example from the other post looks like this:

(defmacro dovector ((item vector) &body body)
  (with-unique-names (index)
    (rebinding (vector)
      `(dotimes (,index (length ,vector))
         (let ((,item (aref ,vector ,index)))
           ,@body)))))
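
Neither WITH-UNIQUE-NAMES nor REBINDING is part of the ANSI standard; they
are common utility macros. A minimal sketch of WITH-UNIQUE-NAMES (REBINDING,
which arranges for the given forms to be evaluated once and bound to gensyms
in the expansion, is in the same spirit but a little longer):

(defmacro with-unique-names ((&rest names) &body body)
  ;; Bind each NAME to a fresh uninterned symbol for use in the expansion.
  `(let ,(mapcar (lambda (name) `(,name (gensym ,(string name)))) names)
     ,@body))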
 
What you get is "hygienic if wanted" without any lossage of flexibility in 
the macro system. Schemes solution to that is like cutting off your legs to 
circumvent the possible need to wear trousers in public...

What many do not understand is why some people still see a problem with this 
in CL when there really seems to be none. Every now and then someone comes 
along saying "but CL does not have hygienic macros", completely failing to 
see that such a thing does not exist because the community simply has no 
need for it (it would exist for sure if it were useful - hygienic macros 
are no rocket science).

This has nothing to do with "high-level facilities considered harmful" or 
Assembler vs. FORTRAN stories. I really do not see Scheme's pattern-matching 
macros as a higher-level macro facility.

ciao,
Jochen

--
http://www.dataheaven.de
From: Paul D. Lathrop
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Xns925677E5DD815pdlathrocharterminet@216.168.3.40>
Christopher Barber <·······@curl.com> wrote in
····················@boris.curl.com: 
> Finally, aren't CL macros non-hygienic?
> 
> - Christopher

Sorry to sound ignorant, but what does non-hygienic mean in 
this context?

Paul D. Lathrop
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3n0sdrcr8.fsf@hana.shimizu-blume.com>
"Paul D. Lathrop" <········@chartermi.net> writes:

> Christopher Barber <·······@curl.com> wrote in
> ····················@boris.curl.com: 
> > Finally, aren't CL macros non-hygienic?
> > 
> > - Christopher
> 
> Sorry to sound ignorant, but what does non-hygienic mean in 
> this context?

It means two things:

Easy part: A free occurrence of an identifier within a macro argument
can get captured by a local binding that the macro inserts (if the
name bound in that binding happens to clash with the name of said
identifier).  This part is easy to handle in non-hygienic systems by
being careful when writing the macro definition: One just needs to
make sure that fresh identifiers are being used for such local
bindings (gensym).  A hygienic system will automatically handle this
without need for (explicit) gensyms.

  Classic example (illustrated using C's crappy macro system):

      # define swap(typ,x,y) { typ tmp = x; x = y; y = tmp; }
      ...
      double temp, oldtemp;  /* current, old temperature */
      ...
          swap(double,temp,oldtemp);
      ...
  (The solution -- which, unlike in Lisp, is unavailable in C -- is to
   avoid a fixed name like "tmp" within the macro's definition in favor of
   a freshly generated name.)
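
   In Lisp that fix is straightforward; a sketch of the same swap written
   with a freshly generated temporary (in practice CL's ROTATEF already
   covers this):

      (defmacro swap (x y)
        ;; TMP is a gensym, so a caller's own TMP cannot be captured.
        (let ((tmp (gensym "TMP")))
          `(let ((,tmp ,x))
             (setf ,x ,y)
             (setf ,y ,tmp))))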

Difficult part: A free occurrence of an identifier within a macro's
expansion (when the occurrence does not stem from a corresponding free
occurrence of this identifier in the macro instantiation's argument)
can get captured by a local binding of the same identifier when the
binding's scope contains the macro's instantiation but not the macro's
definition.
I know of no reliable way (short of having control over the entire
program -- or, at least all of the code that occurs "between" a
macro's definition and the corresponding use) to reliably avoid this
situation.  (A hygienic macro expander works by effectively
maintaining global control over the entire program, rewriting even
pieces of code that are not part of any macro instantiations.)  Being
"careful" with or "good at" writing macros alone definitely does not
solve this problem.

   Example:

      int num_errors = 0;
      #define error(s) (num_errors++, fputs (s, stderr))
      ...
      double f (double x, double y, double num_error /* numerical error */)
      {
         ...
         if (<some abnormal condition>)
            error("abnormal condition occurred");
         ...
      }

   Notice how the call of "error" within f will fail to increment the
   global error counter while instead screwing with a totally unrelated
   variable...
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7csn25xxs5.fsf@sindri.juniper.net>
> Difficult part:

While difficult, it is not impossible to overcome.  Before I go into
that, I'm going to go into a little more detail about the problem, for
the sake of the readers.

(Prescript: This message is a good example of Kent's paper against
crossposting.  If I were talking to just Lispers, it'd be a lot
shorter since I wouldn't have to explain the Lisp!)

> A free occurrence of an identifier within a macro's expansion (when
> the occurrence does not stem from a corresponding free occurrence of
> this identifier in the macro instantiation's argument) can get
> captured by a local binding of the same identifier when the
> binding's scope contains the macro's instantiation but not the
> macro's definition.

Here's a trivial example of what you're talking about:

    ;; We define a function FN.
    (defun fn (x)
      (+ x 5))

    ;; Now we write a macro which uses it.  This transforms calls to
    ;; (mymacro foo) into (fn foo).
    (defmacro mymacro (arg)
      `(fn ,arg))

    ;; Somewhere else, we lexically redefine FN.
    (flet ((fn (x) (+ x 10)))
      ;; With this in scope, we call MYMACRO.
      (mymacro 1))

This last will return 11, not 6.  If whoever calls MYMACRO doesn't
know that it relies on FN, this may be a surprise.

Presumably, the writer of MYMACRO expected #'FN to be the (+ x 5)
version.  Some consider this a feature.  (I'm not taking a position on
this question.)

With a working package system, it's much less likely to be a problem,
since somebody redefining PACKAGE::FN is probably familiar with its
usage in the rest of PACKAGE, so problems from reassignments are not
as likely to happen.  In our above example, if the FLET form was read
into a different package, then it would be defining a different FN,
and the problem wouldn't occur.
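
A rough sketch of that situation (package and names made up for
illustration):

    (defpackage #:mypkg (:use #:cl) (:export #:mymacro))
    (in-package #:mypkg)

    (defun fn (x) (+ x 5))

    (defmacro mymacro (arg)
      ;; FN here is read as MYPKG::FN, not as a symbol in the caller's package.
      `(fn ,arg))

    ;; In some other package:
    ;; (flet ((fn (x) (+ x 10)))   ; binds e.g. CL-USER::FN, a different symbol
    ;;   (mypkg:mymacro 1))        ; => 6; the FLET does not capture MYPKG::FN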

One aspect of this is that functions and variables have separate
namespaces in Common Lisp.  This means that people don't create new
function bindings arbitrarily.

The issue of hygienic macros is normally heard among Scheme
programmers.  Scheme does not have a package system[1].  Moreover,
Scheme uses a single namespace for variables and functions[2], which
means collisions are more likely, since all variable and argument
bindings are also bindings into the function namespace.  (Normally, in
Common Lisp, the variable namespace is dealt with by creating
temporary symbols -- called "gensyms" -- that are in no package, thus
can't be referenced elsewhere.)

> I know of no reliable way (short of having control over the entire
> program -- or, at least all of the code that occurs "between" a
> macro's definition and the corresponding use) to reliably avoid this
> situation.  (A hygienic macro expander works by effectively
> maintaining global control over the entire program, rewriting even
> pieces of code that are not part of any macro instantiations.)
> Being "careful" with or "good at" writing macros alone definitely
> does not solve this problem.

No, I think that gensyms can be used for that too.  To adapt our
previous example:

    (defun fn (x)
      (+ x 5))

    (defmacro mymacro (arg)
      ;; We create a temporary symbol, and name it "FN" during the
      ;; macro processing (but not its execution).
      (let ((fn (gensym)))
        ;; We set the function definition of our temporary symbol
        ;; to the global function definition of FN.
        (setf (symbol-function fn) #'fn)
        ;; Now, instead of putting FN down as the function to call,
        ;; we inject our temporary symbol.
        `(,fn ,arg)))

    (flet ((fn (x) (+ x 10)))
      (mymacro 1))

Cheers,
joelh

[1] I'm referring to R5RS-standard Scheme.  It's entirely feasible for
an implementation to define a package system, but this would be
non-standard and non-portable.

[2] I don't believe it's possible for a Scheme implementation to
define separate namespaces for these and still be compatible with all
R5RS-compliant programs.  At least, I've never seen it done, but I
haven't surveyed many Scheme implementations.
From: Fredrik Sandstrom
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <slrnakacc3.gbh.fredrik@deepthought.coax>
In article <···············@sindri.juniper.net>, Joel Ray Holveck wrote:
>     (defmacro mymacro (arg)
>       ;; We create a temporary symbol, and name it "FN" during the
>       ;; macro processing (but not its execution).
>       (let ((fn (gensym)))
>         ;; We set the function definition of our temporary symbol
>         ;; to the global function definition of FN.
>         (setf (symbol-function fn) #'fn)
>         ;; Now, instead of putting FN down as the function to call,
>         ;; we inject our temporary symbol.
>         `(,fn ,arg)))

Hmm, why does Common Lisp not accept the function itself in the car of a
form to be evaluated? If that were the case we could simply write

(defmacro mymacro (arg)
  `(,#'fn ,arg))

not needing a gensym at all in this case.

(Followups set to comp.lang.lisp)


-- 
- Fredrik Sandstrom   ·······@infa.abo.fi   http://infa.abo.fi/~fredrik -
               Computer Science at Abo Akademi University              --
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai5ak1$hvf$1@luna.vcn.bc.ca>
In article <······················@deepthought.coax>, Fredrik Sandstrom wrote:
> In article <···············@sindri.juniper.net>, Joel Ray Holveck wrote:
>>     (defmacro mymacro (arg)
>>       ;; We create a temporary symbol, and name it "FN" during the
>>       ;; macro processing (but not its execution).
>>       (let ((fn (gensym)))
>>         ;; We set the function definition of our temporary symbol
>>         ;; to the global function definition of FN.
>>         (setf (symbol-function fn) #'fn)
>>         ;; Now, instead of putting FN down as the function to call,
>>         ;; we inject our temporary symbol.
>>         `(,fn ,arg)))
> 
> Hmm, why does Common Lisp not accept the function itself in the car of a
> form to be evaluated? If that were the case we could simply write

Because a function *object* is not a function *name*. Remember, Lisp has
a division of namespaces; the same symbol can simultaneously be bound to a
value and to a function. In other words, you can have this:

  (let ((x #'(lambda () 10)))
    (flet ((x () 20))
      (x)))

Here, x is the name of a value which happens to be a function, and is also the
name of a function.  Which one should the expression (x) call? The answer is
that it the x in that context is interpreted as a function name, and so the
flet is called, yielding the result 20. To call the closure, you would use, for
example, (funcall x).

There are some languages which resemble Lisp, but which have one binding
namespace, such as EuLisp and Scheme.  In these, this issue does not
arise, since the symbol x can refer to only one entity in a given lexical
scope. The designers of Lisp decided that it's advantageous to have separate
namespaces, so that users don't, for instance, have to worry about using a
symbol such as ``list'' for a variable, and that the slight inconveniences such
as having to explicitly funcall are worth it. Moreover, each symbol has
a property list, which is a way to invent your own bindings to new kinds of
entities. For example, in a music composition environment, you might 
simultaneously use the symbol x to denote a value, function or musical 
score. :)

> (defmacro mymacro (arg)
>   `(,#'fn ,arg))
> 
> not needing a gensym at all in this case.

Or you could use

  `(funcall #'fn ,arg)

It's not right to evaluate the #'fn using the comma, because
then you are trying to compute the function object at macroexpansion time!
The function might not exist at that time, or might not be visible in that
context (if, for instance, it is an FLET or LABELS binding that is meant to
be accessed by the expanded code in its context).

The above expression, instead, inserts the unevaluated (function fn) *form*
into the expansion; that (function fn) will be evaluated later, when the
funcall expression is evaluated.
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7cofcpag51.fsf@sindri.juniper.net>
>> Hmm, why does Common Lisp not accept the function itself in the car of a
>> form to be evaluated? If that were the case we could simply write
> Because a function *object* is not a function *name*.

Yes, but a lambda expression is not a function name either, and that's
accepted as the car of a form, ie:
  ((lambda (x) (+ x 1)) 5)

He's not saying, "why does it not work when I do this", he's saying
"why didn't the designers of CL make this valid".

> Remember, Lisp has a division of namespaces; the same symbol can be
> simultaneously bound to a function. In other words, you can have
> this:

I'm not sure what the division of namespaces has to do with the topic,
or why a function object is not valid as the car of a cons to be
evaluated.

>> (defmacro mymacro (arg)
>>   `(,#'fn ,arg))
>> not needing a gensym at all in this case.
> Or you could use
>   `(funcall #'fn ,arg)

No, since the discussion was about how to make a macro not depend on
what functions exist at evaluation time.  By using funcall, we've
required a function to exist.  (Mind you, it's illegal to redefine
funcall anyway, so it would probably be okay for the purpose in
question.)

> It's not right to evaluate the #'fn using the comma, because then
> you are trying to compute the function object at macroexpansion
> time!

That was more or less the point of the exercise.

Cheers,
joelh
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cd6t5e1g1.fsf@octagon.mrl.nyu.edu>
Joel Ray Holveck <·····@juniper.net> writes:

> >> Hmm, why does Common Lisp not accept the function itself in the car of a
> >> form to be evaluated? If that were the case we could simply write
> > Because a function *object* is not a function *name*.
> 
> Yes, but a lambda expression is not a function name either, and that's
> accepted as the car of a form, ie:
>   ((lambda (x) (+ x 1)) 5)
> 
> He's not saying, "why does it not work when I do this", he's saying
> "why didn't the designers of CL make this valid".

cl-prompt> ((lambda (x) (+ 1 x)) 5)

6

This is part of the CL standard.  Section 3.1.2.1.2 Conses as Forms.

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
715 Broadway 10th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7cn0s93ucs.fsf@sindri.juniper.net>
>>>> Hmm, why does Common Lisp not accept the function itself in the car of a
>>>>  form to be evaluated? If that were the case we could simply write
>>>  Because a function *object* is not a function *name*.
>> Yes, but a lambda expression is not a function name either, and that's
>> accepted as the car of a form, ie:
>>   ((lambda (x) (+ x 1)) 5)
>> He's not saying, "why does it not work when I do this", he's saying
>> "why didn't the designers of CL make this valid".
> cl-prompt>((lambda (x) (+ 1 x)) 5)
> 6
> This is part of the CL standard.  Section 3.1.2.1.2 Conses as Forms.

Right, right... I'm fine with that part.  (I was using the lambda
expression as an example that a symbol does not necessarily have to be
the car of a cons form.)

1. I know that a lambda as the car is valid.

2. I know that a symbol as the car is valid.

3. I know that a function as the car is invalid.

4. I don't know why the designers of CL instituted point 3.

Am I being clear yet?  Sorry about the vagueness before.

joelh
From: Barry Margolin
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <R2C19.21$jG2.1611@paloalto-snr1.gtei.net>
In article <···············@sindri.juniper.net>,
Joel Ray Holveck  <·····@juniper.net> wrote:
>1. I know that a lambda as the car is valid.
>
>2. I know that a symbol as the car is valid.
>
>3. I know that a function as the car is invalid.
>
>4. I don't know why the designers of CL instituted point 3.
>
>Am I being clear yet?  Sorry about the vagueness before.

Because there's no need for it.  Before LET was invented, #1 was the only
way to get local variables, so it was needed.  #2 is needed to call named
functions.  #3 isn't needed because FUNCALL and APPLY exist for that purpose.

Also, in general the only way to make use of an expression of type #3 would
be to construct it on the fly and then execute it using EVAL.  Since
FUNCALL and APPLY are much clearer ways of expressing this, what would be
the justification for allowing #3?

-- 
Barry Margolin, ······@genuity.net
Genuity, Woburn, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7c4reh3ry9.fsf@sindri.juniper.net>
>> 3. I know that a function as the car is invalid.
>> 4. I don't know why the designers of CL instituted point 3.
>> Am I being clear yet?  Sorry about the vagueness before.
> Also, in general the only way to make use of an expression of type #3 would
> be to construct it on the fly and then execute it using EVAL.  Since
> FUNCALL and APPLY are much clearer ways of expressing this, what would be
> the justification for allowing #3?

The discussion was in the context of a macro, whereby the macro
fetches (or generates) the function object at expand time and inserts
it in the appropriate position literally.

At the time, I was trying to avoid any funcall calls by name, to avoid
flet-trapping.  (It was a discussion of hygiene in macros.)  That's
why I deliberately avoided funcall/apply.

I ended up with this example:

    (defmacro mymacro (&rest args)
      (let ((fn (gensym)))
        (setf (symbol-function fn) #'foo)
        `(,fn ,@args)))
    (defun foo (x)
      (+ x 1))
    (flet ((foo (x) (+ x 2)))
      (mymacro 3))

Later I decided that this isn't quite right, since it captures the
function *definition* in place at the time of macro expansion, not the
function *binding*, which is probably what I should be trying to use.
There may also be some bad interaction with the compile-time
environment.

Then somebody pointed out that if #3 were possible, this would be
equivalent to the simpler:

    (defmacro mymacro (&rest args)
      `(,#'foo ,@args))

That's what got us wondering why #3 is not allowed by the standard.

It just seemed odd to require a lambda expression or symbol as the car
of a cons form, instead of a function designator.  The impact on
implementors would probably be minimal; one would assume that the
typecasing code is already in place, since a lambda expression is
allowed.

Cheers,
joelh
From: Barry Margolin
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <GFD19.24$jG2.1843@paloalto-snr1.gtei.net>
In article <···············@sindri.juniper.net>,
Joel Ray Holveck  <·····@juniper.net> wrote:
>The discussion was in the context of a macro, whereby the macro
>fetches (or generates) the function object at expand time and inserts
>it in the appropriate position literally.

Sorry, I've been on vacation for a week and I'm joining this discussion
late.

>Then somebody pointed out that if #3 were possible, this would be
>equivalent to the simpler:
>
>    (defmacro mymacro (&rest args)
>      `(,#'foo ,@args))
>
>That's what got us wondering why #3 is not allowed by the standard.

This would require that function objects be dumpable as immediate data in
compiled files, which is difficult to do right and not required by the
language.  You also can't do:

(defmacro mymacro ()
  `(list ,#'foo))

for the same reason.  Functions are first-class objects in that they can be
passed around as data at run-time, but they don't pass through the compiler
very easily (except for the process of defining them in the first place).
See the section of CLHS on literal objects in compiled files.

-- 
Barry Margolin, ······@genuity.net
Genuity, Woburn, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.
From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai70a5$mrs$1@rznews2.rrze.uni-erlangen.de>
Joel Ray Holveck wrote:

> Then somebody pointed out that if #3 were possible, this would be
> equivalent to the simpler:
> 
>     (defmacro mymacro (&rest args)
>       `(,#'foo ,@args))

And why is 

(defmacro mymacro (&rest args)
  `(funcall ,#'foo ,@args))

any worse?

I personally like the explicit use of funcall or apply over the implicit 
"car of the form" way in some situations.

I prefer for example 
(funcall (foo (* (+ x y) z))  a)

over

((foo (* (+ x y) z)) a)
 
> That's what got us wondering why #3 is not allowed by the standard.

Because it is not needed. Some people even find it less readable in some 
situations.
Sometimes one hears that evaluating the first argument is the "more natural" 
or "intuitive" way of doing it. I don't think so. It is in no way more 
intuitive that one should be able to compute something in the place where 
in most cases you write a function name.

(funcall (foo x) y)

contains nothing special: there is a function called "funcall" and two 
arguments given to it. So Common Lisp's explicit use of funcall in this case 
looks more consistent to my eyes.

> It just seemed odd to require a lambda expression or symbol as the car
> of a cons form, instead of a function designator.  The impact on
> implementors would probably be minimal; one would assume that the
> typecasing code is already in place, since a lambda expression is
> allowed.

The lambda expression is a special case and was added because people wanted 
it to be there. AFAIK a lambda expression is indeed often seen as some kind 
of name for a function. (lambda (a) (+ a 1)) is a name for "the function 
that adds one to its argument". So the use of a lambda expression in this 
position is not really inconsistent.

If you want to pursue this topic further - AFAIR there was a _very_ long 
thread about it in cll (see http://groups.google.com).

--
http://www.dataheaven.de
From: Erann Gat
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <gat-3007021312000001@k-137-79-50-101.jpl.nasa.gov>
In article <·················@paloalto-snr1.gtei.net>, Barry Margolin
<······@genuity.net> wrote:

> In article <···············@sindri.juniper.net>,
> Joel Ray Holveck  <·····@juniper.net> wrote:
> >1. I know that a lambda as the car is valid.
> >
> >2. I know that a symbol as the car is valid.
> >
> >3. I know that a function as the car is invalid.
> >
> >4. I don't know why the designers of CL instituted point 3.
> >
> >Am I being clear yet?  Sorry about the vagueness before.
> 
> Because there's no need for it.  Before LET was invented, #1 was the only
> way to get local variables, so it was needed.  #2 is needed to call named
> functions.  #3 isn't needed because FUNCALL and APPLY exist for that purpose.
> 
> Also, in general the only way to make use of an expression of type #3 would
> be to construct it on the fly and then execute it using EVAL.  Since
> FUNCALL and APPLY are much clearer ways of expressing this, what would be
> the justification for allowing #3?

One could use the same argument to show that ((lambda ...) ...) is not
needed.  After all, you can simply write (funcall (lambda ...) ...)

In fact, you could also argue that (<symbol> ...) is not needed because
you can write (funcall #'<symbol> ...).

It seems to me that the arguments for (<symbol> ...) and ((lambda ...)
...) are purely aesthetic, and that the same aesthetic argues for the
legality of ((function-maker ...) ...) rather than (funcall
(function-maker ...) ...)

E.
From: Barry Margolin
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <HMD19.25$jG2.1793@paloalto-snr1.gtei.net>
In article <····················@k-137-79-50-101.jpl.nasa.gov>,
Erann Gat <···@jpl.nasa.gov> wrote:
>One could use the same argument to show that ((lambda ...) ...) is not
>needed.  After all, you can simply write (funcall (lambda ...) ...)

True.

>In fact, you could also argue that (<symbol> ...) is not needed because
>you can write (funcall #'<symbol> ...).

But then you run into an infinite regress, since that expression has a
symbol in the car, and would have to be rewritten.  You have to have a base
case for the recursion to stop.

>It seems to me that the arguments for (<symbol> ...) and ((lambda ...)
>...) are purely aesthetic, and that the same aesthetic argues for the
>legality of ((function-maker ...) ...) rather than (funcall
>(function-maker ...) ...)

True.  I think the original justification for allowing lambda expressions
there was to look like the lambda calculus (hence the LAMBDA keyword).  To
really understand how this came about, you have to take a look at early
Lisps, which used M-Expressions rather than S-Expressions.  Common Lisp
retains lots of historical baggage left over from those days.

-- 
Barry Margolin, ······@genuity.net
Genuity, Woburn, MA
*** DON'T SEND TECHNICAL QUESTIONS DIRECTLY TO ME, post them to newsgroups.
Please DON'T copy followups to me -- I'll assume it wasn't posted to the group.
From: Erann Gat
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <gat-3007021706270001@192.168.1.50>
In article <·················@paloalto-snr1.gtei.net>, Barry Margolin
<······@genuity.net> wrote:

> In article <····················@k-137-79-50-101.jpl.nasa.gov>,
> Erann Gat <···@jpl.nasa.gov> wrote:
> >One could use the same argument to show that ((lambda ...) ...) is not
> >needed.  After all, you can simply write (funcall (lambda ...) ...)
> 
> True.
> 
> >In fact, you could also argue that (<symbol> ...) is not needed because
> >you can write (funcall #'<symbol> ...).
> 
> But then you run into an infinite regress, since that expression has a
> symbol in the car, and would have to be rewritten.  You have to have a base
> case for the recursion to stop.

FUNCALL and APPLY would become special forms.

> >It seems to me that the arguments for (<symbol> ...) and ((lambda ...)
> >...) are purely aesthetic, and that the same aesthetic argues for the
> >legality of ((function-maker ...) ...) rather than (funcall
> >(function-maker ...) ...)
> 
> True.  I think the original justification for allowing lambda expressions
> there was to look like the lambda calculus (hence the LAMBDA keyword).  To
> really understand how this came about, you have to take a look at early
> Lisps, which used M-Expressions rather than S-Expressions.  Common Lisp
> retains lots of historical baggage left over from those days.

Alas.

E.
From: Paul D. Lathrop
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Xns925BC2CE75A7Apdlathrocharterminet@216.168.3.40>
Barry Margolin <······@genuity.net> wrote in
······················@paloalto-snr1.gtei.net: 

> True.  I think the original justification for allowing lambda
> expressions there was to look like the lambda calculus (hence the
> LAMBDA keyword).  To really understand how this came about, you have
> to take a look at early Lisps, which used M-Expressions rather than
> S-Expressions.  Common Lisp retains lots of historical baggage left
> over from those days. 
> 

True. The original reason for allowing lambda expressions comes from 
Lisp's relationship to the lambda calculus. This isn't something I 
would IMHO call "baggage", necessarily, as it is this relationship 
which makes much of the power Lisp possible. Remember that the 
revolutionary thought that made Lisp so unique was the idea that 
functions and data are fundamentally identical.

Regards,
Paul D. Lathrop
From: Daniel Pittman
Subject: M-Expressions and early Lisp (was Re: Lisp's unique feature: compiler available at run-time)
Date: 
Message-ID: <87heigslzy.fsf_-_@enki.rimspace.net>
On Tue, 30 Jul 2002, Barry Margolin wrote:
> In article <····················@k-137-79-50-101.jpl.nasa.gov>,
> Erann Gat <···@jpl.nasa.gov> wrote:
>>One could use the same argument to show that ((lambda ...) ...) is not
>>needed.  After all, you can simply write (funcall (lambda ...) ...)
> 
> True.
> 
>>In fact, you could also argue that (<symbol> ...) is not needed
>>because you can write (funcall #'<symbol> ...).
> 
> But then you run into an infinite regress, since that expression has a
> symbol in the car, and would have to be rewritten. You have to have a
> base case for the recursion to stop.
> 
>>It seems to me that the arguments for (<symbol> ...) and ((lambda ...)
>>...) are purely aesthetic, and that the same aesthetic argues for the
>>legality of ((function-maker ...) ...) rather than (funcall
>>(function-maker ...) ...)
> 
> True. I think the original justification for allowing lambda
> expressions there was to look like the lambda calculus (hence the
> LAMBDA keyword). To really understand how this came about, you have to
> take a look at early Lisps, which used M-Expressions rather than
> S-Expressions. Common Lisp retains lots of historical baggage left
> over from those days.

You made me curious, now, to learn more about these early Lisps and the
M-Expression syntax and structure.

So, do you have any pointers to information on them? Ideally stuff
that's on the Internet in a free or close to free form, but books would
also be of interest. :)

Thanks,
        Daniel

-- 
It [Australia] has more things that will kill you than anywhere else.
Of the world's ten most poisonous snakes, all are Australian.
        -- Bill Bryson, _In a Sunburned Country_
From: Markus Kliegl
Subject: Re: M-Expressions and early Lisp (was Re: Lisp's unique feature: compiler available at run-time)
Date: 
Message-ID: <87bs8od1fr.fsf@esoteric.holy-unicorn>
Daniel Pittman <······@rimspace.net> writes:

> On Tue, 30 Jul 2002, Barry Margolin wrote:
> > In article <····················@k-137-79-50-101.jpl.nasa.gov>,
> > Erann Gat <···@jpl.nasa.gov> wrote:
> >>One could use the same argument to show that ((lambda ...) ...) is not
> >>needed.  After all, you can simply write (funcall (lambda ...) ...)
> > 
> > True.
> > 
> >>In fact, you could also argue that (<symbol> ...) is not needed
> >>because you can write (funcall #'<symbol> ...).
> > 
> > But then you run into an infinite regress, since that expression has a
> > symbol in the car, and would have to be rewritten. You have to have a
> > base case for the recursion to stop.
> > 
> >>It seems to me that the arguments for (<symbol> ...) and ((lambda ...)
> >>...) are purely aesthetic, and that the same aesthetic argues for the
> >>legality of ((function-maker ...) ...) rather than (funcall
> >>(function-maker ...) ...)
> > 
> > True. I think the original justification for allowing lambda
> > expressions there was to look like the lambda calculus (hence the
> > LAMBDA keyword). To really understand how this came about, you have to
> > take a look at early Lisps, which used M-Expressions rather than
> > S-Expressions. Common Lisp retains lots of historical baggage left
> > over from those days.
> 
> You made me curious, now, to learn more about these early Lisps and the
> M-Expression syntax and structure.
> 
> So, do you have any pointers to information on them? Ideally stuff
> that's on the Internet in a free or close to free form, but books would
> also be of interest. :)
> 
> Thanks,
>         Daniel
> 

http://citeseer.nj.nec.com/mccarthy60recursive.html

Markus
From: Jeff Dalton
Subject: Re: M-Expressions and early Lisp (was Re: Lisp's unique feature:  compiler available at run-time)
Date: 
Message-ID: <fx4bs5nn1n2.fsf@todday.aiai.ed.ac.uk>
Daniel Pittman <······@rimspace.net> writes:

> > True. I think the original justification for allowing lambda
> > expressions there was to look like the lambda calculus (hence the
> > LAMBDA keyword). To really understand how this came about, you have to
> > take a look at early Lisps, which used M-Expressions rather than
> > S-Expressions. Common Lisp retains lots of historical baggage left
> > over from those days.
> 
> You made me curious, now, to learn more about these early Lisps and the
> M-Expression syntax and structure.
> 
> So, do you have any pointers to information on them? Ideally stuff
> that's on the Internet in a free or close to free form, but books would
> also be of interest. :)

The Lisp 1.5 book.  I'm pretty sure that's still in print.

John Allen's Lisp book.  Probably not in print.

The ALU website used to have some papers on Lisp history
that may be relevant.

The Steele and Gabriel "Evolution of Lisp" paper probably
has something.  (It should be avail via the ALU site, if
that site still exists.)

The syntax history goes something like this:

The list / S-expr notation for Lisp was not originally meant to be
used for programming.  Instead there were M-exprs which looked a bit
more like an ordinary programming language.  A translation from M-exprs
to S-exprs was defined so that Lisp programs could be represented
using a data structure that could be manipulated in Lisp.  That made
it possible to write a "universal function" (interpreter) for Lisp in
Lisp.  McCarthy wrote it in the M-notation and, it seems, thought of
it as having chiefly theoretical interest.  Steve Russell then noticed
that it could be "hand compiled" to give an actual interpreter, and
we've been using the S-expr notation ever since.

M-exprs are still around here and there.  McCarthy has used a version
of the notation in lecture notes, John Allen's Anatomy of Lisp uses
M-exprs, and another textbook whose name I've forgotten used M-exprs
and even provided a reader for them so that students could use them to
write programs.

Anyway, here's an example:

   mapcar[fn;l] = 
     [null[l] -> ();
         t    -> cons[fn[car[l]]; mapcar[fn; cdr[l]]]]

";"s and "[...]"s are very common.  f[a;b] is a two-arg fn call.

The [p1 -> e1; p2 -> e2; ...; pn -> en] is a conditional expression.
It can be translated directly into COND, which is where COND gets its
somewhat odd syntax.
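
Rendered as S-expressions (a sketch, using MY-MAPCAR so as not to clash
with the standard CL:MAPCAR), the example above becomes:

   (defun my-mapcar (fn l)
     (cond ((null l) '())
           (t (cons (funcall fn (car l))
                    (my-mapcar fn (cdr l))))))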

Interestingly enough, Lisp 1.5 used failure continuations in
some cases.  E.g.:

   sassoc[x;y;u]    :   SUBR    functional

      The Function sassoc searches y, which is a list of dotted
   pairs, for a pair whose first element that is x [sic].  If
   such a pair is found, the value of sassoc is this pair.
   Otherwise, the function u of no arguments is taken as the
   value of sassoc.

      sassoc[x;y;u] = [null[y] -> u[];
                       eq[caar[y];x] -> car[y];
                       T -> sassoc[x;cdr[y];u]]
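
   In S-expression form (again just a sketch), with u as the failure thunk:

      (defun sassoc (x y u)
        (cond ((null y) (funcall u))
              ((eq (caar y) x) (car y))
              (t (sassoc x (cdr y) u))))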

Some BNF:  The version in the Lisp 1.5 book is very simple.  McCarthy
has used a more elaborate version in lecture notes.  He uses more
operators, rather than making virtually everything a function call,
for instance.  Anyway, I'll describe the original since I don't have a
copy of the lecture notes here.

In practice, ordinary arithmetic and logical operators were used,
but they were not officially part of the syntax.  So ... here it is:

  <letter> ::= a | b | c | ... | z
  <identifier> ::= <letter><id part>
  <id part> ::= <empty> | <letter><id part> | <number><id part>

  <form> ::= <constant>
          |  <variable>
          |  <function>[<argument>; ...; <argument>]
          |  [<form> -> <form>; ...; <form> -> <form>]
  <constant> ::= <S-expression>
  <variable> ::= <identifier>
  <argument> ::= <form>

  <function> ::= <identifier>
              |  \[<var list>; <form>]
              |  label[<identifier>;<function>]

  <var list> ::= [<variable>; ...; <variable>]

The "\" is a lambda.

Function definitions were either

  <identifier> = <function>

or

  <identifier><var list> = <form>

The [p1 -> e1; p2 -> e2; ...; pn -> en] is (again) a conditional expression.

-- jd
From: Kaz Kylheku
Subject: Re: M-Expressions and early Lisp (was Re: Lisp's unique feature:  compiler available at run-time)
Date: 
Message-ID: <cf333042.0210211506.1aa2dc2b@posting.google.com>
Jeff Dalton <····@todday.aiai.ed.ac.uk> wrote in message news:<···············@todday.aiai.ed.ac.uk>... 
> Anyway, here's an example:
> 
>    mapcar[fn;l] = 
>      [null[l] -> ();
>          t    -> cons[fn[car[l]]; mapcar[fn; cdr[l]]]]
> 
> ";"s and "[...]"s are very common.  f[a;b] is a two-arg fn call.
> 
> The [p1 -> e1; p2 -> e2; ...; pn -> en] is a conditional expression.

Aha! That explains what inspired some of the idiotic, cryptic syntax
in some functional languages. Stolen from Lisp's garbage can. :)
From: cr88192
Subject: Re: M-Expressions and early Lisp (was Re: Lisp's unique feature:  compiler available at run-time)
Date: 
Message-ID: <ur9bjojmm8osad@corp.supernews.com>
Kaz Kylheku wrote:

> Jeff Dalton <····@todday.aiai.ed.ac.uk> wrote in message
> news:<···············@todday.aiai.ed.ac.uk>...
>> Anyway, here's an example:
>> 
>>    mapcar[fn;l] =
>>      [null[l] -> ();
>>          t    -> cons[fn[car[l]]; mapcar[fn; cdr[l]]]]
>> 
>> ";"s and "[...]"s are very common.  f[a;b] is a two-arg fn call.
>> 
>> The [p1 -> e1; p2 -> e2; ...; pn -> en] is a conditional expression.
> 
> Aha! That explains what inspired some of the idiotic, cryptic syntax
> in some functional languages. Stolen from Lisp's garbage can. :)

harsh, but I somewhat agree...

-- 
<cr88192[at]hotmail[dot]com>
<http://bgb1.hypermart.net/>
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai8s4r$n3m$1@luna.vcn.bc.ca>
In article <···············@sindri.juniper.net>, Joel Ray Holveck wrote:
> 1. I know that a lambda as the car is valid.
> 
> 2. I know that a symbol as the car is valid.
> 
> 3. I know that a function as the car is invalid.
> 
> 4. I don't know why the designers of CL instituted point 3.

Because if you get rid of 3, thus allowing functions, people are going
to want to use, in the CAR, variables which are bound to functions. And that
creates an ambiguity in 2 that can only be resolved by some kludgy
evaluation rule.

  (defun x () ...)

  (defun call-the-function (x arg)
    (x arg)) ;; what should this do? 

  (call-the-function #'print 42)

So you see, it really is a namespace issue. The funcall operator is
needed because sometimes one wants to pull a function from a symbol's
value binding, rather than from its function binding, and that
won't happen when the symbol sits in the car position. So we put
the symbol ``funcall'' in that position, and move the symbol over
to the right, so to speak.
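
A sketch of how the explicit funcall resolves it:

  (defun call-the-function (x arg)
    ;; (x arg) would call the *function* binding of X; FUNCALL looks up
    ;; the function object held in the parameter X instead.
    (funcall x arg))

  (call-the-function #'print 42)   ;; prints 42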
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <VLq09.181699$uw.98692@rwcrnsc51.ops.asp.att.net>
"Matthias Blume" <········@shimizu-blume.com> wrote in message
···················@hana.shimizu-blume.com...

> I know of no reliable way (short of having control over the entire
> program -- or, at least all of the code that occurs "between" a
> macro's definition and the corresponding use) to reliably avoid this
> situation.  (A hygienic macro expander works by effectively
> maintaining global control over the entire program, rewriting even
> pieces of code that are not part of any macro instantiations.)  Being
> "careful" with or "good at" writing macros alone definitely does not
> solve this problem.
>
>    Example:
>
>       int num_errors = 0;
>       #define error(s) (num_errors++, fputs (s, stderr))
>       ...
>       double f (double x, double y, double num_error /* numerical error */)
>       {
>          ...
>          if (<some abnormal condition>)
>             error("abnormal condition occurred");
>          ...
>       }
>
>    Notice how the call of "error" within f will fail to increment the
>    global error counter while instead screwing with a totally unrelated
>    variable...

For this particular case, you could always write: ::num_errors++. If you
only allowed macros to access variables that are passed in or that live in
explicitly referenced scopes, then you wouldn't have this problem.


Shayne Wissler
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3n0sd3ytt.fsf@hana.shimizu-blume.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> "Matthias Blume" <········@shimizu-blume.com> wrote in message
> ···················@hana.shimizu-blume.com...
> 
> > I know of no reliable way (short of having control over the entire
> > program -- or, at least all of the code that occurs "between" a
> > macro's definition and the corresponding use) to reliably avoid this
> > situation.  (A hygienic macro expander works by effectively
> > maintaining global control over the entire program, rewriting even
> > pieces of code that are not part of any macro instantiations.)  Being
> > "careful" with or "good at" writing macros alone definitely does not
> > solve this problem.
> >
> >    Example:
> >
> >       int num_errors = 0;
> >       #define error(s) (num_errors++, fputs (s, stderr))
> >       ...
> >       double f (double x, double y, double num_error /* numerical error */)
> >       {
> >          ...
> >          if (<some abnormal condition>)
> >             error("abnormal condition occurred");
> >          ...
> >       }
> >
> >    Notice how the call of "error" within f will fail to increment the
> >    global error counter while instead screwing with a totally unrelated
> >    variable...
> 
> For this particular case, you could always write: ::num_errors++.

... which would then be illegal C.

(By the way, I meant to spell the two variables the same.)
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <9LC09.139650$_51.93522@rwcrnsc52.ops.asp.att.net>
"Matthias Blume" <········@shimizu-blume.com> wrote in message
···················@hana.shimizu-blume.com...

> > For this particular case, you could always write: ::num_errors++.
>
> ... which would then be illegal C.

... which doesn't change the overall point.
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3lm7u3p2j.fsf@hana.shimizu-blume.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> "Matthias Blume" <········@shimizu-blume.com> wrote in message
> ···················@hana.shimizu-blume.com...
> 
> > > For this particular case, you could always write: ::num_errors++.
> >
> > ... which would then be illegal C.
> 
> ... which doesn't change the overall point.

It does.  Not all languages offer unique names for all the identifiers
that a macro might insert freely into its expansion.  This is
particularly true for macro systems where macros can be "local" and
can refer to lexically scoped variables at the sites of their
definitions.  The Scheme macro system is an example for this (although
one might say that this "feature" does not buy you all that much in
this particular case).
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Ixo19.180179$Wt3.135795@rwcrnsc53>
"Matthias Blume" <········@shimizu-blume.com> wrote in message
···················@hana.shimizu-blume.com...
> "Shayne Wissler" <·········@yahoo.com> writes:
>
> > "Matthias Blume" <········@shimizu-blume.com> wrote in message
> > ···················@hana.shimizu-blume.com...
> >
> > > > For this particular case, you could always write: ::num_errors++.
> > >
> > > ... which would then be illegal C.
> >
> > ... which doesn't change the overall point.
>
> It does.  Not all languages offer unique names for all the identifiers
> that a macro might insert freely into its expansion.  This is
> particularly true for macro systems where macros can be "local" and
> can refer to lexically scoped variables at the sites of their
> definitions.  The Scheme macro system is an example for this (although
> one might say that this "feature" does not buy you all that much in
> this particular case).

And this relates to the statement "which would then be illegal C" how?
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3k7nd315n.fsf@hana.shimizu-blume.com>
"Shayne Wissler" <·········@yahoo.com> writes:

> "Matthias Blume" <········@shimizu-blume.com> wrote in message
> ···················@hana.shimizu-blume.com...
> > "Shayne Wissler" <·········@yahoo.com> writes:
> >
> > > "Matthias Blume" <········@shimizu-blume.com> wrote in message
> > > ···················@hana.shimizu-blume.com...
> > >
> > > > > For this particular case, you could always write: ::num_errors++.
> > > >
> > > > ... which would then be illegal C.
> > >
> > > ... which doesn't change the overall point.
> >
> > It does.  Not all languages offer unique names for all the identifiers
> > that a macro might insert freely into its expansion.  This is
> > particularly true for macro systems where macros can be "local" and
> > can refer to lexically scoped variables at the sites of their
> > definitions.  The Scheme macro system is an example for this (although
> > one might say that this "feature" does not buy you all that much in
> > this particular case).
> 
> And this relates to the statement "which would then be illegal C" how?

C++ has a mechanism for uniquely naming global identifiers by
prefixing them with ::.  You have used that mechanism -- a mechanism
that is not available in C (which is why it renders the code illegal
C).
From: Shayne Wissler
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <IKx19.185331$Wt3.138900@rwcrnsc53>
"Matthias Blume" <········@shimizu-blume.com> wrote in message
···················@hana.shimizu-blume.com...
> "Shayne Wissler" <·········@yahoo.com> writes:
>
> > "Matthias Blume" <········@shimizu-blume.com> wrote in message
> > ···················@hana.shimizu-blume.com...
> > > "Shayne Wissler" <·········@yahoo.com> writes:
> > >
> > > > "Matthias Blume" <········@shimizu-blume.com> wrote in message
> > > > ···················@hana.shimizu-blume.com...
> > > >
> > > > > > For this particular case, you could always write:
::num_errors++.
> > > > >
> > > > > ... which would then be illegal C.
> > > >
> > > > ... which doesn't change the overall point.
> > >
> > > It does.  Not all languages offer unique names for all the identifiers
> > > that a macro might insert freely into its expansion.  This is
> > > particularly true for macro systems where macros can be "local" and
> > > can refer to lexically scoped variables at the sites of their
> > > definitions.  The Scheme macro system is an example for this (although
> > > one might say that this "feature" does not buy you all that much in
> > > this particular case).
> >
> > And this relates to the statement "which would then be illegal C" how?
>
> C++ has a mechanism for uniquely naming global identifiers by
> prefixing them with ::.  You have used that mechanism -- a mechanism
> that is not available in C (which is why it renders the code illegal
> C).

Unless I misread you, your original point was about macros in general. You
asserted that X was not possible; I provided a counter-example, and then you
started quibbling about my code being C++ and not C.
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai5ak6$hvf$4@luna.vcn.bc.ca>
In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> "Shayne Wissler" <·········@yahoo.com> writes:
> 
>> "Matthias Blume" <········@shimizu-blume.com> wrote in message
>> ···················@hana.shimizu-blume.com...
>> 
>> > > For this particular case, you could always write: ::num_errors++.
>> >
>> > ... which would then be illegal C.
>> 
>> ... which doesn't change the overall point.
> 
> It does.  Not all languages offer unique names for all the identifiers
> that a macro might insert freely into its expansion.  This is
> particularly true for macro systems where macros can be "local" and
> can refer to lexically scoped variables at the sites of their
> definitions.  The Scheme macro system is an example for this (although
> one might say that this "feature" does not buy you all that much in
> this particular case).

Yeah, but in Scheme, lexical scope is the only way to contain
symbols so they don't clash. If you want static variables local to
a module, lexical scope is all you have. So of course you want the
macro system to easily work with this. But forcing more complexity
into the macro defining system looks like a misplaced effort.

In Lisp we have a package system, so it's not that important to capture
lexicals if a group of functions or macros wants to have some global variables
private to their module.
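
For example, a rough sketch of that arrangement (all names made up; the only
point is that *num-errors* stays unexported):

  (defpackage #:error-counting
    (:use #:common-lisp)
    (:export #:error-out))       ; *num-errors* is deliberately not exported

  (in-package #:error-counting)

  (defvar *num-errors* 0)        ; private to this module

  (defmacro error-out (message)
    `(progn (incf *num-errors*)
            (format *error-output* "~A~%" ,message)))

A caller in another package who binds a lexical variable spelled num-errors
binds a symbol in its own package, so it cannot collide with
ERROR-COUNTING::*NUM-ERRORS*.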
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3ado930p5.fsf@hana.shimizu-blume.com>
Kaz Kylheku <···@ashi.footprints.net> writes:

> In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> > "Shayne Wissler" <·········@yahoo.com> writes:
> > 
> >> "Matthias Blume" <········@shimizu-blume.com> wrote in message
> >> ···················@hana.shimizu-blume.com...
> >> 
> >> > > For this particular case, you could always write: ::num_errors++.
> >> >
> >> > ... which would then be illegal C.
> >> 
> >> ... which doesn't change the overall point.
> > 
> > It does.  Not all languages offer unique names for all the identifiers
> > that a macro might insert freely into its expansion.  This is
> > particularly true for macro systems where macros can be "local" and
> > can refer to lexically scoped variables at the sites of their
> > definitions.  The Scheme macro system is an example for this (although
> > one might say that this "feature" does not buy you all that much in
> > this particular case).
> 
> Yeah, but in Scheme, lexical scope is the only way to contain
> symbols so they don't clash. If you want static variables local to
> a module, lexical scope is all you have. So of course you want the
> macro system to easily work with this. But forcing more complexity
> into the macro defining system looks like a misplaced effort.

For the last time: I did not try to defend hygienic macros, I simply
explained what they are and what the perceived (by some) problems are
that they are supposed to solve.

> In Lisp we have a package system, so it's not that important to
> capture lexicals if a group of functions or macros wants to have
> some global variables private to their module.

That's right.  And besides the point.
From: Christopher Barber
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <pso3cu2d2sl.fsf@bandit.curl.com>
Matthias Blume <········@shimizu-blume.com> writes:

> Difficult part: A free occurence of an identifier within a macro's
> expansion (when the occurence does not stem from a corresponding free
> occurence of this identifier in the macro instantiation's argument)
> can get captured by a local binding of the same identifier when the
> binding's scope contains the macro's instantiation but not the macro's
> definition.
> I know of no reliable way (short of having control over the entire
> program -- or, at least all of the code that occurs "between" a
> macro's definition and the corresponding use) to reliably avoid this
> situation.  (A hygienic macro expander works by effectively
> maintaining global control over the entire program, rewriting even
> pieces of code that are not part of any macro instantiations.)  Being
> "careful" with or "good at" writing macros alone definitely does not
> solve this problem.

It is not necessary to globally rewrite code to implement hygienic macros.
You simply need to put more information into the identifiers.

- Christopher
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3ptx63pc1.fsf@hana.shimizu-blume.com>
Christopher Barber <·······@curl.com> writes:

> It is not necessary to globally rewrite code to implement hygienic macros.
> You simply need to put more information into the identifiers.

"Simply" is an understatement, to be sure.

It probably depends on what one means by "rewriting".  I bet your
"putting more information into identifiers" amounts to rewriting when
you look at it from the right angle...

Anyway, the (IMHO) most lucid explanations of how a hygienic macro
expander works are all done in terms of a global translation (which
rewrites every part of the program).
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai5ak5$hvf$3@luna.vcn.bc.ca>
In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> Christopher Barber <·······@curl.com> writes:
> 
>> It is not necessary to globally rewrite code to implement hygienic macros.
>> You simply need to put more information into the identifiers.
> 
> "Simply" is an understatement, to be sure.
> 
> It probably depends on what one means by "rewriting".  I bet your
> "putting more information into identifiers" amounts to rewriting when
> you look at it from the right angle...
> 
> Anyway, the (IMHO) most lucid explanations of how a hygienic macro
> expander works aro all done in terms a global translation (which
> rewrites every part of the program).

Does that mean that when you are debugging, you must work with the
rewritten representation?
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m365yx30ip.fsf@hana.shimizu-blume.com>
Kaz Kylheku <···@ashi.footprints.net> writes:

> In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> > Christopher Barber <·······@curl.com> writes:
> > 
> >> It is not necessary to globally rewrite code to implement hygienic macros.
> >> You simply need to put more information into the identifiers.
> > 
> > "Simply" is an understatement, to be sure.
> > 
> > It probably depends on what one means by "rewriting".  I bet your
> > "putting more information into identifiers" amounts to rewriting when
> > you look at it from the right angle...
> > 
> > Anyway, the (IMHO) most lucid explanations of how a hygienic macro
> > expander works aro all done in terms a global translation (which
> > rewrites every part of the program).
> 
> Does that mean that when you are debugging, you must work with the
> rewritten representation?

That's purely an implementation issue.  An implementation of hygienic
macros that I did a number of years ago would preserve the original
source names, so that's what you would see (among other things) in
error messages and during debugging.  Other implementations, I am
sure, are similar.

Anyway, if this is supposed to be yet another attack on the usefulness
(or absence thereof) of hygienic macros, you are talking to the wrong
person.  I personally don't even *like* macros (hygienic or not) all
that much. In my daily programming I don't use macros at all.  (I am
also not using Lisp or Scheme, in case you wonder.)
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai5ak3$hvf$2@luna.vcn.bc.ca>
In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> Difficult part: A free occurence of an identifier within a macro's
> expansion (when the occurence does not stem from a corresponding free
> occurence of this identifier in the macro instantiation's argument)
> can get captured by a local binding of the same identifier when the
> binding's scope contains the macro's instantiation but not the macro's
> definition.
> I know of no reliable way (short of having control over the entire
> program -- or, at least all of the code that occurs "between" a
> macro's definition and the corresponding use) to reliably avoid this
> situation.  (A hygienic macro expander works by effectively
> maintaining global control over the entire program, rewriting even
> pieces of code that are not part of any macro instantiations.)  Being
> "careful" with or "good at" writing macros alone definitely does not
> solve this problem.
> 
>    Example:
> 
>       int num_errors = 0;
>       #define error(s) (num_errors++; fputs (s, stderr))
>       ...
>       double f (double x, double y, double num_error /* numerical error */)
>       {
>          ...
>          if (<some abnormal condition>)
>             error("abnormal condition occured");
>          ...
>       }
> 
>    Notice how the call of "error" within f will fail to increment the
>    global error counter while instead screwing with a totally unrelated
>    variable...

However, in C++, you would simply apply scope resolution, and write
::num_errors++.  So being careful would indeed solve your problem in that
language. In C, we could write a function and call it: 

  void increment_errors() { num_errors++; }
  #define error(s) ( increment_errors(), fputs(s, stderr) )

In C99 and C++, we could make it an inline function.

In idiomatic Lisp, you would use the symbol *num-errors*, which would be
special, and which you would not try to use as a lexical variable; your macro
would INCF that special variable (and of course the inline function method
is available to you as well).  You could override the special variable with a
local dynamic binding, but in that case you would *want* the macro to operate
on that temporary binding.
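
As a rough sketch of that idiom (names made up), with a dynamic rebinding
that the macro is meant to see:

  (defvar *num-errors* 0)

  (defmacro report-error (message)
    `(progn (incf *num-errors*)          ; INCFs whatever binding is current
            (format *error-output* "~A~%" ,message)))

  (defun f (x)
    (let ((*num-errors* 0))              ; temporary dynamic binding
      (when (oddp x)                     ; stand-in for the abnormal test
        (report-error "abnormal condition occurred"))
      *num-errors*))                     ; errors from this call only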

It's not clear what value would be provided here by some hygienic system. What
*num-errors* would the macro possibly refer to other than the special one?
Do we really need macro definition sites to have their own lexical scopes
whose visibility is somehow carried over to the expansion sites? 
To me, a macro is defined in a separate world; I don't want it to be surrounded
by baggage which is somehow reachable in the expansion. I want the expansion
to know nothing about the macro, though I'm okay with the expansion using
some globals in the macro's package.

Okay, suppose that instead of a special variable,  you want to refer to a
num-errors slot within some struct or class instance. In that case, you want to
ensure that the instance exists in the scope of the expansion, so you might
want to define some with-* macro which ensures that. That with-* macro
will provide the mutator as a local macro via macrolet:

  (with-temporary-local-error-context
    ...  (error ...) ;; macrolet: increments local error count, not global one
    ...)

which ultimately expands to something like:

  (let ((#:gensym-25 (make-error-context))) 
    ... (progn (incf (slot-value #:gensym-25 'error-count))
	       (format *error-output* ...))
    ...)
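
For concreteness, a rough sketch of a macro that could produce that kind of
expansion (all names hypothetical; the local operator is called RECORD-ERROR
here rather than ERROR so as not to redefine the standard CL:ERROR):

  (defstruct error-context (error-count 0))

  (defmacro with-temporary-local-error-context (&body body)
    (let ((ctx (gensym "CTX")))
      `(let ((,ctx (make-error-context)))
         (macrolet ((record-error (message)
                      `(progn
                         (incf (error-context-error-count ,',ctx))
                         (format *error-output* "~A~%" ,message))))
           ,@body))))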

So all in all, I don't see how there are no reliable ways; you just
have to identify what it means to do the reliable thing, and teach your
little macro robot programmers to do that.
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3fzy130ta.fsf@hana.shimizu-blume.com>
Kaz Kylheku <···@ashi.footprints.net> writes:

> In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> > Difficult part: A free occurence of an identifier within a macro's
> > expansion (when the occurence does not stem from a corresponding free
> > occurence of this identifier in the macro instantiation's argument)
> > can get captured by a local binding of the same identifier when the
> > binding's scope contains the macro's instantiation but not the macro's
> > definition.
> > I know of no reliable way (short of having control over the entire
> > program -- or, at least all of the code that occurs "between" a
> > macro's definition and the corresponding use) to reliably avoid this
> > situation.  (A hygienic macro expander works by effectively
> > maintaining global control over the entire program, rewriting even
> > pieces of code that are not part of any macro instantiations.)  Being
> > "careful" with or "good at" writing macros alone definitely does not
> > solve this problem.
> > 
> >    Example:
> > 
> >       int num_errors = 0;
> >       #define error(s) (num_errors++; fputs (s, stderr))
> >       ...
> >       double f (double x, double y, double num_error /* numerical error */)
> >       {
> >          ...
> >          if (<some abnormal condition>)
> >             error("abnormal condition occured");
> >          ...
> >       }
> > 
> >    Notice how the call of "error" within f will fail to increment the
> >    global error counter while instead screwing with a totally unrelated
> >    variable...
> 
> However, in C++, you would simply apply scope resolution, and write
> ::num_errors++.  So being careful would indeed solve your problem in that
> language.

Yeah, by going to a different language, all problems can be solved.

> In C, we could write a function and call it: 
> 
>   void increment_errors() { num_errors++; }
>   #define error(s) ( increment_errors(), fputs(s, stderr); )

This is still full of potential name clashes (increment_errors, fputs,
stderr).

> In C99 and C++, we could make it an inline function.

...which is not a macro at all, so what's your point?

> In idiomatic Lisp, you would use the symbol *num-errors*, which would be
> special, and which you would not try to use as a lexical variable;

Right.  By using *global* conventions.  Isn't that what I said?

> your macro
> would INCF that special variable (and of course the inline function method
> is available to you as well).  You could override the special variable with a
> local dynamic binding, but in that case you would *want* the macro to operate
> on that temporary binding.
> 
> It's not clear what value would be provided here by some hygienic system.

In my original post I already said that I agree with Lispers who say
they have on use for hygienic macros, and that the cases they solve
simply don't come up.  I was simply offering an answer to the question
what "hygienic" means.  Please, stop putting up an argument of whether
this particular meaning has any practical value to you or anyone else.
I was very careful in being as neutral as possible about that.

> So all in all, I don't see how there are no reliable ways; you just
> have to identify what it means to do the reliable thing, and teach your
> little macro robot programmers to do that.

I said "no reliable ways short of ...".  You have in no way
contradicted what I said.
From: Matthias Blume
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m31y9l30f9.fsf@hana.shimizu-blume.com>
Matthias Blume <········@shimizu-blume.com> writes:

> Kaz Kylheku <···@ashi.footprints.net> writes:
> 
> > In article <··············@hana.shimizu-blume.com>, Matthias Blume wrote:
> > > Difficult part: A free occurence of an identifier within a macro's
> > > expansion (when the occurence does not stem from a corresponding free
> > > occurence of this identifier in the macro instantiation's argument)
> > > can get captured by a local binding of the same identifier when the
> > > binding's scope contains the macro's instantiation but not the macro's
> > > definition.
> > > I know of no reliable way (short of having control over the entire
> > > program -- or, at least all of the code that occurs "between" a
> > > macro's definition and the corresponding use) to reliably avoid this
> > > situation.  (A hygienic macro expander works by effectively
> > > maintaining global control over the entire program, rewriting even
> > > pieces of code that are not part of any macro instantiations.)  Being
> > > "careful" with or "good at" writing macros alone definitely does not
> > > solve this problem.
> > > 
> > >    Example:
> > > 
> > >       int num_errors = 0;
> > >       #define error(s) (num_errors++; fputs (s, stderr))
> > >       ...
> > >       double f (double x, double y, double num_error /* numerical error */)
> > >       {
> > >          ...
> > >          if (<some abnormal condition>)
> > >             error("abnormal condition occured");
> > >          ...
> > >       }
> > > 
> > >    Notice how the call of "error" within f will fail to increment the
> > >    global error counter while instead screwing with a totally unrelated
> > >    variable...
> > 
> > However, in C++, you would simply apply scope resolution, and write
> > ::num_errors++.  So being careful would indeed solve your problem in that
> > language.
> 
> Yeah, by going to a different language, all problems can be solved.
> 
> > In C, we could write a function and call it: 
> > 
> >   void increment_errors() { num_errors++; }
> >   #define error(s) ( increment_errors(), fputs(s, stderr); )
> 
> This still is full of potential name clashes (increment_error, fputs,
> stderr).
> 
> > In C99 and C++, we could make it an inline function.
> 
> ...which is not a macro at all, so what's your point?
> 
> > In idiomatic Lisp, you would use the symbol *num-errors*, which would be
> > special, and which you would not try to use as a lexical variable;
> 
> Right.  By using *global* conventions.  Isn't that what I said?
> 
> > your macro
> > would INCF that special variable (and of course the inline function method
> > is available to you as well).  You could override the special variable with a
> > local dynamic binding, but in that case you would *want* the macro to operate
> > on that temporary binding.
> > 
> > It's not clear what value would be provided here by some hygienic system.
> 
> In my original post I already said that I agree with Lispers who say
> they have on use for hygienic macros, and that the cases they solve
            ^^
            no

> simply don't come up.  I was simply offering an aswer to the question
> what "hygienic" means.  Please, stop putting up an argument of whether
> this particular meaning has any practical value to you or anyone else.
> I was very careful in being as neutral as possible about that.
> 
> > So all in all, I don't see how there are no reliable ways; you just
> > have to identify what it means to do the reliable thing, and teach your
> > little macro robot programmers to do that.
> 
> I said "no reliable ways short of ...".  You have in no way
> contradicted what I said.
From: Erik Naggum
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3236789642671151@naggum.net>
* Paul D. Lathrop
| Sorry to sound ignorant, but what does non-hygienic mean in this context?

  "Non-hygienic" is primarily a socioideological device used to produce a
  distinction between the clean guys and the dirty guys.  In simpler terms, it
  is an us-vs-them, good-vs-evil thing.  Practically, it means that of the many
  places you can have cleanliness, one group of people think that their place
  is so important that another group of people are "non-hygienic" since they do
  not value that particular place equally highly or with the same passion.

  Curiously, this all started with a technical issue far removed from macros.
  In some languages, having one namespace for everything is considered clean,
  much like having every object in your entire kitchen in full view on the
  counters or the floor all the time, including keeping your food in the open,
  and then they go to a lot of trouble to ensure that this is "hygienic", with
  ritual and dedication.  Then people who use this kitchen naturally become
  slightly neurotic about hygiene.  Some think this is nuts and look around for
  a different solution and use namespaces for different kinds of things --
  functions, variables, classes and types, go tags, catch tags, etc, and invent
  packages to stuff things into so both the language and the programmer can
  keep the hygiene in one confined compartment at a time instead of the
  kitchen as a whole.  Hygiene is now much less of an issue, as it is built in
  to how you do things without having to double-check.  You would purposefully
  have to make a dirty mess to break hygiene in this multiple namespace model,
  which would be almost as much work as keeping the one-kitchen-with-everything
  clean.

  The really odd thing about making things too simple, like a kitchen without
  any cupboards or storage space at all, is that other things that should be
  clean and simple become sore spots -- bring something dirty into this too
  simple kitchen, and voilà (as they say in kitchens) you are non-hygienic and
  cleaning up becomes your obsession.  Ironically, cleanliness now means work
  and effort, something you have to do to leave a "natural form".  It is not
  just there, effortlessly achieved by routine, you have to keep it in mind
  with constant vigilance.  This presence of mind and the conscious awareness
  of an issue changes your entire outlook on it.  There is even a term for
  this: mysophobia, the abnormal fear of uncleanliness.  If you consider this
  state normal, it becomes a personal affront to you when somebody else does
  not "respect" your desire for cleanliness, and you spend some of your energy
  denouncing those who are "unclean".  Hence, you invoke "non-hygienic" to
  refer to those other people who are not like yourself.  Instead of causing
  other people to be more concerned with hygiene, they may find the label quite
  disconcerting, as it is easily perceived as a personal attack, but we have to
  understand that those who want to live in a too-simple and elegant kitchen (at
  least until they fill it with lots of food and useful things) actually try to
  defend themselves from what they perceive as a threat because foreign
  elements may actually "pollute" their once pristine namespace.

  So Scheme has taken cleanliness and simplicity too far and has ended up
  with mysophobia, and its adherents consequently excoriate Common Lisp for
  not being clean in exactly the same places they have scrubbed Scheme
  clean.  But this is an
  irrational fear, as it does not apply to Common Lisp.  Anyone who uses "non-
  hygienic" is, unfortunately or not, seen as judging Common Lisp in Scheme
  terms, which is one of the many ways to go _really_ wrong in any judgment.

  Common Lisp is like a house that looks neat and clean yet holds enormous
  amounts of stuff in cupboards and closets, a language in which you are not
  afraid to call a variable "list" just because there is a function and a type
  called "list".  To those who want everything in one, clean namespace, this is
  anathema!  They cannot understand that the hygiene problem has been solved at
  a very different language level.  After all, you have to work with your food
  on the same counter in the same kitchen, so the similarity overwhelms the
  slightly neurotic mind and the fear of all things non-hygienic takes control.

  Specifically, it is not "non-hygienic" in Common Lisp to let a macro both use
  bindings from and introduce new ones in the caller's environments any more
  than it is to bring something new into your kitchen if you have everything
  else stored away properly and you can wrap food in protective covers.  If you
  want more hygiene in Common Lisp, you have the tools you need to wrap it up
  so it becomes harmless and unharmed.  Since the very concept of wrapping
  things up is so much harder to accomplish in Scheme, this part of Common Lisp
  tends to be ignored by Scheme adherents who just get an emotional response
  from seeing the possibility of some dirty foreign object in their kitchen.

  Sadly, Scheme adherents have historically had a very strong need to show how
  clean they are and they have tried to rub Common Lisp users' collective nose
  in whatever they thought they were cleaner than.  Consequently, mere use of
  the term "non-hygienic" only prolongs the negative connotations of Scheme's
  neurotic cleanliness and the constant claims that Common Lisp does not live
  up to their requirements.  Frankly, it gets on a lot of Common Lispers' nerves.

  As long as you have only one namespace, it _will_ get polluted, and even if
  you name your variables carefully in any binding (such as "lst" instead of
  "list"), you will sooner or later screw up, so when variables and functions
  share namespaces or when you define a huge number of local functions that
  override the global function definition, such as Schemers are also wont to
  do, you need protection from the pollution that the language _invites_.  So
  while Common Lisp may be "non-hygienic" as seen from Scheme, all of Scheme is
  non-hygienic as seen from Common Lisp.  Clearly, there is very limited value
  in such labels both ways.  Common Lispers just prefer to shove the neurotic
  Schemers out of the kitchen altogether.

[ Note: This is not posted to comp.lang.scheme.  Nevertheless, I predict that
  we will see several explosions from Scheme users who somehow think that it is
  perfectly OK to label Common Lisp "non-hygienic", but not OK to label Scheme
  "non-hygienic", because their strong _beliefs_ and _fears_ get in the way.
  Also, personal attacks and other forms of hate mail from Scheme freaks should
  be mailed to me privately.  If you have to exact revenge for blaspheming your
  beliefs in cleanliness, please be so kind as to avoid a public spectacle. ]

-- 
Erik Naggum, Oslo, Norway   *****   One manslaughter is another man's laughter.

Act from reason, and failure makes you rethink and study harder.
Act from faith, and failure makes you blame someone and push harder.
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87znwc1zqb.fsf@becket.becket.net>
I don't care about the *word* hygienic.  The *point* behind the word is
to have a macro system that obeys the rules of lexical scoping--that
the "meaning" of an identifier is always found by looking at its
lexical context.

Traditional Lisp macro systems don't preserve the rules of lexical
scoping.  It's no big mystery why; the importance of lexical scoping
hadn't really been seen when the basics of the Lisp macro system were
being developed.

Then combine that with the not-inconsiderable difficulties of making a
lexically-scoped macro system that has all the functionality of the
traditional Lisp system, and it's clear why Lisp (quite reasonably)
continues to use the "non-hygienic" sort of macro system.

Algol-derived languages, which have basically always had lexical
scoping, often have primitive macro systems, which do *not* obey
lexical scoping, much to the annoyance of programmers in those
languages.  Several language features in GCC arose as attempts to deal
with this problem, to enable programmers to write "hygienic" macros.

It's probably not quite right to refer to the macro system as hygienic
or not; rather, it's the particular macro that is hygienic.  If the
expansion inserts bindings that can capture locals in the
context-of-use, then it's not hygienic.  But of course, Lisp
programmers have for many years been using facilities like gensym to
make their macros hygienic.  The goal behind a hygienic macro system is
that the macros written with it are guaranteed to be hygienic, just as
a function in lexically-scoped-lisp is guaranteed not to have the
funarg problem.
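
A standard textbook illustration of that by-hand hygiene (names made up):

  (defmacro my-or2 (a b)
    (let ((tmp (gensym)))
      `(let ((,tmp ,a))          ; temporary so A is evaluated only once
         (if ,tmp ,tmp ,b))))

With a plain symbol TMP in place of the gensym, (let ((tmp 42)) (my-or2 nil
tmp)) would expand into a LET that shadows the caller's TMP and return NIL;
with the gensym it returns 42 as intended.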

The other thing that gets lumped together with "hygienic" is
"referential transparency", which is that references in the
context-of-definition of the macro are expanded to *their* lexically
apparent bindings, and not whatever bindings happen to be in place in
the context-of-use.  As far as I know, Common Lisp does not have any
particular facilities to provide referential transparency, and instead
counsels programmers that their macros shouldn't depend on their
lexical context.
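
A small made-up illustration of that point: the expansion's free reference
to FROB is captured by whatever FROB is lexically visible at the use site.

  (defun frob (x) (* x 10))

  (defmacro frob-twice (x)
    `(frob (frob ,x)))

  (flet ((frob (x) (+ x 1)))
    (frob-twice 3))              ; => 5, via the local FROB, not 300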

So, Scheme programmers use the term "hygienic" to refer to two
separate concepts: "hygienic" proper, and referential transparency.
The macro definition system guarantees referential transparency, which
Lisp probably could, but does not (as I understand it, out of a desire
to guarantee the possibility of efficient compilation).  The Scheme
macro system aims to guarantee "hygiene", and, while the Lisp system
does not, it does provide all the tools to competent programmers to
make their macros hygienic.

Thomas
From: Software Scavenger
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <a6789134.0207302204.38ebfaac@posting.google.com>
·········@becket.net (Thomas Bushnell, BSG) wrote in message news:<··············@becket.becket.net>...

> the context-of-use.  As far as I know, Common Lisp does not have any
> particular facilities to provide referential transparency, and instead
> counsels programmers that their macros shouldn't depend on their
> lexical context.

How would it be useful for a macro expansion to refer to the lexical
context of its definition?  It seems far more useful to me for it to
be able to refer to the lexical context of the expansion.  Especially
when multiple macro expansions work together and need to communicate
with each other.
From: Paul F. Dietz
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D47D49C.A69AF81B@dls.net>
Software Scavenger wrote:

> How would it be useful for a macro expansion to refer to the lexical
> context of its definition?  It seems far more useful to me for it to
> be able to refer to the lexical context of the expansion.  Especially
> when multiple macro expansions work together and need to communicate
> with each other.

I've often wished macro environments had more heft to them.  I'd like
to see compiler environment information (such as type declarations,
optimization levels, etc.) be available in the environment -- it would
help when writing compiler macros.  I suppose you could get around this
for the type declarations by defining your own declaration form. It
would expand to a LOCALLY form + a MACROLET defining fake macros to
communicate the declaration information downward.  But it would be
cleaner if this were all builtin.
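
A rough sketch of that workaround (all names made up): a declaration-like
form expanding to LOCALLY plus a MACROLET "fake macro" that a macro further
down in the body can query through its &ENVIRONMENT argument.

  (defmacro with-declared-type ((var type) &body body)
    `(locally (declare (type ,type ,var))
       (macrolet ((%declared-type (v)
                    (if (eq v ',var) '',type ''t)))
         ,@body)))

  (defmacro maybe-fast-incr (x &environment env)
    (let ((type (macroexpand `(%declared-type ,x) env)))
      (if (equal type ''fixnum)
          `(the fixnum (1+ (the fixnum ,x)))
          `(1+ ,x))))

  (defun g (i)
    (with-declared-type (i fixnum)
      (maybe-fast-incr i)))      ; expands via the FIXNUM branch

Outside any WITH-DECLARED-TYPE the query just comes back unexpanded and the
macro falls through to the generic case.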

	Paul
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai900s$o5m$2@luna.vcn.bc.ca>
In article <····························@posting.google.com>, Software
Scavenger wrote:
> ·········@becket.net (Thomas Bushnell, BSG) wrote in message news:<··············@becket.becket.net>...
> 
>> the context-of-use.  As far as I know, Common Lisp does not have any
>> particular facilities to provide referential transparency, and instead
>> counsels programmers that their macros shouldn't depend on their
>> lexical context.
> 
> How would it be useful for a macro expansion to refer to the lexical
> context of its definition?

Suppose you wanted the macro to use some global variables, but your
language doesn't have packages. So you might want to do something
like this:

  (let (global-variable)
    (defmacro (...) (.. `(... global-variable))))

But special variables and packages make this unnecessary.
From: Erik Naggum
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3237129893567597@naggum.net>
* Software Scavenger
| How would it be useful for a macro expansion to refer to the lexical context
| of its definition?

  If you have only one namespace for everything, which is a design flaw of
  Biblical proportions, you have to solve the problem that you have introduced
  to your programming environment by preventing a programmer's function and
  variable bindings, type definitions, etc, from clobbering the same referenced
  and required by the macro for its proper operation.  The point with all this
  hygiene and referential transparency is to defend the misguided notion that
  one namespace is not only sufficient, but superior to multiple namespaces.
  But rather than solve the problem with a fairly simple "rule" that functions
  are global and that you never clobber functions from a package you do not
  control, you now have to solve the problem of ensuring that clobbering
  functions does not cause massive damage to your entire system.

  For instance, if you have a macro that makes its caller call a "foo" when
  expanded, you have a dependency on the binding assumed by the macro.  If
  "foo" is a symbol in the Common-Lisp package, and if we momentarily ignore
  the rule that programmers may not clobber symbols in the Common-Lisp package,
  you would want the macro to refer to the binding assumed by the macro even if
  a programmer clobbered its binding with a lexical definition of "foo".

  You see that this is a non-problem in a properly designed language, and a
  major issue in a maldesigned language that introduces problems because it
  _believes_ in counterproductive principles of "simplicity".  The more you
  believe in something, the harder you push despite serious setbacks and
  concomitant problems.  Remember the belief in the simple assumption that the
  earth is the center of the universe and how it spawned immensely complex
  computations of planetary travel.  The right solution was to adopt a much
  simpler model that had the drawback of reducing the importance of humans to
  the Universe.  This was considered very threatening by those who believed
  very firmly in the universal importance of humans.  The same kind of failure
  to grasp that a single namespace causes problems is at the root of a number
  of very complex problems.

  Most of these problems evaporate if you adopt a perspective more conducive to
  the actual task of programming.  Insisting on the single namespace has caused
  a slower progression in computer science than we could have achieved with a
  deeper understanding of natural language grammars and of how people actually
  think in the design of artificial languages.  For instance, we have developed
  an excellent machinery to deal with context, yet define context-free grammars
  and think them superior to more complex grammars because they are simpler to
  implement.  This, sadly, causes people who struggle to get away from the bad
  design principles of programming languages to define much worse languages
  than they would have if there had been more theoretical support for better
  language design methodologies.  I would argue that the single namespace and
  the context-free grammar combined provide the worst possible foundation for
  the design of a programming language.

-- 
Erik Naggum, Oslo, Norway

Act from reason, and failure makes you rethink and study harder.
Act from faith, and failure makes you blame someone and push harder.
From: Jens Axel Søgaard
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d4326df$0$168$edfadb0f@dspool01.news.tele.dk>
Erik Naggum wrote:

> [ Note: This is not posted to comp.lang.scheme.
> Nevertheless, I predict that   we will see several
> explosions from Scheme users who somehow think that it is
> perfectly OK to label Common Lisp "non-hygienic", but not
> OK to label Scheme   "non-hygienic", because their strong
> _beliefs_ and _fears_ get in the way.   Also, personal
> attacks and other forms of hate mail from Scheme freaks
> should   be mailed to me privately.  If you have to exact
> revenge for blaspheming your   beliefs in cleanliness,
> please be so kind as so avoid a public spectacle. ]

The serious Scheme implementations offer you non-hygienic
macros as well (yes - I know they are not in the standard).
Then you have your own choice.

The situation is symmetric. Common Lispers can also choose.
Dorai Sitaram has made "Scheme Macros for Common Lisp."

    http://www.ccs.neu.edu/home/dorai/mbe/mbe-lsp.html

My personal guess is that he needs them to make
TeX2Page run on both Scheme and Common Lisp. I'm still somewhat
amazed that he can do that.

--
Jens Axel Søgaard
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87vg701zpl.fsf@becket.becket.net>
"Jens Axel S�gaard" <······@soegaard.net> writes:

> The serious Scheme implementations offer you non-hygienic
> macros as well (yes - I know they are not in the standard).
> Then you have your own choice.
> 
> The situation is symmetric. Common Lispers can also choose.
> Dorai Sitaram has made "Scheme Macros for Common Lisp."

As far as I know, Common Lisp systems do not guarantee referential
transparency, however.
From: Hannah Schroeter
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ai3dc2$26r$2@c3po.schlund.de>
[F'up c.l.l only]

Hello!

Thomas Bushnell, BSG <·········@becket.net> wrote:

>As far as I know, Common Lisp systems do not guarantee referential
>transparency, however.

Scheme doesn't either:

(define foo
  (let ((x 0))
    (lambda (y)
      (set! x (+ 2 x))
      (+ x y))))

(foo 1)
=> 3
(foo 1)
=> 5

Try Haskell 98 or Mercury or such if you want guaranteed referential
transparency (but don't use any of the popular Haskell extensions
or such!).

Kind regards,

Hannah.
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <877kje49og.fsf@becket.becket.net>
······@schlund.de (Hannah Schroeter) writes:

> [F'up c.l.l only]
> 
> Hello!
> 
> Thomas Bushnell, BSG <·········@becket.net> wrote:
> 
> >As far as I know, Common Lisp systems do not guarantee referential
> >transparency, however.
> 
> Scheme doesn't either:
> 
> (define foo
>   (let ((x 0))
>     (lambda (y)
>       (set! x (+ 2 x))
>       (+ x y))))
> 
> (foo 1)
> => 3
> (foo 1)
> => 5

Maybe you don't understand what Scheme programmers mean by
"referential transparency".

In the quoted code there is one variable named X, established by one
LET, and all the mentions of X are lexically within that binding and
refer to that one variable.

Similarly for Y.
From: Michael Hudson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <lk8z3th4e6.fsf@pc150.maths.bris.ac.uk>
·········@becket.net (Thomas Bushnell, BSG) writes:

> ······@schlund.de (Hannah Schroeter) writes:
> 
> > [F'up c.l.l only]
> > 
> > Hello!
> > 
> > Thomas Bushnell, BSG <·········@becket.net> wrote:
> > 
> > >As far as I know, Common Lisp systems do not guarantee referential
> > >transparency, however.
> > 
> > Scheme doesn't either:
> > 
> > (define foo
> >   (let ((x 0))
> >     (lambda (y)
> >       (set! x (+ 2 x))
> >       (+ x y))))
> > 
> > (foo 1)
> > => 3
> > (foo 1)
> > => 5
> 
> Maybe you don't understand what Scheme programmers mean by
> "referential transparency".

That's a different meaning from what I mean by the term, and what
(say) this page:

   http://ist.unibw-muenchen.de/kahl/reftrans.html

means by the term...

> In the quoted code there is one variable named X, established by one
> LET, and all the mentions of X are lexically within that binding and
> refer to that one variable.

But the value of the expression cannot be determined just by looking
at it -- it matters how often foo has been called...

Cheers,
M.

-- 
  If you don't use emacs, you're a pathetic, mewling, masochistic
  weakling and I can't be bothered to convert you.    -- Ron Echeverri
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m3n0sh2dzn.fsf@dino.dnsalias.com>
Christopher Barber <·······@curl.com> writes:
> Marco Antoniotti <·······@cs.nyu.edu> writes:
> > after you have defined your macro you can say
> > 
> >         (with-file-open (f "f.dat")
> >            (print "something" f))
> > 
> > There is no way to do that as easily in any other language. 
> 
> You can do this this in any language with a sufficiently powerful macro
> system, not just CL.

Indeed.  However, for this particular example, there isn't much
difference between using a macro and a higher order function.  So,
the example could be recast in Scheme as :-

  (with-open-file "f.dat"
    (lambda (f)
      (display "something" f)))

and in Smalltalk :-

  "f.dat" withOpenFile: [ f | f print: 'something' ]

for suitable definitions of with-open-file and withOpenFile
respectively.  The same approach could, of course, be used in CL.
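
For the record, a rough CL rendering of that higher-order-function version
(CALL-WITH-OPEN-FILE is a made-up name; CL already supplies the
WITH-OPEN-FILE macro, of course):

  (defun call-with-open-file (pathname function &rest open-args)
    (let ((stream nil))
      (unwind-protect
           (progn
             (setf stream (apply #'open pathname open-args))
             (funcall function stream))
        (when stream (close stream)))))

  (call-with-open-file "f.dat"
                       (lambda (f) (print "something" f))
                       :direction :output)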
From: Hannah Schroeter
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahmldu$kqq$2@c3po.schlund.de>
Hello!

Marco Antoniotti  <·······@cs.nyu.edu> wrote:

>[...]

>Suppose you want to open a file and make sure you close it when you
>are done, while handling errors.

>The C++/Java/SLDJ (Scripting Language Du Jour) idiom to do this is

>        FILE f = NULL;
>        try
>          {
>                f = open("f.dat", "w");
>                print(f, "something");
>          }
>        finally
>          {
>                close(f);
>          }

C++ doesn't even have "finally".

Try things like:
	FILE* f = 0;
	try {
		f = fopen(...);
		fprintf(f, ...);
	} catch (...) {
		fclose(f);
		throw;
	}
	fclose(f);

And even then, you're incorrect. The close is only correct iff
fopen() has succeeded. I.e.
	FILE* f = 0;
	f = fopen(...);
	try {
		fprintf(f, ...);
	} catch (...) {
		fclose(f);
		throw;
	}
	fclose(f);

But, of course, you could also use destructors in this case, as
another post has mentioned.

>apart from the fact that this sort of control structure was present in
>Lisp and CL way before it appeared in C/C++ (hence Java and your SLDJ)
>the CL equivalent is

>        (let ((f nil))
>          (unwind-protect
>             (progn
>                (setf f (open "f.dat" :direction :output))
>                (print "something" f))
>             (close f)))

You'd rather do:

(let ((f (open ...)))
  (unwind-protect
    (... do something on f ...)
    (close f)))

>Now, the interesting thing about CL is that you can write a macro that
>wraps the above piece of code in a nice way.

>        (defmacro with-file-open ((fvar file &key (direction :output))
>                                  &body forms)
>[...]

>There is no way to do that as easily in any other language. Especially
>your SLDJ!  Certainly, you do not get it compiled down to native
>machine code.

Yes, you can. Try your favourite functional programming language,
you can define something like

onOpenFile fileName mode function =
	...

in Haskell, SML, ocaml, ...

Usage is something like
  onOpenFile fileName mode
             (lambda fileHandle -> ... do something on fileHandle ...)

>[...]

>Moreover! Do you know that you can write a `try' macro in CL to make
>C++/Java/SLDJ programmers happy?

Is that really needed or do you just need to rename handler-case
into try? :-)

>Since you are so interested in data base programming, can you see the
>implications for a `with-transaction' macro?

Yep. And you might even abstract out the commonalities from the
individual macro definitions of with-open-file and with-transaction:
Some initialization, often with a binding, some main action, and
some cleanup which has to be executed always afterwards (i.e.
unwind-protect'ed). Macro writing macros are *cool*.
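
For instance, a rough sketch of such a macro-writing macro (names made up;
ACQUIRE and CLEANUP are backquoted templates that get evaluated when the
generated WITH- macro expands, following the UNWIND-PROTECT pattern above):

  (defmacro define-with-macro (name (var &rest args) acquire cleanup)
    `(defmacro ,name ((,var ,@args) &body body)
       `(let ((,,var nil))
          (unwind-protect
               (progn (setf ,,var ,,acquire)
                      ,@body)
            ,,cleanup))))

  (define-with-macro with-file-open (f file)
    `(open ,file :direction :output)
    `(when (and ,f (open-stream-p ,f)) (close ,f)))

  (define-with-macro with-transaction (tx db)     ; BEGIN-/END-TRANSACTION
    `(begin-transaction ,db)                      ; are hypothetical names
    `(when ,tx (end-transaction ,tx)))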

>[...]

>> So, I guess I agree with you, but it's unclear to me that the language
>> has to be able to extend itself on the fly.  More important to me is
>> the ability to change class definitions on the fly, which some
>> scripting languages now support.

>Great! Another application of Greenspun's Tenth Rule of
>Programming applied to SLDJ!

>`change-class' has been part of Common Lisp for ages (since the
>inception of CLOS).  *And* you still get the calls to `change-class'
>compiled to native machine language in Common Lisp.

I think the quote meant more something like defclass of an already
defined class name, including update-instance-for-redefined-class :-)

>[...]

Kind regards,

Hannah.
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6c3cu99gly.fsf@octagon.mrl.nyu.edu>
······@schlund.de (Hannah Schroeter) writes:

> Hello!
> 
        ...

> C++ doesn't even have "finally".

It has `catch (...)'.

> Try things like:
> 	FILE* f = 0;
> 	try {
> 		f = fopen(...);
> 		fprintf(f, ...);
> 	} catch (...) {
> 		fclose(f);
> 		throw;
> 	}
> 	fclose(f);
> 
> And even then, you're incorrect. The close is only correct iff
> fopen() has succeeded. I.e.
> 	FILE* f = 0;
> 	f = fopen(...);
> 	try {
> 		fprintf(f, ...);
> 	} catch (...) {
> 		fclose(f);
> 		throw;
> 	}
> 	fclose(f);

This is incorrect.  The last `fclose' (assuming the C library
`fclose') generates an error if `fopen' failed.  You need to at least
conditionalize the call to `fclose' to be reasonably correct.

        if (f != NULL) fclose(f);

to which you may add that `fclose' itself may fail and so on and so on.

Note that my code was "pseudo" at some level.  I wasn't writing C++ or
Java code.

> 
> But, of course, you could also use destructors in this case, as
> another post has mentioned.

To which I just answered.

> >apart from the fact that this sort of control structure was present in
> >Lisp and CL way before it appeared in C/C++ (hence Java and your SLDJ)
> >the CL equivalent is
> 
> >        (let ((f nil))
> >          (unwind-protect
> >             (progn
> >                (setf f (open "f.dat" :direction :output))
> >                (print "something" f))
> >             (close f)))
> 
> You'd rather do:
> 
> (let ((f (open ...)))
>   (unwind-protect
>     (... do something on f ...)
>     (close f)))

No, you don't because in this case you miss the errors generated by
OPEN when called with :error as value to :if-exists or
:if-does-not-exist etc etc.

I admit that my code above is not completely correct.  You really need

        (let ((f nil))
           (unwind-protect
              (progn
                 (setf f (open ...))
                 (do-something f))
              (when (and f (open-stream-p f)) (close f))))

> >Now, the interesting thing about CL is that you can write a macro that
> >wraps the above piece of code in a nice way.
> 
> >        (defmacro with-file-open ((fvar file &key (direction :output))
> >                                  &body forms)
> >[...]
> 
> >There is no way to do that as easily in any other language. Especially
> >your SLDJ!  Certainly, you do not get it compiled down to native
> >machine code.
> 
> Yes, you can. Try your favourite functional programming language,
> you can define something like
> 
> onOpenFile fileName mode function =
> 	...
> 
> in Haskell, SML, ocaml, ...
> 
> Usage is something like
>   onOpenFile fileName mode
>              (lambda fileHandle -> ... do something on fileHandle ...)

Sorry.  This does not count.  That is equivalent to passing functions
around and using the underlying exception semantics to deal with
exceptions themselves.

Try changing the `try <expression> with <match cases>' of Ocaml into
`handler_case <expression> cases <match cases>'.  You need 'p4' to do
that.  Nothing wrong with it, but it is not the same as in CL.

> >[...]
> 
> >Moreover! Do you know that you can write a `try' macro in CL to make
> >C++/Java/SLDJ programmers happy?
> 
> Is that really needed or do you just need to rename handler-case
> into try? :-)

Well, pretty much that :)
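
A minimal sketch of that renaming (a fuller TRY would presumably also take a
FINALLY clause built on UNWIND-PROTECT):

  (defmacro try (form &rest handler-clauses)
    `(handler-case ,form ,@handler-clauses))

  ;; (try (/ 1 0)
  ;;   (division-by-zero () :caught))    ; => :CAUGHT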

        ...

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cwurl81yv.fsf@octagon.mrl.nyu.edu>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> ······@schlund.de (Hannah Schroeter) writes:
> 
> > Hello!
> > 
>         ...
> 
> > C++ doesn't even have "finally".
> 
> It has `catch (...)'.

... meaning that it comes close to `finally'.

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Joe Marshall
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ywB%8.101273$_51.86389@rwcrnsc52.ops.asp.att.net>
"Hannah Schroeter" <······@schlund.de> wrote in message ·················@c3po.schlund.de...
> Hello!
>
> Marco Antoniotti  <·······@cs.nyu.edu> wrote:
>
> >        (let ((f nil))
> >          (unwind-protect
> >             (progn
> >                (setf f (open "f.dat" :direction :output))
> >                (print "something" f))
> >             (close f)))
>
> You'd rather do:
>
> (let ((f (open ...)))
>   (unwind-protect
>     (... do something on f ...)
>     (close f)))

If an interrupt occurs after the open, but before the unwind-protect
is entered, you could drop the stream.  This is a bit better:

(let ((f nil)
      (abort-p t))
   (unwind-protect
     (multiple-value-prog1
       (progn (setq f (open "f.dat" :direction :output))
               ... do something ....)
       (setq abort-p nil))
     (when f
       (close f :abort abort-p))))

But still, if an interrupt occurs after open constructs the stream,
but before the SETQ is performed, you can drop the stream.  In addition,
an interrupt in the cleanup form could abort before the close completes.

To be even safer, you ought to do something like this:

(let ((interrupts-enabled-p  #+allegro excl::*without-interrupts*
                             #+lispworks sys::*in-no-interrupts*)
      (f nil)
      (abort-p t)
      ;; Evaluate the arguments before entering the no interrupt context.
      (open-argument-0 "f.dat")
      (open-argument-1 :direction)
      (open-argument-2 :output))
  (multiple-value-prog1
      (let (#+allegro (excl::*without-interrupts* t)
            #+lispworks (sys::*in-no-interrupts* 1))
        (unwind-protect
            (multiple-value-prog1
                (progn (setq f (open open-argument-0 open-argument-1 open-argument-2))
                       (let (#+allegro (excl::*without-interrupts* interrupts-enabled-p)
                             #+lispworks (sys::*in-no-interrupts* interrupts-enabled-p))
                         #+lispworks
                         (unless sys::*in-no-interrupts*
                           (system::without-interrupt-check-for-interrupts))
                         ... do something ...))
              (setq abort-p nil))
          (when f
            (close f :abort abort-p))))
   #+lispworks
   (unless sys::*in-no-interrupts*
     (system::without-interrupt-check-for-interrupts))))

This ensures that the OPEN, CLOSE, and UNWIND-PROTECT cleanup forms cannot be
interrupted.  The drawback to this approach is that if, for some reason, OPEN
or CLOSE take a long time, the Lisp system will not respond to anything.
If there is a way to prevent asynchronous aborts from happening without
disabling interrupts altogether, then you can prevent the stream from
being accidentally lost, but still retain responsiveness.
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3ed506.6034567@news.verizon.net>
On 24 Jul 2002 11:16:26 -0400, Marco Antoniotti <·······@cs.nyu.edu>
wrote:
>> Pretty much any language that supports functions or methods can be
>> looked at as extensible.  C++'s overloaded operators are very powerful
>> that way.  I keep wishing someone had a SQL that allowed overloaded
>> operators (does anybody?) for user-defined types, but I guess the
>> solution is to use Java for database procedures.
>
>There are degrees of extensibility.  C++ operator overloading (sans
>multiple dispatching, something you have in CLOS :) ) goes a long way
>to allow you to "extend" the language.  But, always given that Tape,
>and Read/Write heads essentially allow for "extensibility", CL is
>still unparalled in its ability to "extend itself".

OK, it's unparalleled.

>Now, the interesting thing about CL is that you can write a macro that
>wraps the above piece of code in a nice way.
...
>There is no way to do that as easily in any other language. Especially
>your SLDJ!  Certainly, you do not get it compiled down to native
>machine code.

Wrappers are kewl.

Overloading operators are kewl.

Both are subject to abuse so that code becomes write-only.  From this
observation, I surmise that there may be a theoretical issue at stake
which would suggest limiting such features.  Until that is settled, I
maintain some skepticism about kewl features.

(Yes, yes, all features of all languages are subject to abuse, but
surely you know what I mean?)

>Essentially you have added a completely new construct to CL.  To do so
>in Java you need to write a complete parser (I know! I did it).  Busy
>minds are surely working right now to get this sort of things in your
>SLDJ (thus reinforcing Greenspun's Tenth).

Greenspun's Tenth Rule of Programming: "Any sufficiently complicated C
or Fortran program contains an ad-hoc, informally-specified bug-ridden
slow implementation of half of Common Lisp." 

Yes, well, maybe, sort of.  More like Turing's Ninth (which I invent
on the spur of the moment): Any sufficiently complex program contains
a half-assed version of a Turing machine.

Yes.  Absolutely.  And this is exactly what I *recommend* rather than
extend or warp the base language more than a very limited amount.

(Actually, both Greenspun's Tenth and Turing's Ninth are meaningless,
since virtually any interesting program is going to have internal
state and branching on data, and thus appears, from various improper
perspectives, to be exceeding its mandate to be a linear
transformation of input to output.)

>Since you are so interested in data base programming, can you see the
>implications for a `with-transaction' macro?

Actually, no.  You ever do any database programming?

>> There was a book about a zillion years ago, "The Art of the Metaobject
>> Protocol", about using CLOS and the art and science of modifying its
>> basic syntax and semantics, whether on the fly or offline.  I reviewed
>> the book for SIGArt and took the same line then, that the authors
>> never quite explained why it was a good idea to do that -- not just
>> extend the language, but warp even the basic features.
>
>You are entitled to this opinion.  Not that I agree with you.
>Especially when you do not explain what are "the basic features".

Hey, whatever.  At some point, it goes too far.  Redefining addition
to be a trinary operator, say.  See previous points.

>> So, I guess I agree with you, but it's unclear to me that the language
>> has to be able to extend itself on the fly.  More important to me is
>> the ability to change class definitions on the fly, which some
>> scripting languages now support.
>
>Great! Another application of Greenspun's Tenth Rule of
>Programming applied to SLDJ!

That's one view.  Another is the question, as I think I've addressed
it several times now in this message, and was apparently Shayne's
original subject, of what might comprise the best set of basic and
fixed features in a (OO) language.

>`change-class' has been part of Common Lisp for ages (since the
>inception of CLOS).  *And* you still get the calls to `change-class'
>compiled to native machine language in Common Lisp.
>
>Moreover, can you explain to us how you did not appreciate "The Art of
>the MOP" and at the same time like one of the operations that fully
>require all its power to be understood correctly?

I'm advocating you don't worry about changing the tires on a moving
car.  I don't want to change anything meta -- I want to differentiate
between the meta and the actual.  That's the basic name of my game.

You may or may not agree, but I think that's clear enough.

Joshua Stern
From: Daniel Barlow
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87n0sg7ocg.fsf@noetbook.telent.net>
················@gte.net (JRStern) writes:

> Hey, whatever.  At some point, it goes too far.  Redefining addition
> to be a trinary operator, say.  See previous points.

* (+ 1 2 3)
6

"Too far" is a matter of personal taste and/or project standards, of
course, but I find it hard to think of a situation where that usage
would be frowned on.


-dan

-- 

  http://ww.telent.net/cliki/ - Link farm for free CL-on-Unix resources 
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6c4renq1vy.fsf@octagon.mrl.nyu.edu>
Daniel Barlow <···@telent.net> writes:

> ················@gte.net (JRStern) writes:
> 
> > Hey, whatever.  At some point, it goes too far.  Redefining addition
> > to be a trinary operator, say.  See previous points.
> 
> * (+ 1 2 3)
> 6
> 
> "Too far" is a matter of personal taste and/or project standards, of
> course, but I find it hard to think of a situation where that usage
> would be frowned on

Hey, (Common) Lispers go even further. '+' is at least a 50-ary
operator :) Actually I am wrong. '+' is also a zero-ary (for lack of a
better word) operator.

        cl-prompt> (+)
        0
        cl-prompt> (*)
        1
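
Those identity values are just what you want when the argument list
might be empty, e.g. through APPLY or REDUCE:

        cl-prompt> (apply #'+ '())
        0
        cl-prompt> (reduce #'* '())
        1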

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
715 Broadway 10th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Paolo Amoroso
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ss4=PWv7aY9LVxRIp7Te+yAZ3HKz@4ax.com>
[Followup to comp.lang.lisp only]

On Tue, 23 Jul 2002 23:56:51 GMT, ················@gte.net (JRStern) wrote:

> There was a book about a zillion years ago, "The Art of the Metaobject
> Protocol", about using CLOS and the art and science of modifying its
> basic syntax and semantics, whether on the fly or offline.  I reviewed
> the book for SIGArt and took the same line then, that the authors
> never quite explained why it was a good idea to do that -- not just
> extend the language, but warp even the basic features.

This paper discusses some applications of a MOP:

Open Implementations and Meta Object Protocols
http://www.parc.xerox.com/spl/groups/eca/pubs/papers/Kiczales-TUT95/for-web.pdf

I don't know whether they qualify as language extensions the way you mean
them.


Paolo
-- 
EncyCMUCLopedia * Extensive collection of CMU Common Lisp documentation
http://www.paoloamoroso.it/ency/README
From: Michael Sullivan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <1ffrtsg.1its0r7clhc5lN%michael@bcect.com>
Shayne Wissler <·········@yahoo.com> wrote:

> JRStern wrote:
> 
> >> - What other languages have at least one of the above?
> > 
> > GWBasic could dynamically load source into a running program.  Heck,
> > most scripting languages can write text to a disk file then invoke the
> > file, and it's not all that different.

> Hold on--the requirement is that the generated program can access the
> variables and functions of the generating program. Otherwise C would count
> as having this feature because you can generate C source, compile it, and
> execute it from a C program.

Well... C still counts as having this feature in a very roundabout way,
since your generator can publish an API to all its relevant variables
and functions, and include headers for it with the source it produces,
compiles and runs.

The fact is, you can implement Lisp in any Turing-complete language, so
in some sense they all have this feature if you allow any amount of
program complexity to achieve it.   But think about how much effort is
required to write a C program that does this.  No matter how elegant
working this way makes your problem solution, the engineering required
to make this idea work in C (or most scripting languages) will likely
dwarf the difficulties in finding another way to represent the program
without doing this.

In practice, doing this sort of thing requires a grammar generator for
the language.  Lisp, having a very simple grammar and ready-made tools
to call the compiler/interpreter, makes it easy.  Other languages with
very simple grammars can (and sometimes do) also make it fairly easy.
Most commonly used languages that are not related to Lisp either have much
more complicated grammars or give up a great deal of expressiveness and
ability to handle abstraction (often both).
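
For instance, invoking the compiler at run time is literally a one-liner
in Common Lisp, using nothing beyond the standard COMPILE function:

  (funcall (compile nil '(lambda (x) (* x x))) 7)   ; => 49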


Michael

-- 
Michael Sullivan
Business Card Express of CT             Thermographers to the Trade
Cheshire, CT                                      ·······@bcect.com
From: Rob Warnock
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahj77s$aqvm0$1@fido.engr.sgi.com>
JRStern <················@gte.net> wrote:
+---------------
| As others have said, you just build up source text in a variable,
| then Eval() the variable.
+---------------

Correction: In Common Lisp, EVAL works on Lisp *objects*, not
text strings per se. Or more precisely, if you EVAL a text string,
you only get back the string itself (since strings are self-evaluating
literal objects):

	(eval "(+ 2 3)") ==> "(+ 2 3)"

What you usually want to feed to EVAL is a *data structure* that
represents a valid Lisp program, e.g.:

	(eval (list '+ 2 3)) ==> 5

It's the READ procedure (including variants such as READ-FROM-STRING
and the READ that's an implicit part of LOAD), with all of its
programmability (see CLHS "2.1.1 Readtables") that converts source
text into Lisp data structures.

Those resulting data structures might or might not get passed to EVAL,
depending on who did the READ and why. In the following case, it does:

	(eval
	  (read-from-string
	    (coerce
	      (list #\( #\+ #\space #\2 #\space #\3 #\))
	      'string)))
	==> 5

But in other cases, the application just wants to use the resulting
data structure for its own purposes.


-Rob

-----
Rob Warnock, 30-3-510		<····@sgi.com>
SGI Network Engineering		<http://www.rpw3.org/>
1600 Amphitheatre Pkwy.		Phone: 650-933-1673
Mountain View, CA  94043	PP-ASEL-IA

[Note: ·········@sgi.com and ········@sgi.com aren't for humans ]  
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3d7048.3225397@news.verizon.net>
Rob,

Thanks, I knew it wasn't really strings in Lisp, but between my fuzzy
memory of the right terminology and trying to simplify for the
discussion, I said whatever.

Can you clarify, in Common Lisp when you do the Eval(), does it have
access to the current execution context, or does it evaluate in a
separate context?  I think it's in-context, ... but I never did that
much with CL, and it was a long time ago.

Joshua Stern


From: Jochen Schmidt
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ahjsij$5ku$1@rznews2.rrze.uni-erlangen.de>
JRStern wrote:

> Rob,
> 
> Thanks, I knew it wasn't really strings in Lisp, but between my fuzzy
> memory of the right terminology and trying to simplify for the
> discussion, I said whatever.

Ok - but rendering them as strings takes away an important point about lisp 
and is not really a valid simplification.

> Can you clarify, in Common Lisp when you do the Eval(), does it have
> access to the current execution context, or does it evaluate in a
> separate context?  I think it's in-context, ... but I never did that
> much with CL, and it was a long time ago.

In CL, EVAL evaluates in the null lexical environment but in the current
dynamic environment (CL has both lexically scoped variables and dynamically
scoped "special" variables).

(let ((a 1)) (eval '(+ a a)))
-> ERROR  "unbound variable a"

(defvar *a* 1) ; A global special variable *a*

(let ((*a* 2)) 
  (eval '(+ *a* *a*)))
-> 4

--
http://www.dataheaven.de
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6c1y9upgy9.fsf@octagon.mrl.nyu.edu>
Jochen Schmidt <···@dataheaven.de> writes:

> JRStern wrote:
> 

        ...

> > Can you clarify, in Common Lisp when you do the Eval(), does it have
> > access to the current execution context, or does it evaluate in a
> > separate context?  I think it's in-context, ... but I never did that
> > much with CL, and it was a long time ago.
> 
> In CL EVAL evaluates in a lexical null-environment but in the actual dynamic 
> environment (CL has both - lexically and dynamically scoped "special" 
> variables).
> 
> (let ((a 1)) (eval '(+ a a)))
> -> ERROR  "unbound variable a"
> 
> (defvar *a* 1) ; A global special variable *a*
> 
> (let ((*a* 2)) 
>   (eval '(+ *a* *a*)))
> -> 4

Just for the record.  It is a (minor) misfortune that CL did not include
better "environment" handling alongside EVAL.

Secondly, the use of EVAL is highly discouraged in common CL
practice.  As a matter of fact, you do not need it that much at all.
Essentially the macro system allows you to work around EVAL's shortcomings
and to produce better code. (Jochen, I know you know this. I just felt
it needed clarification).
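
For instance, a typical EVAL temptation usually reduces to APPLY (or,
when the shape of the code is known at compile time, to a macro).  A
small sketch -- SUM-WITH-EVAL and SUM are made-up names:

  ;; Tempting but needless: build a form and EVAL it at run time.
  (defun sum-with-eval (numbers)
    (eval (cons '+ numbers)))

  ;; Idiomatic: no run-time evaluator required.
  (defun sum (numbers)
    (apply #'+ numbers))            ; or (reduce #'+ numbers)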

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Joel Ray Holveck
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y7clm81aee6.fsf@sindri.juniper.net>
> Just for the record.  It is a (minor) misfortune that CL did not include
> better "environment" handling alongside EVAL.

Every now and then I come back and think about this.  If you had The
Environment Protocol Of Your Dreams (including integration with eval,
macroexpand, etc, and whatever else you wanted), what would it
include?  How would it help?

I'm asking more out of randomness than any particular objective here,
so feel free to blow the question off, but it's nice to know how my
favorite language is circumscribed.

joelh
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cheip9mnp.fsf@octagon.mrl.nyu.edu>
Joel Ray Holveck <·····@juniper.net> writes:

> > Just for the record.  It is a (minor) misfortune that CL did not include
> > better "environment" handling alongside EVAL.
> 
> Every now and then I come back and think about this.  If you had The
> Environment Protocol Of Your Dreams (including integration with eval,
> macroexpand, etc, and whatever else you wanted), what would it
> include?  How would it help?

Maybe it would not help that much all the time.  Essentially I see
EVAL very rarely in newer code and I practically do not use it at all.

Yet.  There are times when it would turn out to be useful, and it
would add elegance to the language (of course, complicating its
implementation).  So, I think it is a minor misfortune, and one I can
happily live with.

> 
> I'm asking more out of randomness than any particular objective here,
> so feel free to blow the question off, but it's nice to know how my
> favorite language is circumscribed.
> 

Well, I think I just blew the question off :)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
719 Broadway 12th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: sv0f
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <none-2407021205230001@129.59.212.53>
In article <···············@sindri.juniper.net>, Joel Ray Holveck
<·····@juniper.net> wrote:

>Every now and then I come back and think about this.  If you had The
>Environment Protocol Of Your Dreams (including integration with eval,
>macroexpand, etc, and whatever else you wanted), what would it
>include?  How would it help?

Or: In what ways were the environment handling facilities proposed in
CLtL2 (pp. 207-214) but excluded from the ANSI standard deficient?

(This may be in the Hyperspec...)
From: Kent M Pitman
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <sfwn0sh9gsb.fsf@shell01.TheWorld.com>
····@vanderbilt.edu (sv0f) writes:

> In article <···············@sindri.juniper.net>, Joel Ray Holveck
> <·····@juniper.net> wrote:
> 
> >Every now and then I come back and think about this.  If you had The
> >Environment Protocol Of Your Dreams (including integration with eval,
> >macroexpand, etc, and whatever else you wanted), what would it
> >include?  How would it help?
> 
> Or: In what ways were the environment handling facilities proposed in
> CLtL2 (pp. 207-214) but excluded from the ANSI standard deficient?
> 
> (This may be in the Hyperspec...)

They were removed because they were faulty.  A number of problems were 
discovered and we tried for a while to patch them.  It became clear that
we would not finish in time, so we removed them rather than have a bunch
of buggy design be part of the unchangeable spec.

It was assumed that vendors might continue to work on these and come up
with a suitable substrate for next-round standardization.  Not much seems
to have occurred in this way.

As a rule, though, please take note:

The things that are in CLTL2 were about MACROEXPAND and friends.  As a
rule, I think there is very little dispute that environment management
facilities for macroexpansion are in order since these things contain
information known at compile time, and consequently aid efficient
compilation.
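
For a concrete sense of what that compile-time environment buys you: a
macro can receive the environment via &ENVIRONMENT and hand it to
MACROEXPAND so that locally defined macros are visible.  A small sketch
(EXPAND-AND-SHOW is a made-up name):

  (defmacro expand-and-show (form &environment env)
    `(quote ,(macroexpand form env)))

  (macrolet ((twice (x) `(* 2 ,x)))
    (expand-and-show (twice 3)))       ; => (* 2 3)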

HOWEVER, run-time environments, such as are used by EVAL, are an entirely
different beast.  Claiming you want runtime environment support pushes
almost to the point of saying you don't want efficient compilation, since
specifying what evaluation environment will be needed is very very hard
if you're talking about ever asking the question "what is my lexical 
environment now?"

To see the issue, consider the following simple case:

 (let ((x 5))
   (values (+ x x) (get-lexical-environment)))

The question is: what is the value of X in the lexical environment
that GET-LEXICAL-ENVIRONMENT would want to return?  In most high-quality
compilers, (let ((x 5)) (+ x x)) is going to compile to the constant 10,
with no runtime call to + and no binding of x.  In systems where
you have access to the runtime lexical environment, you effectively have
to inhibit some number of compiler optimizations and transformations if
the runtime user is to have any expectation at all about what's going on.

If all you mean by runtime environment for EVAL is the ability to
construct a set of special variable bindings, you can already do this
with PROGV.
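
For example, with nothing beyond standard PROGV:

  (progv '(x y) '(3 4)
    (eval '(+ x y)))                   ; => 7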

If what you mean by runtime environment for EVAL is the ability to add
FLET or MACROLET forms, you can of course, just use CONS (or backquote or
whatever) to wrap the form you want to evaluate with suitable extra context.
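
For example, a sketch of consing up such a wrapper before evaluating it:

  (let ((form '(square 4)))
    (eval `(flet ((square (x) (* x x))) ,form)))   ; => 16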

If you want something else, you're best to detail it so that people can
tell you if there's a better way to do this.  
From: JRStern
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3d3dee09.35418268@news.verizon.net>
On Tue, 23 Jul 2002 17:29:03 +0200, Jochen Schmidt <···@dataheaven.de>
wrote:
>> Thanks, I knew it wasn't really strings in Lisp, but between my fuzzy
>> memory of the right terminology and trying to simplify for the
>> discussion, I said whatever.
>
>Ok - but rendering them as strings takes away an important point about lisp 
>and is not really a valid simpification.

Mea maximum culpa.

>> Can you clarify, in Common Lisp when you do the Eval(), does it have
>> access to the current execution context, or does it evaluate in a
>> separate context?  I think it's in-context, ... but I never did that
>> much with CL, and it was a long time ago.
>
>In CL EVAL evaluates in a lexical null-environment but in the actual dynamic 
>environment (CL has both - lexically and dynamically scoped "special" 
>variables).
>
>(let ((a 1)) (eval '(+ a a)))
>-> ERROR  "unbound variable a"
>
>(defvar *a* 1) ; A global special variable *a*
>
>(let ((*a* 2)) 
>  (eval '(+ *a* *a*)))
>-> 4

Thanks.

J.
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87it34zpyc.fsf@becket.becket.net>
················@gte.net (JRStern) writes:

> Mea maximum culpa.

You mean "mea maxima culpa".
From: Marco Antoniotti
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y6cy9bzon3k.fsf@octagon.mrl.nyu.edu>
·········@becket.net (Thomas Bushnell, BSG) writes:

> ················@gte.net (JRStern) writes:
> 
> > Mea maximum culpa.
> 
> You mean "mea maxima culpa".

The complete sentence should be

        "mea culpa, mea culpa, mea maxima culpa"

:)

Cheers

-- 
Marco Antoniotti ========================================================
NYU Courant Bioinformatics Group        tel. +1 - 212 - 998 3488
715 Broadway 10th Floor                 fax  +1 - 212 - 995 4122
New York, NY 10003, USA                 http://bioinformatics.cat.nyu.edu
                    "Hello New York! We'll do what we can!"
                           Bill Murray in `Ghostbusters'.
From: Thomas Bushnell, BSG
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <871y9ru0lj.fsf@becket.becket.net>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> ·········@becket.net (Thomas Bushnell, BSG) writes:
> 
> > ················@gte.net (JRStern) writes:
> > 
> > > Mea maximum culpa.
> > 
> > You mean "mea maxima culpa".
> 
> The complete sentence should be
> 
>         "mea culpa, mea culpa, mea maxima culpa"
> 

Well, sure;
From: Michael Sullivan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <1ffwiov.1lxps97jlbka3N%michael@bcect.com>
Thomas Bushnell, BSG <·········@becket.net> wrote:

> ················@gte.net (JRStern) writes:
> 
> > Mea maximum culpa.
> 
> You mean "mea maxima culpa".

I thought he meant "mea maxima fucking culpa."


Michael, but what do I know.

-- 
Michael Sullivan
Business Card Express of CT             Thermographers to the Trade
Cheshire, CT                                      ·······@bcect.com
From: Christian Lynbech
Subject: The essence of lisp
Date: 
Message-ID: <ofznwig2jt.fsf@situla.ted.dk.eu.ericsson.se>
[ if anything is going to demonstrate lack of c.l.l maturity it is to
  start a thread with that kind of subject :-) ]

The talk of lisp and unique features has prompted me to post my own
little pet theory as to what makes lisp the One True Programming Language.

Lisp in general and Common Lisp in particular has lots of features,
many of which can be found in other languages as well, including some
without direct heritage such as Dylan.

It's kind of like the reverse of Greenspun's 10th rule of programming (I
suggest we name it the Greenspun Corollary): Any programming language
can be turned halfway into an ad hoc informally-specified bug-ridden
slow version of Common Lisp through a lot of hard work.

So why can't one just ride all the way down that road and add stuff to
say Java and have the best of both worlds? What is lost when moving
features, one at a time from Lisp to other languages?

I believe that the essential quality that Lisp has, the one that all
the other features ultimately rely upon and contribute to, 
is *linguistic flexibility*.

Think about the process of programming, what is it? It is a
conversation between the programmer and the machine in which the
programmer is explaining to the machine how it must go about solving a
particular problem, in other words a linguistic process.

This process is however very difficult because there is a huge gap
between the abstraction level of the programmers understanding of the
problem (it would be on par with "write me an accounting system") and
the level of abstraction the machine is able to understand. So not
only must the programmer figure out how to phrase the explanation, but
the programmer must at the same time enhance his or her understanding
of the problem in order to be able to put the solution into the right
words.

In order to cross the gap of abstraction levels, a bridge must be
built, and the building blocks of such a bridge are abstractions.
These abstractions are essentially linguistic artifacts and thus the
linguistic power of the tool becomes the dominant (even if heavily
obscured) quality affecting the efficiency by which the programmer is
able to work.

Efficiency is important to remember here. Working in a language with a
low degree of linguistic power does not mean that the program cannot
be written, it only requires more resources to do so. This is simply
the Turing tar pit at play. Mathematically we know that all
programming languages are equally powerful, meaning that we know that
the *result* will be the same, no matter which tools we employ, but
the *efficiency* with which the result is reached may differ
radically. Our old math professors may not have cared much how we
arrive at programs, but the people who sign our paychecks definitely
will! (Allegedly, a mathematician at my old university at one point
dismissed all of computer science as nothing more than applied monad
theory.)

The linguistic power of Lisp is however not very easy to transfer to
other programming languages without turning them into Lisp in the
process. It is not just macros or sequence functions or mapping
operators or higher order functions, but also symbols (what other
programming language would allow spaces in function names?) and program
structure (as opposed to program text) and other more subtle instances
of the same.
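
A small illustration of the point about symbols, using the standard
|...| escape syntax:

  (defun |add them up| (&rest numbers)
    (apply #'+ numbers))

  (|add them up| 1 2 3)                ; => 6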


------------------------+-----------------------------------------------------
Christian Lynbech       | Ericsson Telebit, Skanderborgvej 232, DK-8260 Viby J
Phone: +45 8938 5244    | email: ·················@ted.ericsson.se
Fax:   +45 8938 5101    | web:   www.ericsson.com
------------------------+-----------------------------------------------------
Hit the philistines three times over the head with the Elisp reference manual.
                                        - ·······@hal.com (Michael A. Petonic)
From: Nils Goesche
Subject: Re: The essence of lisp
Date: 
Message-ID: <lklm82mx9h.fsf@pc022.bln.elmeg.de>
Christian Lynbech <·················@ted.ericsson.se> writes:

> [ if anything is going to demonstrate lack of c.l.l maturity it is to
>   start a thread with that kind of subject :-) ]

I would be surprised if anybody disagreed with you, as you
(thankfully) didn't crosspost to comp.object ;-)

> The linguistic power of Lisp is however not very easy to transfer to
> other programming languages without turning them into Lisp in the
> process. It is not just macros or sequence functions or mapping
> operators or higher order functions, but also symbols (what other
> programming language would allow spaces in function names) and
> program structure (as opposed to program text) and other more subtle
> instances of the same.

I think this is true.  One thing you didn't mention is object
identity, as measured by EQL, which seems to be regarded as very
important by Kent Pitman (``Identity, identity, identity!'').  I have
a feeling that he is right on this, although I'll probably need quite
a while longer until I see clearly why.  Of course, we are identifying
symbols by their identity, but I find myself doing this with other
objects in Lisp, too, more and more often, and much more so than in
other languages, even functional ones, although I don't know yet why.
This might also be the key for understanding that dynamic typing is
not only a matter of taste, as if you could remove it from Lisp
leaving all other things equal, but essential: Objects (with identity)
have types, not variables.
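
A minimal illustration of that distinction between identity and
structural equality:

  (let ((x (list 1 2)))
    (eql x x))                    ; => T,   the very same object
  (eql (list 1 2) (list 1 2))     ; => NIL, equal contents, distinct objects
  (equal (list 1 2) (list 1 2))   ; => T,   structural equality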

Just a thought.

Regards,
-- 
Nils Goesche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID 0x42B32FC9
From: David Van Camp
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <uAH69.10200$WJ3.1054088@news1.news.adelphia.net>
Since no one else seems to have mentioned it, the language system FORTH
provided all this and more way back around the same time frame as Lisp
(70's). Indeed, while Lisp was criticized as slow and unfriendly, Forth was
rapidly gaining popularity as an interpreted system with speeds approaching
native C.

However, Forth's big problem was its reliance on Reverse Polish Notation
(Lisp uses Polish Notation -- i.e. not reversed), which is nearly unreadable
by humans. Postscript's underlying macro language was based on Forth. Many
other interpreted languages' underlying macro languages were based on Forth
(including the Java Runtime Environment [JRE], I've heard). However, human
programmers, with only small numbers of exceptions, did not like reverse
polish notation. They could not understand it. So the language never became
popular.

But, AFAIK the language has not died. I've heard that OO extensions were
added and some still think it is the greatest language ever devised. It had
(has?), I've heard, a popular following in real-time games and other
specialized (particularly real-time) domains (My brother -- a special
effects artist with a Grammy to his name -- told me it has or had a popular
following in motion picture special effects.) I don't know for sure. But it
certainly is a very interesting language even if outdated.

One of the most mind-expanding books on software development I ever read was
"Thinking in Forth" by I cannot remember who. This was a really great book
on modular, iterative programming concepts. I learned more from this book
than many others I've read -- in spite of the fact that I never did much with
Forth. It was better, IMO, than Beck's XP book -- and more than 15 years
earlier. It was a truly visionary work!

But it was NOT Bruce Eckel's work -- he's never written anything I know of
that is even remotely similar, despite the naming similarities (I have no
idea if he had any awareness of the book I described above -- certainly
nothing in any of his books that *I* have read indicates any such awareness.)

The "Thinking in Forth" book I read is more closely related to "Zen and the
Art of Motorcycle Maintenance" except it was (somewhat -- but not very)
specific to the Forth programming language. (And, for those who do not know,
neither book had anything to do with motorcycles nor Java nor C++, but
everything to do with good software development practices... :)

BTW, I mean no negative criticism of Bruce here; he is a fine writer and
speaker -- I just feel it important to make it clear, due to the similarity
of titles, that the book I am referring to is a very different book from (and
pre-dates) any in Eckel's "Thinking in..." series.
--
David Van Camp.com :: Software Development Consulting
Patterns, Reuse, Software Process Improvement
http://www.davidvancamp.com

Visit the OO Pattern Digest :: http://patterndigest.com
A catalog of condensed patterns, books and related resources

"Shayne Wissler" <·········@yahoo.com> wrote in message
··························@rwcrnsc53...
> At http://www.paulgraham.com/diff.html, Paul Graham says that the feature
of
> having the "whole language always available" is unique to lisp.
>
> With lisp, he iterates that you can:
>  1) Compile and run code while reading
>  2) Read and run code while compiling
>  3) Read and compile code at run-time
>
> Although I'm clear on what is technologically possible with respect to
> having a compiler available at run-time, I'm not precisely clear on what
> Paul means in the above. I *think* he means:
>
> 1. While parsing a lisp program, you can compile and invoke some of the
code
> you already parsed to help parse the subsequent code, allowing you to
modify
> the syntax on the fly. If this is correct, what are the limitations? Can I
> write a lisp program that starts out as lisp and starts looking like C++?
>
> 2. While compiling a lisp program, you can execute special-purpose code
that
> makes semantic transformations of the lisp executable. When you see a
given
> input program, you can generate whatever code you want as a function of
it.
> This is called "macros" in lisp, but it is nothing like textual macros;
> really it's a program generating another program at the binary or
byte-code
> level.
>
> 3. A running lisp program can read (or write), compile, and execute any
> program, in the current program's execution context. I.e., if the newly
> loaded program makes references to variables and functions that are only
> defined in the original program, they are dynamically linked and
accessible
> as the program is compiled.
>
> A few questions:
>  - Have I got it right?
>  - What are the limitations?
>  - What other languages have at least one of the above?
>  - Is it really true that no other language has all of the above?
>
>
> Shayne Wissler
>
>
>
From: Espen Vestre
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <kwofc44myg.fsf@merced.netfonds.no>
"David Van Camp" <····@davidvancamp.com> writes:

> But, AFAIK the lanugage has not died.

By no means. It's the native language of Open Firmware, the boot
firmware (i.e. it serves the same purpose as, but is superior to, BIOS
on PC hardware) used by Apple and Sun computers.

(http://www.openfirmware.org)
-- 
  (espen)
From: Lars Brinkhoff
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <858z38a562.fsf@junk.nocrew.org>
"David Van Camp" <····@davidvancamp.com> writes:
> One of the most mind-expanding books on software development I ever
> read was "Thinking in Forth" by I cannot remember who.

"Thinking Forth" by Leo Brodie?
From: Herb Martin
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <YCL69.286043$q53.9210416@twister.austin.rr.com>
> > One of the most mind-expanding books on software development I ever
> > read was "Thinking in Forth" by I cannot remember who.
>
> "Thinking Forth" by Leo Brodie?

"Starting Forth" also by Brodie was one of the finest books
on programming ever written.  It didn't have much readership
outside the Forth community, but like "Software Tools" and
"Software Tools in Pascal", both by Kernighan and Plauger,
it demonstrated how to write good programs.

The fact that each of these books presented its programs in a specific
language was largely secondary.

Forth was very popular as an embedded controller language,
including many early video arcade games.  Originally invented
by Charles Moore as a portable language to control astronomical
telescopes, Forth traveled with its own mini disk operating system.

The value of this was high, but the obscurity of dealing with its
'screens' as a disk abstraction was probably more of a hindrance
to new Forth programmers than the stack-oriented reverse polish
syntax.

For those that needed the disk abstraction it was however a big
win.

Like Lisp, when the object oriented craze hit, the Forth (and Lisp)
programmers just sat down and started typing until they had 'objects'
defined in the language(s).  No new syntax was required to add
object oriented concepts to either language.  (New syntax might help, but
the languages were capable of defining the necessary extensions
without altering the underlying model.)

The end result in both Lisp and Forth was that if you wanted to write
a pure object oriented program there were practically no compromises
required to use these languages that way.

Ok, I admit it -- I loved Forth, and I still enjoy Lisp (<grin> when I can
get it.)

Herb Martin
Try ADDS for great Weather too:
http://adds.aviationweather.noaa.gov/projects/adds

"Lars Brinkhoff" <·········@nocrew.org> wrote in message
···················@junk.nocrew.org...
> "David Van Camp" <····@davidvancamp.com> writes:
> > One of the most mind-expanding books on software development I ever
> > read was "Thinking in Forth" by I cannot remember who.
>
> "Thinking Forth" by Leo Brodie?
From: David Van Camp
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <_0S69.11186$WJ3.1236332@news1.news.adelphia.net>
"Lars Brinkhoff" <·········@nocrew.org> wrote in message
···················@junk.nocrew.org...
> "David Van Camp" <····@davidvancamp.com> writes:
> > One of the most mind-expanding books on software development I ever
> > read was "Thinking in Forth" by I cannot remember who.
>
> "Thinking Forth" by Leo Brodie?

Yes, that's it! Thanks.

dvc
From: Lieven Marchand
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87n0roxqwo.fsf@wyrd.be>
"David Van Camp" <····@davidvancamp.com> writes:

> One of the most mind-expanding books on software development I ever read was
> "Thinking in Forth" by I cannot remember who. This was a really great book
> on modular, iterative programming concepts. I learned more from this book
> than many others I've read -- inspite of the fact that I never did much with
> Forth. It was better, IMO, than Beck's XP book -- and more than 15 years
> earlier. It was a truly visionary work!

Thinking Forth by Brodie.

It is indeed a great book, and only incidentally about Forth and more
about program design. The companion volume Starting Forth is intended
to teach you the language. The books are even being reprinted by the
Forth Interest Group.

-- 
Bored, now.
Lieven Marchand <···@wyrd.be>
From: Stephen J. Bevan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <m31y90xjwz.fsf@dino.dnsalias.com>
"David Van Camp" <····@davidvancamp.com> writes:
> Since no one else seems to have mentioned it, the language system FORTH
> provided all this an more way back around the same time frame as Lisp
> (70's).

Lisp was one of the languages that Moore knew about when he first
started to dabble in the early 60s with what would later become
Forth (see http://www.colorforth.com/HOPL.html).


> ... Postscript's underlying macro language was based on Forth.

The designers of PostScript don't quite see it like that.
See the PostScript reference manual for their view.


> Many other interpreted languages' underlying macro languages were
> based on Forth (including the Java Runtime Environment [JRE], I've heard.).

The JVM uses a stack based byte-code instruction set.  The only thing
it has in common with Forth is that it uses a stack (not two like
Forth), like a lot of virtual machines that have been designed over
the last 40 years.  Stack based instruction sets (Burroughs B5500)
were one of the influences in the design of Forth (again see
http://www.colorforth.com/HOPL.html). 


> But, AFAIK the lanugage has not died. I've heard that OO extensions were
> added and some still think it is the greatest language ever devised.

There are no OO features in ANS Forth, but there are some Forths that
have OO features and there are packages you can download that add
various OO features (the smallest being about 12 lines long IIRC).


> One of the most mind-expanding books on software development I ever read was
> "Thinking in Forth" by I cannot remember who.

  Thinking FORTH -- A Language and Philosophy for Solving Problems
  Leo Brodie
  Prentice Hall
  1984
From: Christopher Browne
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajghfg$1ara1f$1@ID-125932.news.dfncis.de>
"David Van Camp" <····@davidvancamp.com> wrote:
> However, Forth's big problem was it's reliance on Reverse Polish
> Notation (Lisp uses Polish Notation -- eg. not reversed), which is
> nearly unreadable by humans. 

No, the _BIG_ problem with Forth was the BLOCKS controversy.  

On the one hand, BLOCKS are very cool, allowing you to REALLY easily
program your own customized Virtual Memory system.  Doing everything
with 1K blocks is really slick, especially on minuscule hardware where
Forth _is_ the operating system.

On the other hand, modern OSes have this newfangled thing called
_FILES_, which tend to be much more efficient at storing programs
(because whitespace can simply evaporate away), and people expect
well-behaved applications to be able to interoperate with other
applications _at the OS level_, so that for Forth to pretend that it
_is_ the OS is totally unfriendly.

This "war" had a lot to do with the lack of advancement of Forth onto
"modern" platforms in any big way.  

(Is this the whole story?  Certainly not.  Others might point to
things other than BLOCKS.  But it points to stuff that is likely a
bigger deal than notation...  I think there's also an issue
that Forth was a "practical" language, eschewing the sophistication of
academia, and that there was a lot of sneering back and forth that
discouraged educational use of Forth...)

> Postscript's underlying macro language was based on Forth.

"underlying macro language" doesn't make a terribly lot of sense, and
it is much fairer to say that it was found to be _similar to_ Forth.
Folks from Adobe have claimed otherwise, and I don't see any reason to
_demand_ the claim that Adobe looked at Forth in order to come up with
the design of Postscript.

It seems much more likely to me that the presence of stack oriented
notations in both languages represents coincidence, as opposed to one
having inspired the other.

To be sure, stacks _are_ important, and _are_ widely used in _many_
languages to store arguments, and for two languages to have come up
with postfix stack-oriented programming notations should not be any
vast surprise.

The storage models for the respective languages are _totally_
different, where Postscript appears to be pretty much "heap-based," so
that you can readily redefine functions and sling objects in and out
of memory without really worrying about where they come and go.  Which
is rather like garbage-collected Lisp.

In contrast, Forth's memory model is lots more 'anal retentive' about
the programmer getting into the precise details of where data is
stored than C is, with malloc().  (If you're using BLOCKS as VM, you
really _need_ to be that anal retentive. :-))

> Many other interpreted languages' underlying macro languages were
> based on Forth (including the Java Runtime Environment [JRE], I've
> heard.). However, human programmers, with only small numbers of
> exceptions, did not like reverse polish notation. They could not
> understand it. So the language never became popular.

JRE is _much_ more akin to the P-code of the old UCSD Pascal systems;
it likely also more meaningfully draws from the Smalltalk
environments.

> But, AFAIK the lanugage has not died. I've heard that OO extensions
> were added and some still think it is the greatest language ever
> devised. It had (has?), I've heard, a popular following in real-time
> games and other specialized (particularly real-time) domains (My
> brother -- a special effects artist with a Grammy to this name told
> me it has or had a popular following in motion picture special
> effects.) I don't know for sure. But it certainly is a very
> interesting language even if outdated.

You might look up OpenBoot/OpenForth, which I think is an IEEE
standard; it is a boot language system used on a number of non-PC
platforms.

Various implementors have headed in various directions.  The Forth
world more resembles the Lisp world _before_ Common Lisp was
standardized, with quite a number of very different kinds of
implementations.

There's an ANSI standard, much smaller than CL's, which has
probably been less influential than CL.

> One of the most mind-expanding books on software development I ever
> read was "Thinking in Forth" by I cannot remember who. This was a
> really great book on modular, iterative programming concepts. I
> learned more from this book than many others I've read -- inspite of
> the fact that I never did much with Forth. It was better, IMO, than
> Beck's XP book -- and more than 15 years earlier. It was a truly
> visionary work!

Leo Brodie was the author of _Thinking Forth_, which is, indeed, a great
book.

The great insight he pointed out, one that applies just as _well_ today, is
that it is tremendously important to coin good names for functions.  If you do
that, it makes it much easier to figure out the "vocabulary" of the
internals of your application, and makes it much easier to string them
together.

> But it was NOT Bruce Eckel's work -- he's never wrote anything I
> know of that is even remotely similar, despite the naming
> similarities (I have no idea if he had any awarness of the book I
> descibed above -- certainly nothing in any of his books that *I*
> have read indicate any such awareness.)

I don't think so either.

> The "Thinking in Forth" book I read is more closely related to "Zen
> and the Art of Motorcycle Maintenence" except it was (somewhat --
> but not very) specific to the Forth programming language. (And, for
> those who do not know, neither book had anything to do with
> motorcycles nor Java nor C++, but everything to do with good
> software development practices... :)

I'll buy that.
-- 
(reverse (concatenate 'string ··········@" "enworbbc"))
http://cbbrowne.com/info/forth.html
There are  three kinds of people:  those who can count,  and those who
can't.
From: Rahul Jain
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <87it2cxgnq.fsf@localhost.localdomain>
Christopher Browne <········@acm.org> writes:

> No, the _BIG_ problem with Forth was the BLOCKS controversy.  
[...]
> On the other hand, modern OSes have this newfangled thing called
> _FILES_, which tend to be much more efficient at storing programs
> (because whitespace can simply evaporate away), and people expect
> well-behaved applications to be able to interoperate with other
> applications _at the OS level_, so that for Forth to pretend that it
> _is_ the OS is totally unfriendly.

Now _that_ is something that Lisp and Forth really have in common. :)

Damn unix and its impotent filesystems!

> > Many other interpreted languages' underlying macro languages were
> > based on Forth (including the Java Runtime Environment [JRE], I've
> > heard.).
[...]
> JRE is _much_ more akin to the P-code of the old UCSD Pascal systems;
> it likely also more meaningfully draws from the Smalltalk
> environments.

One thing to note is that this "locals" stack is actually part of the
stack frame, and gets unwound with the function call, IIUC.

> > But, AFAIK the lanugage has not died. I've heard that OO extensions
> > were added and some still think it is the greatest language ever
> > devised. It had (has?), I've heard, a popular following in real-time
> > games and other specialized (particularly real-time) domains (My
> > brother -- a special effects artist with a Grammy to this name told
> > me it has or had a popular following in motion picture special
> > effects.) I don't know for sure. But it certainly is a very
> > interesting language even if outdated.
[...]
> Various implementors have headed in various directions.  The Forth
> world more resembles the Lisp world _before_ Common Lisp was
> standardized, with quite a number of very different kinds of
> implementations.
> 
> There's an ANSI standard which is much smaller than CL which has
> probably been less influential than CL.

One could make the argument that Forth systems are so disparate in
their purposes and environments that having significant standardization
among them won't buy them much. To me, Forth is more of a community of
people who like a specific kind of execution model. Not being too
knowledgeable about Forth, I'll let others comment on this conjecture.

-- 
-> -/                        - Rahul Jain -                        \- <-
-> -\  http://linux.rice.edu/~rahul -=-  ············@techie.com   /- <-
-> -X "Structure is nothing if it is all you got. Skeletons spook  X- <-
-> -/  people if [they] try to walk around on their own. I really  \- <-
-> -\  wonder why XML does not." -- Erik Naggum, comp.lang.lisp    /- <-
|--|--------|--------------|----|-------------|------|---------|-----|-|
   (c)1996-2002, All rights reserved. Disclaimer available upon request.
From: Elizabeth D. Rather
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D5BEA0B.377168F8@forth.com>
Rahul Jain wrote:
> ...
> > > But, AFAIK the lanugage has not died. I've heard that OO extensions
> > > were added and some still think it is the greatest language ever
> > > devised. It had (has?), I've heard, a popular following in real-time
> > > games and other specialized (particularly real-time) domains (My
> > > brother -- a special effects artist with a Grammy to this name told
> > > me it has or had a popular following in motion picture special
> > > effects.) I don't know for sure. But it certainly is a very
> > > interesting language even if outdated.

Forth was originally designed for embedded and real-time systems, and
is still widely used there.  Examples range from the Open Firmware
(IEEE 1275) used in Sun, Apple, and IBM workstations & servers to the
firmware in FedEx's handheld package tracking devices, space shuttle
and satellite instrumentation, etc.

Forth has evolved a lot since the early 80's, when a lot of people 
first met it, and versions are available for most contemporary 
platforms (including cross-compilers for most popular
microcontrollers).  
I'm not sure what you mean by "outdated", but it's hard to imagine
Forth being any more "outdated" than C, which was developed about
the same time.

> > Various implementors have headed in various directions.  The Forth
> > world more resembles the Lisp world _before_ Common Lisp was
> > standardized, with quite a number of very different kinds of
> > implementations.
> >
> > There's an ANSI standard which is much smaller than CL which has
> > probably been less influential than CL.
> 
> One could make the argument that Forth systems are so disparate in
> their purposes and environments that having signficant standardization
> among them won't buy them much. To me, Forth is more of a community of
> people who like a specific kind of execution model. Not being too
> knowledgable about Forth, I'll let others comment on this conjecture.

ANS Forth has actually been widely influential.  All the commercial
vendors and most public-domain & open-source versions are compliant.
A proposed addendum to the standard for cross-compilers has been
published.  The main divergence is in extensions that adapt to
particular platforms or application domains.  For example, most
Windows-based Forths have OOP extensions, whereas versions for
embedded systems offer efficient multitaskers, fixed-point fraction
math packages (for MPUs w/o hardware floating point), etc.

Cheers,
Elizabeth

-- 
==================================================
Elizabeth D. Rather   (US & Canada)   800-55-FORTH
FORTH Inc.                         +1 310-491-3356
5155 W. Rosecrans Ave. #1018  Fax: +1 310-978-9454
Hawthorne, CA 90250
http://www.forth.com

"Forth-based products and Services for real-time
applications since 1973."
==================================================
From: Stan Barr
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <slrnalnqb6.1j7.stanb@citadel.metropolis.local>
On 15 Aug 2002 11:53:13 -0400, Rahul Jain <·····@rice.edu> wrote:
>Christopher Browne <········@acm.org> writes:
>
>> No, the _BIG_ problem with Forth was the BLOCKS controversy.  
>[...]
>> On the other hand, modern OSes have this newfangled thing called
>> _FILES_, which tend to be much more efficient at storing programs
>> (because whitespace can simply evaporate away), and people expect
>> well-behaved applications to be able to interoperate with other
>> applications _at the OS level_, so that for Forth to pretend that it
>> _is_ the OS is totally unfriendly.
>
>Now _that_ is something that Lisp and Forth really have in common. :)

I liked this comment I came across yesterday:

"Operating System: An operating system is a collection of things that
don't fit into a language.  There shouldn't be one."

Daniel H. H. Ingalls on the design principles behind Smalltalk.

-- 
Cheers,
Stan Barr  ·····@dial.pipex.com

The future was never like this!
From: Bernd Paysan
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <orlqja.1i2.ln@miriam.mikron.de>
Bart Lateur wrote:
> Implementation ought to look like (for cards with similar possibilities)
> (warning: bad Ascii art attempt -- see with fixed ptch font):
> 
> 
> | access card A |    | access card B |    | access card C |
> |                    |                    |
> -------------------------------------------
> |
> |  Common access methods (API)
> |
> -------------------------------------
> |                 |                 |
> | language A |    | language B |    | language C |

But it's wrong. The API typically is defined for one language, e.g. C. This 
includes bindings (stdcall in Win32, C in Unix). The API often hides the 
implementation, e.g. xlib vs. the X protocol (which is something different, 
and wrapped by xlib into a C-conforming library). Xlib alone is 880k on my 
Linux installation, while Xvnc (a framebuffer implementation of the X 
server) is 1250k. In other words: The API alone, as defined in C, takes 
almost as much space as the server.

Note also that X is a middleware API, and that your GUI application should 
not directly use Xlib, but a widget library. A widget library can be 
written to work with several graphics APIs.

-- 
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/
From: Thomas Worthington
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <rdQ69.1548$U06.14384@newsfep3-gui.server.ntli.net>
On Thu, 15 Aug 2002 16:30:57 +0100, Christopher Browne wrote:

> "David Van Camp" <····@davidvancamp.com> wrote:
>> However, Forth's big problem was it's reliance on Reverse Polish
>> Notation (Lisp uses Polish Notation -- eg. not reversed), which is
>> nearly unreadable by humans.
> 
> No, the _BIG_ problem with Forth was the BLOCKS controversy.
> 
> On the one hand, BLOCKS are very cool, allowing you to REALLY easily
> program your own customized Virtual Memory system.  Doing everything
> with 1K blocks is really slick, especially on minscule hardware where
> Forth _is_ the operating system.
> 
> On the other hand, modern OSes have this newfangled thing called
> _FILES_, which tend to be much more efficient at storing programs
> (because whitespace can simply evaporate away), and people expect
> well-behaved applications to be able to interoperate with other
> applications _at the OS level_, so that for Forth to pretend that it
> _is_ the OS is totally unfriendly.
> 
> This "war" had a lot to do with the lack of advancement of Forth onto
> "modern" platforms in any big way.
> 

On the other hand, you just have to look at the code produced without
BLOCKS to see what a devastating effect source code in files has on style
and quality of code produced. The big problem is that blocks enforce
factoring and factoring produces good code. It's not impossible to factor
without blocks but, without being forced to factor, programmers naturally
slip into writing one long definition where 7 or 8 would be better.

In my opinion the only real argument about source code is what size blocks
it should be in. When trying to get control of some brain-dead library
call with 30 parameters in a nested structure, the 1024-byte block is too
small, but for most other uses it's too big.

TWW
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajgsr5$imc$2@pcls4.std.com>
"Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
·························@newsfep3-gui.server.ntli.net...
>
> On the other hand, you just have to look at the code produced without
> BLOCKS to see what a devastating effect source code in files has on style
> and quality of code produced. The big problem is that blocks enforce
> factoring and factoring produces good code. It's not impossible to factor
> without blocks but, without being forced to factor, programmers naturally
> slip into writing one long definition where 7 or 8 would be better.

    I don't buy it!  I don't see such a correlation at all.  Good
programmers write good code in any environment, while bad programmers write
bad code in any environment.  Some of the worst code I've ever seen was
written in blocks.

    Blocks make it much harder to organize code and to make major changes.
Blocks are one of the main reasons that Forth has earned a bad reputation
for unreadable code.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Albert van der Horst
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <H0xJGn.Mxu.1.spenarn@spenarnc.xs4all.nl>
In article <············@pcls4.std.com>,
Gary Chanson <········@no.spam.TheWorld.com> wrote:
>
>"Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
>·························@newsfep3-gui.server.ntli.net...
>>
>> On the other hand, you just have to look at the code produced without
>> BLOCKS to see what a devastating effect source code in files has on style
>> and quality of code produced. The big problem is that blocks enforce
>> factoring and factoring produces good code. It's not impossible to factor
>> without blocks but, without being forced to factor, programmers naturally
>> slip into writing one long definition where 7 or 8 would be better.
>
>    I don't buy it!  I don't see such a correlation at all.  Good
>programmers write good code in any environment, while bad programmers write
>bad code in any environment.  Some of the worst code I've ever seen was
>written in blocks.
>
>    Blocks make it much harder to organize code and to make major changes.
>Blocks are one of the main reason that Forth has earned a bad reputation due
>to unreadable code.

In ciforth I combine blocks and files.
A file contains an application.
A facility (like looping over strings) is contained in one,
at most a few blocks.
So blocks are the ultimate library tool for me.
Having tools in files is inferior, because you end up with
too many files, and the temptation to add things you don't
need is strong, especially for those 3-line facilities.

(On the other hand, I wouldn't like to do a large application
in blocks.)

In combination with REQUIRE (looks up a word in the blocks file)
blocks make it very easy to organise code.
(Of course my block file can be handled by standard Unix tools
as a text file.)
Having all facilities in a single file makes it very easy to
make major changes. E.g. the discovery that my (PARSE) should
be called PARSE requires editing a single file instead of
dozens or hundreds.

>-Gary Chanson (MVP for Windows SDK)
>-Software Consultant (Embedded systems and Real Time Controls)
>·········@mvps.org

Groetjes Albert
-- 
Albert van der Horst,Oranjestr 8,3511 RA UTRECHT,THE NETHERLANDS
To suffer is the prerogative of the strong. The weak -- perish.
······@spenarnc.xs4all.nl     http://home.hccnet.nl/a.w.m.van.der.horst
From: Thomas Worthington
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <sE479.152$HW3.36635@newsfep1-gui.server.ntli.net>
On Thu, 15 Aug 2002 18:25:03 +0100, Gary Chanson wrote:

> "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in
> message ·························@newsfep3-gui.server.ntli.net...
>>
>> On the other hand, you just have to look at the code produced without
>> BLOCKS to see what a devastating effect source code in files has on
>> style and quality of code produced. The big problem is that blocks
>> enforce factoring and factoring produces good code. It's not impossible
>> to factor without blocks but, without being forced to factor,
>> programmers naturally slip into writing one long definition where 7 or
>> 8 would be better.
> 
>     I don't buy it!  I don't see such a correlation at all.  Good
> programmers write good code in any environment, while bad programmers
> write bad code in any environment.

All people are lazy, and there is some merit to the argument (Larry Wall,
I think, said it) that good programmers are lazy because they find the
easier way to do things. Text files pander to this by making it easy to just
type away without really thinking about where the good factorisations are.

>  Some of the worst code I've ever
> seen was written in blocks.

I'm sure it was. Bad coders are amazingly good at generating bad code
wherever they are.

>     Blocks make it much harder to organize code and to make major
>     changes.

Changes can be bothersome but organisation is easier.

> Blocks are one of the main reason that Forth has earned a bad reputation
> due to unreadable code.

No. Forth has earned that reputation through a combination of the terse
names of its most frequently used primitives, its lack of any clear
relationship to Algol (which most other popular languages have), and poor
block editors.

While I was at university I used a system where blocks were passed to TeX
along with their shadow blocks and presented in a clearer fashion than the
C++ programs that other people used to write their assignments. Blocks are
a partial step towards literate programming, and as such are a partial step
away from the C-style jumble of comments and code, the one interrupting
the other's flow.

TWW

>Abolish public schools

I assume you mean "make all schools as good as private schools"?
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajjgt6$p0f$1@pcls4.std.com>
"Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
························@newsfep1-gui.server.ntli.net...
>
> All people are lazy and there is some merit to the argument (Larry Wall I
> think said it) that good programmers are lazy because they find the easier
> way to do things. Text files pander to this by making it easy to just type
> away without really thinking about where the good factorisations are.

    Good programmers know that paying attention to factoring, formatting and
organization will save them time and effort in the long run.  It's the real
choice of the "lazy" programmer.

> Changes can be bothersome but organisation is easier.

    Organization is MUCH easier in text files.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajlmrb$kss$1@luna.vcn.bc.ca>
In article <···················@newsfep1-gui.server.ntli.net>, Thomas Worthington wrote:
> On Thu, 15 Aug 2002 18:25:03 +0100, Gary Chanson wrote:
>> "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in
>>     Blocks make it much harder to organize code and to make major
>>     changes.
> 
> Changes can be bothersome but organisation is easier.

But the 80 column punched cards---that's where the real action is!
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajm0um$gv8$1@pcls4.std.com>
"Kaz Kylheku" <···@ashi.footprints.net> wrote in message
·················@luna.vcn.bc.ca...
>
> But the 80 column punched cards---that's where the real action is!

    I used to know a Forth programmer who could easily get 8 definitions on
a single card!  He was REALLY GOOD at using up all of the white space in a
block!  He produced the most unreadable code I've ever seen in any language.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Thomas Worthington
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <NVI79.23885$IU4.699461@newsfep2-win.server.ntli.net>
On Sat, 17 Aug 2002 17:33:50 +0100, Gary Chanson wrote:


> "Kaz Kylheku" <···@ashi.footprints.net> wrote in message
> ·················@luna.vcn.bc.ca...
>>
>> But the 80 column punched cards---that's where the real action is!
> 
>     I used to know a Forth programmer who could easily get 8 definitions
>     on
> single card!  He was REALLY GOOD at using up all of the white space in a
> block!  He produced the most unreadable code I've ever seen in any
> language.

Yes, best to take one block per definition and try to fill up the white
space on the shadow block with commentary.

TWW
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajnp71$i9l$1@pcls4.std.com>
"Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
···························@newsfep2-win.server.ntli.net...
>
> Yes, best to take one block per definition and try to fill up the white
> space on the shadow block with commentary.

    No.  It's best to dump blocks and use text files.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Julian V. Noble
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D6C37BB.20EBC57F@virginia.edu>
Kaz Kylheku wrote:
> 
> In article <···················@newsfep1-gui.server.ntli.net>, Thomas Worthington wrote:
> > On Thu, 15 Aug 2002 18:25:03 +0100, Gary Chanson wrote:
> >> "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in
> >>     Blocks make it much harder to organize code and to make major
> >>     changes.
> >
> > Changes can be bothersome but organisation is easier.
> 
> But the 80 column punched cards---that's where the real action is!

No, no. Punched paper tape--I still have some programs in that format.
Unfortunately the last optical tape reader around here passed away in
the 1970's. So I have no idea what they are :-(

-- 
Julian V. Noble
Professor of Physics
···@virginia.edu

   "Science knows only one commandment: contribute to science."
   -- Bertolt Brecht, "Galileo".
From: Jerry Avins
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D70151A.6772619E@ieee.org>
"Julian V. Noble" wrote:
> 
> Kaz Kylheku wrote:
> >
> > In article <···················@newsfep1-gui.server.ntli.net>, Thomas Worthington wrote:
> > > On Thu, 15 Aug 2002 18:25:03 +0100, Gary Chanson wrote:
> > >> "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in
> > >>     Blocks make it much harder to organize code and to make major
> > >>     changes.
> > >
> > > Changes can be bothersome but organisation is easier.
> >
> > But the 80 column punched cards---that's where the real action is!
> 
> No, no. Punched paper tape--I still have some programs in that format.
> Unfortunately the last optical tape reader around here passed away in
> the 1970's. So I have no idea what they are :-(
> 
> --
> Julian V. Noble
> Professor of Physics
> ···@virginia.edu
> 
>    "Science knows only one commandment: contribute to science."
>    -- Bertolt Brecht, "Galileo".

I have an OAE paper tape reader. Someone asked for it months after I
last mentioned it in a newsgroup, but I couldn't find it then. I have
since found it, but lost the request.

Jerry
-- 
Engineering is the art of making what you want from things you can get.
From: Stig Hemmer
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ekvwuq725u9.fsf@gnoll.pvv.ntnu.no>
"Julian V. Noble" <···@virginia.edu> writes:
> No, no. Punched paper tape--I still have some programs in that format.
> Unfortunately the last optical tape reader around here passed away in
> the 1970's. So I have no idea what they are :-(

You mean you can't read paper tape yourself?

Stig Hemmer,
Jack of a Few Trades.
From: Marcel Hendrix
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <8l5c9.217$hf3.1547@typhoon.bart.nl>
(#30164) Stig Hemmer <····@pvv.ntnu.no> wrote Re: Lisp's unique feature: compiler available at run-time

> "Julian V. Noble" <···@virginia.edu> writes:
>> No, no. Punched paper tape--I still have some programs in that format.
>> Unfortunately the last optical tape reader around here passed away in
>> the 1970's. So I have no idea what they are :-(

> You mean you can't read paper tape yourself?


Of course we can...

-marcel

-- ----------------------------------------------------------------------
    0 VALUE fhandle
FALSE VALUE file=open?
FALSE VALUE eof?
    0 VALUE #lines

: READ-LN	0 #TIB !  0 >IN !
		FALSE TO eof?
		1 +TO #lines
		TIB #128 fhandle READ-LINE ABORT" read error"
		0= DUP TO eof? IF DROP EXIT ENDIF
		#TIB ! ;

: SKIP		1 >IN +! ;

: +CHAR		TIB >IN @ + ·@  SKIP ; 		\ <> --- <char>

: -VALID?	+CHAR '|' <> ; 			\ <> --- <boolean>

: ···@		+CHAR 'o' = 1 AND ;		\ <> --- <0|1>

: ····@+	FOR  AFT			\ <accu> <n> --- <accu'>
		       1 LSHIFT  ···@ OR 
		    THEN
		NEXT ;

: GET-BYTE	READ-LN				\ <> --- <byte>
		-VALID? IF ^J EXIT ENDIF	\ skip "|"
		0  5 ····@+  SKIP  
		3 ····@+ ;

: WRITE-CR	CR ;

: .BYTE		DUP ^J =			\ <byte> --- <>
		   IF DROP CR EXIT		\ convert Unix <lf> to <cr><lf>
		ENDIF EMIT ;


	-- A little shell to test it.

: OpenPaperTape					\ <> --- <>
		S" papertap.dat" R/O OPEN-FILE	\ redirect i/o to file
		 ABORT" can't open" TO fhandle
		 TRUE TO file=open?
		0 TO #lines
		WRITE-CR ;

: ClosePaperTape 				\ <> --- <>
		file=open? IF fhandle CLOSE-FILE ABORT" can't close"
			      FALSE TO file=open?
			ENDIF
		CR #lines DEC. ." lines read." 
		0 >IN !  0 #TIB ! ;		\ Clear TIB !


: (GET-TAPE)	OpenPaperTape
		BEGIN
		 GET-BYTE .BYTE
		 WAIT? eof? OR
		UNTIL
		ClosePaperTape ;


: GET-TAPE	['] (GET-TAPE) CATCH
		file=open? IF fhandle CLOSE-FILE DROP
			      FALSE TO file=open?
			ENDIF
		THROW ;	


: .ABOUT	CR ." Enter:  GET-TAPE  to decode the papertape in PAPERTAP.DAT" ;

-- papertap.dat ------------------------------------------------------------------
___________
| o  o.   |
| oo o.ooo|
| ooo .ooo|
|  o  .   |
| oo o.o o|
| oo  .  o|
| oo o.oo |
| oooo.  o|
|  o  .   |
| ooo .   |
| ooo . o |
| oo o.ooo|
| oo  .ooo|
| ooo . o |
| oo  .  o|
| oo o.o o|
| oo o.o o|
| oo  .o o|
| ooo . o |
| ooo . oo|
|  o  .   |
| oo  .o  |
| oo o.ooo|
| oo  .o o|
| ooo . oo|
|  o  .   |
| oo o.  o|
| ooo .o  |
|  o  .   |
| ooo .o  |
| oo  .  o|
| oo o. oo|
| oo  .o o|
|  o  .   |
| ooo .o  |
| oo o.ooo|
|  o  .   |
| oo  . oo|
| oo o.   |
| oo  .  o|
| oo o.oo |
| oo  .ooo|
| oo  .o o|
|  o  .   |
| oo  .  o|
|  o  .   |
| oo o.o  |
| oo o.  o|
| oo  .ooo|
| oo o.   |
| ooo .o  |
|  o  .   |
| oo  . o |
| ooo .o o|
| oo o.o  |
| oo  . o |
|  ooo.ooo|
|  ooo.ooo|
|    o. o |
|    o. o |
| o  o.oo |
| oo o.ooo|
| oo o.oo |
| oo  .o o|
|  o o.o  |
|  o  .   |
| o  o.  o|
| ooo .o  |
|  o  .ooo|
| ooo . oo|
|  o  .   |
| oo  .  o|
|  o  .   |
| oo o.   |
| oo  .  o|
| ooo . o |
| oo  .o  |
| ooo .ooo|
| oo  .  o|
| ooo . o |
| oo  .o o|
|  o  .   |
| ooo .   |
| ooo . o |
| oo o.ooo|
| oo  . o |
| oo o.o  |
| oo  .o o|
| oo o.o o|
|  o o.oo |
|    o. o |
___________
From: ···@redhat.invalid
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <Lq979.6$hZ1.143@news13-win.server.ntlworld.com>
In comp.lang.forth Gary Chanson <········@no.spam.theworld.com> wrote:

> "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
> ·························@newsfep3-gui.server.ntli.net...
>>
>> On the other hand, you just have to look at the code produced without
>> BLOCKS to see what a devastating effect source code in files has on style
>> and quality of code produced. The big problem is that blocks enforce
>> factoring and factoring produces good code. It's not impossible to factor
>> without blocks but, without being forced to factor, programmers naturally
>> slip into writing one long definition where 7 or 8 would be better.

>     I don't buy it!  I don't see such a correlation at all.  Good
> programmers write good code in any environment, while bad programmers write
> bad code in any environment.

Surely you're not denying that the quality of the environment
substantially influences the quality of the result, are you?

Andrew.
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajjgt7$p0f$2@pcls4.std.com>
<···@redhat.invalid> wrote in message
····················@news13-win.server.ntlworld.com...
>
> >     I don't buy it!  I don't see such a correlation at all.  Good
> > programmers write good code in any environment, while bad programmers
write
> > bad code in any environment.
>
> Surely you're not denying that the quality of the environment
> substantially influences the quality of the result, are you?

    I am.  The "quality of the environment" might affect how much effort it
takes, and at some point the environment might become effectively unusable.
The quality of the result is still in the hands of the programmer.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: ···@redhat.invalid
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <j7c79.7$hZ1.135@news13-win.server.ntlworld.com>
In comp.lang.forth Gary Chanson <········@no.spam.theworld.com> wrote:

> <···@redhat.invalid> wrote in message
> ····················@news13-win.server.ntlworld.com...
>>
>> >     I don't buy it!  I don't see such a correlation at all.  Good
>> > programmers write good code in any environment, while bad programmers
> write
>> > bad code in any environment.
>>
>> Surely you're not denying that the quality of the environment
>> substantially influences the quality of the result, are you?

>     I am.  The "quality of the environment" might effect how much effort it
> might take, and at some point might become effectively unusable.  The
> quality of the result is still in the hands of the programmer.

Well, we've disagreed before but this time I am simply amazed at this
disagreement.  In my experience the quality of the programming
environment (editor, programming language, operating system, compiler,
and so on) has a great influence on the quality of the program that is
produced.

Dijkstra -- as usual -- put it more clearly than I ever could when he
said that the tools we use have a profound (and devious!) influence on
our thinking habits and, therefore, on our thinking abilities.

Andrew.
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajk66v$s3g$1@pcls4.std.com>
<···@redhat.invalid> wrote in message
····················@news13-win.server.ntlworld.com...
>
> >     I am.  The "quality of the environment" might effect how much effort
it
> > might take, and at some point might become effectively unusable.  The
> > quality of the result is still in the hands of the programmer.
>
> Well, we've disagreed before but this time I am simply amazed at this
> disagreement.  In my experience the quality of the programming
> environment (editor, programming language, operating system, compiler,
> and so on) has a great influence on the quality of the program that is
> produced.
>
> Dijkstra -- as usual -- put it more clearly than I ever could when he
> said that the tools we use have a profound (and devious!) influence on
> our thinking habits and, therefore, on our thinking abilities.

    I don't think we're talking about the same thing.  I'm probably
interpreting "quality of the environment" more narrowly than you are.

    For instance, if we're talking about text editors, I can produce the
same results with Notepad as I can with my editor, albeit with more
effort.  I cannot produce the same results with a block editor.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Jerry Avins
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D5D3EB2.CC42DC8A@ieee.org>
···@redhat.invalid wrote:
> 
> In comp.lang.forth Gary Chanson <········@no.spam.theworld.com> wrote:
> 
> > "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
> > ·························@newsfep3-gui.server.ntli.net...
> >>
> >> On the other hand, you just have to look at the code produced without
> >> BLOCKS to see what a devastating effect source code in files has on style
> >> and quality of code produced. The big problem is that blocks enforce
> >> factoring and factoring produces good code. It's not impossible to factor
> >> without blocks but, without being forced to factor, programmers naturally
> >> slip into writing one long definition where 7 or 8 would be better.
> 
> >     I don't buy it!  I don't see such a correlation at all.  Good
> > programmers write good code in any environment, while bad programmers write
> > bad code in any environment.
> 
> Surely you're not denying that the quality of the environment
> substantially influences the quality of the result, are you?
> 
> Andrew.

I don't deny the salutary effect of an environment that consists in part
of people one respects exclaiming "Bullshit!" when appropriate.

Jerry
-- 
Engineering is the art of making what you want from things you can get.
From: ···@redhat.invalid
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <mbc79.8$hZ1.107@news13-win.server.ntlworld.com>
In comp.lang.forth Jerry Avins <···@ieee.org> wrote:
>
> I don't deny the salutary effect on an environment that consists in part
> of people one respects exclaiming "Bullshit!" when appropriate.

A long time ago when I was a very green programmer, I had the dubious
pleasure of watching Elizabeth going through my code, a line at a
time, making comments in pencil in the margins.  As you might imagine,
many of the comments were, ah, less than complimentary.  I don't know
if she remembers the occasion, but I sure do...

Andrew.
From: Julian V. Noble
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D6C380E.593C00C4@virginia.edu>
Jerry Avins wrote:
> 
> ···@redhat.invalid wrote:
> >
> > In comp.lang.forth Gary Chanson <········@no.spam.theworld.com> wrote:
> >
> > > "Thomas Worthington" <···@theBitBeforeTheAtSignAgain.cx> wrote in message
> > > ·························@newsfep3-gui.server.ntli.net...
> > >>
> > >> On the other hand, you just have to look at the code produced without
> > >> BLOCKS to see what a devastating effect source code in files has on style
> > >> and quality of code produced. The big problem is that blocks enforce
> > >> factoring and factoring produces good code. It's not impossible to factor
> > >> without blocks but, without being forced to factor, programmers naturally
> > >> slip into writing one long definition where 7 or 8 would be better.
> >
> > >     I don't buy it!  I don't see such a correlation at all.  Good
> > > programmers write good code in any environment, while bad programmers write
> > > bad code in any environment.
> >
> > Surely you're not denying that the quality of the environment
> > substantially influences the quality of the result, are you?
> >
> > Andrew.
> 
> I don't deny the salutary effect on an environment that consists in part
> of people one respects exclaiming "Bullshit!" when appropriate.
> 
> Jerry
> --
> Engineering is the art of making what you want from things you can get.

Now that's what I call an _environment_ !!
-- 
Julian V. Noble
Professor of Physics
···@virginia.edu

   "Science knows only one commandment: contribute to science."
   -- Bertolt Brecht, "Galileo".
From: Jeff Fox
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D5B6B67.10906@ultratechnology.com>
Thomas Worthington wrote:

> On the other hand, you just have to look at the code produced without
> BLOCKS to see what a devastating effect source code in files has on style
> and quality of code produced. The big problem is that blocks enforce
> factoring and factoring produces good code. It's not impossible to factor
> without blocks but, without being forced to factor, programmers naturally
> slip into writing one long definition where 7 or 8 would be better.


My problem with people's rejection of blocks is not with the use of
files rather than blocks for source, but rather with the rejection of blocks
for data within programs.  Unfortunately those who were raised
thinking that everything should be a file are not able to consider the
alternatives.

best wishes,
Jeff Fox
From: Gary Chanson
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajhbfu$ekq$1@pcls4.std.com>
"Jeff Fox" <···@ultratechnology.com> wrote in message
···················@ultratechnology.com...
>
> My problem with people's rejection of blocks in not with they use of
> files rather than blocks for source, but rather the rejection of blocks
> for use for data within programs.  Unfortunately those who were raised
> thinking that everything should be a file are not able to consider the
> alternatives.

    It makes no sense to confine a program to a fixed block size unless it's
a natural attribute of the program.  There are a few problems which do fit
blocks, but they are by far the exception.

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: jmdrake
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <e20a4a47.0208160832.26626fa@posting.google.com>
"Gary Chanson" <········@no.spam.TheWorld.com> wrote in message news:<············@pcls4.std.com>...
> "Jeff Fox" <···@ultratechnology.com> wrote in message
> ···················@ultratechnology.com...
> >
> > My problem with people's rejection of blocks in not with they use of
> > files rather than blocks for source, but rather the rejection of blocks
> > for use for data within programs.  Unfortunately those who were raised
> > thinking that everything should be a file are not able to consider the
> > alternatives.
> 
>     It makes no sense to confine a program to a fixed block size unless it's
> a natural attribute of the program.  There are a few problems which do fit
> blocks, but they are by far the exception.

I disagree.  Many small database problems map nicely to block VM schemes.
Besides, most programs use some "fixed block" memory scheme even if
this is hidden from the user.  A "paragraph" of memory allocated
by DOS is 16 bytes.  A "page" of memory in MS SQL Server is always
8K bytes.  (Oracle lets you choose the block size.)  These things
are "hidden" from the programmer and only come into play when
doing "performance tuning".  In Forth with blocks it's not hidden,
unless you write some "abstraction layer" to hide it.  Besides, it's
not an either/or problem.  As far back as L&P's Forth 83 you had your
choice of both in the same system.
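
To make that hidden arithmetic concrete, here is a minimal sketch (the word
names are made up for illustration, not from any particular system) of the
DOS-style rounding: every request silently costs a whole number of 16-byte
paragraphs, and the caller never sees it.

16 CONSTANT /PARAGRAPH
: >PARAGRAPHS  ( bytes -- paragraphs )   /PARAGRAPH 1- +  /PARAGRAPH / ;

100 >PARAGRAPHS .   \ prints 7 : a 100 byte request takes 7 * 16 = 112 bytes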

Regards,

John M. Drake
From: Christopher Browne
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajjdit$1b8hor$4@ID-125932.news.dfncis.de>
Quoth ··········@yahoo.com (jmdrake):
> "Gary Chanson" <········@no.spam.TheWorld.com> wrote in message news:<············@pcls4.std.com>...
>> "Jeff Fox" <···@ultratechnology.com> wrote in message
>> ···················@ultratechnology.com...
>> >
>> > My problem with people's rejection of blocks in not with they use
>> > of files rather than blocks for source, but rather the rejection
>> > of blocks for use for data within programs.  Unfortunately those
>> > who were raised thinking that everything should be a file are not
>> > able to consider the alternatives.
>> 
>> It makes no sense to confine a program to a fixed block size unless
>> it's a natural attribute of the program.  There are a few problems
>> which do fit blocks, but they are by far the exception.
>
> I disagree.  Many small database problems map nicely to block VM
> schemes.  Besides, most programs use some "fixed block" memory
> scheme even if this is hidden from the users.  A "paragraph" of
> memory allocated by DOS is 16 bytes.  A "page" of memory in MS SQL
> server is always 8K bytes.  (Oracle allows you to choose blocksize).
> These things are "hidden" from the programmer and only come in to
> play when doing "performance tuning".  In Forth with blocks it's not
> hidden, unless you write some "abstraction layer" to hide it.
> Besides, it's not an either/or problem.  As far back as L&P's Forth
> 83 you had your choice of both in the same system.

I tend to agree.  It doesn't make sense for there to be a _huge_ set
of block size selections, if the block is visible to the programmer in
the first place.

The typical use in Forth of blocks for VM pretty much requires that
you know the size of the block, and it is _MUCH_ simpler to code
things based on some fixed size.

If you really desire to get _really_ sophisticated, in a manner that
parallels the way some DBMSes and filesystems offer variable block
sizes, then what you're going to do is to layer a "size-agnostic"
abstraction on top.  And you'll _still_ have to have some code that is
_very_ anal about knowing block sizes that implements the abstraction.
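
For instance, a minimal sketch of such a layer (the names are mine, assuming
only the standard BLOCK wordset): a byte-addressed store on top of 1K blocks,
where these two definitions are the only place that knows the block size.

1024 CONSTANT B/BUF                  \ the one place that knows the size
: VC@  ( vaddr -- char )   B/BUF /MOD BLOCK + C@ ;
: VC!  ( char vaddr -- )   B/BUF /MOD BLOCK + C!  UPDATE ;

Everything layered above VC@ and VC! can pretend the store is one flat
array of bytes; everything below them is exactly that size-obsessed code.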

In an RDBMS or filesystem, that abstraction is normally totally hidden,
just as the _precise_ details of what a C malloc() or a typical Lisp
memory allocator does are generally pretty opaque to the usual user.

But the _typical_ Forth usage of blocks for VM involves staying pretty
close to the metal, having very little in the way of "potentially
expensive opaque abstractions" in the way (in much the manner that
people perceive garbage collection as being a "potentially expensive
opaque abstraction"), and once people have gotten accustomed to 1K
blocks, a lot of useful idioms fall out of that, and people get
accustomed to those idioms.

The _real_ controversy about blocks has little to do with this, but
rather points to what you do about having _programs_ in blocks, and
the impact of doing that.  And the dogma evidently hasn't changed much
since the writing of _Thinking Forth_, the classic one still being
that "Blocks force you to write good code by forcing functions to fit
into 16 lines and 64 columns."  Unfortunately, the belief that this
will result in bad programmers writing good code is as false as it was
back then, that being the flaw in the dogma.
-- 
(concatenate 'string "cbbrowne" ·@acm.org")
http://cbbrowne.com/info/oses.html
Rules  of  the Evil  Overlord  #97.  "My  dungeon  cells  will not  be
furnished with  objects that  contain reflective surfaces  or anything
that can be unravelled." <http://www.eviloverlord.com/>
From: Thomas Worthington
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <y5c79.1548$B46.92021@newsfep2-gui>
On Fri, 16 Aug 2002 18:42:54 +0100, Christopher Browne wrote:

> 
> The _real_ controversy about blocks has little to do with this, but
> rather points to what you do about having _programs_ in blocks, and the
> impact of doing that.  And the dogma evidently hasn't changed much since
> the writing of _Thinking Forth_, the classic one still being that
> "Blocks force you to write good code by forcing functions to fit into 16
> lines and 64 columns."  Unfortunately, the belief that this will result
> in bad programmers writing good code is as false as it was back then,
> that being the flaw in the dogma.

If that's what your dogma is, it isn't mine: I don't think blocks help poor
programmers, but they do prevent otherwise good programmers from drifting
into two-page definitions instead of short, clear, easily tested and
understood ones.

It takes more than an unusual editor to make good programmers out of bad
ones.

TWW
From: Julian V. Noble
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D6C371A.67A7F9C8@virginia.edu>
jmdrake wrote:
> 
> "Gary Chanson" <········@no.spam.TheWorld.com> wrote in message news:<············@pcls4.std.com>...
> > "Jeff Fox" <···@ultratechnology.com> wrote in message
> > ···················@ultratechnology.com...
> > >
> > > My problem with people's rejection of blocks in not with they use of
> > > files rather than blocks for source, but rather the rejection of blocks
> > > for use for data within programs.  Unfortunately those who were raised
> > > thinking that everything should be a file are not able to consider the
> > > alternatives.

********************************************************************************
> >     It makes no sense to confine a program to a fixed block size unless it's
> > a natural attribute of the program.  There are a few problems which do fit
> > blocks, but they are by far the exception.
********************************************************************************


> I disagree.  Many small database problems map nicely to block VM schemes.
> Besides, most programs use some "fixed block" memory scheme even if
> this is hidden from the users.  A "paragraph" of memory allocated
> by DOS is 16 bytes.  A "page" of memory in MS SQL server is always
> 8K bytes.  (Oracle allows you to choose blocksize).  These things
> are "hidden" from the programmer and only come in to play when
> doing "performance tuning".  In Forth with blocks it's not hidden,
> unless you write some "abstraction layer" to hide it.  Besides, it's
> not an either/or problem.  As far back as L&P's Forth 83 you had your
> choice of both in the same system.
> 
> Regards,
> 
> John M. Drake

	I think you have concatenated two different postings. I don't
	think Jeff Fox said the stuff between the lines of asterisks.
	I think he is pro-blocks, for some purposes, and IOW agrees
	with you. I also agree with you.


-- 
Julian V. Noble
Professor of Physics
···@virginia.edu

   "Science knows only one commandment: contribute to science."
   -- Bertolt Brecht, "Galileo".
From: Anton Ertl
Subject: Blocks (was: Lisp's unique feature: compiler available at run-time)
Date: 
Message-ID: <2002Aug17.103203@a0.complang.tuwien.ac.at>
Thomas Worthington <···@theBitBeforeTheAtSignAgain.cx> writes:
>On the other hand, you just have to look at the code produced without
>BLOCKS to see what a devastating effect source code in files has on style
>and quality of code produced.  The big problem is that blocks enforce
>factoring and factoring produces good code.

[Note: screens are the traditional term for blocks used for source code]

I have looked at
<http://www.complang.tuwien.ac.at/forth/objects/objects.fs> (I know
that this was developed without screens) and at
<http://home.iae.nl/users/mhx/lprof.frt> (looking at the formatting,
this does not seem to come from a screens editor, either).  In both
files I don't see any definition that would not fit into a screen
(with reformatting to make some of the longer lines fit).  What code
are you talking about?

If you orient yourself on screen sizes for factoring decisions, your
definitions are much too large.

And in any case, you do not need screens to get that kind of
disciplinary help; changing your Forth compiler to produce an error
when compiling a definition containing more than X words, or something
similar, is pretty easy.
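
For example, something along these lines (a rough sketch that limits
dictionary space per definition rather than literally counting words; it
assumes a system where : and ; can be redefined this way, as most ANS
Forths allow):

VARIABLE DEF-START
512 CONSTANT MAX-DEF            \ arbitrary limit, in address units

: :   HERE DEF-START !  : ;     \ note where each new definition starts

: ;   POSTPONE ;                \ let the normal ; finish the definition
   HERE DEF-START @ -  MAX-DEF >
   ABORT" Definition too long, factor it!" ; IMMEDIATE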

Followups set to clf.

- anton
-- 
M. Anton Ertl  http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
EuroForth 2002: http://www.complang.tuwien.ac.at/anton/euroforth2002/
From: Jeff Fox
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <3D5B6B3E.2000605@ultratechnology.com>
Christopher Browne wrote:

> 
> No, the _BIG_ problem with Forth was the BLOCKS controversy.  


You're right about all the symptoms.  My most recent essay on Forth
was on this subject: http://www.ultratechnology.com/essence.htm

best wishes,
Jeff Fox
From: Howerd Oakford
Subject: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <bqp99.1020$997.53304@newsfep1-win.server.ntli.net>
Hi Jeff,

Nice essay!

I would take the arguments you make one step further - that the use of files
is just one example of the total reliance on the "Operating System" as
axiomatic and indispensable. Another is the use of memory management heaps.
The idea of the "OS" is to solve all potential problems now, so that
everything built on top of the OS becomes simpler and more uniform.
Unfortunately, this has proven not to be the case, at least IMHO. I am very
aware of the fact that the whole of the modern software industry is based on
the "OS", and that even the suggestion that there may be alternatives is
either derided or taken as heresy. So, yes I am being a little defensive
here....

In my paper http://www.inventio.co.uk/multlang.htm I describe a one-line
database, based on BLOCK :

: >FILE ( n - a)   8 /MOD 2100 + BLOCK SWAP 128 * + ;

and also how BLOCK based records can be exported very simply to a file-based
text editor.
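
Reading it word for word (the record number 42 below is only an example):
8 /MOD splits record number n into an offset-within-block and a block index,
2100 + turns that index into an absolute block number, BLOCK gives the
address of that 1K buffer, and SWAP 128 * + steps to the record -- eight
128-byte records per block. So, for instance:

42 >FILE 128 TYPE   \ show record 42 as text: block 2105, offset 2 * 128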

I think this illustrates both the advantages and problems of not using files
:
The advantage is simplicity.
The problems are to do with multiple programs requiring disk space.

The number 2100 refers to the start block of a file used to store the data
( in this case running under DOS ).
Without a central Disk Operating System, certain ranges of blocks must be
explicitly allocated for this program's data.
If another program requires BLOCK storage, or this program requires more,
the 2100 may have to be changed.
This makes a coupling between programs, all of which are fighting over the
same disk space.
The conventional answer to this is to use a central Disk Operating System to
add a level of indirection, and a directory to label sections of the disk as
files. Unix takes this to the extreme, where everything is a file.

The trouble, and I think this is the point that you and Chuck are making,
is that the penalty for using a general purpose Disk Operating System is
considerably greater than most people realise, and that it is often possible
to program without one.

Getting rid of the Disk Operating System means that the applications must
resolve their block requirements amongst themselves. It also means that
general purpose programming, where programs in the form of files can be
sold, transferred and run on computers running Disk Operating Systems, is not
going to happen in the conventional way.

So what are the alternatives?
1) ColorForth uses linear memory divided into blocks, a contiguous selection
of which can be copied to a floppy disk.

2) I tried using the first 64 bytes of each block as fields to allow
"indirect blocks" - rather like the Forth dictionary.
Blocks used by a given program are labelled as such, so that other programs
can ignore them. You lose the simple 1-to-1 mapping between block number
and physical sector, but it does allow each program to have as many blocks
as it likes without referring to a central Disk Operating System. ( This
ties in nicely with my version control scheme too ).
Is this a Disk Operating System? Possibly, but it lacks a central directory.
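
For the curious, the bare bones of the idea ( the field layout and names
here are only illustrative, not my actual scheme ): the first cell of each
block holds an owner tag, zero meaning free, and the payload starts after
a 64-byte header.

64 CONSTANT /HEADER
: OWNER@      ( blk -- tag )        BLOCK @ ;
: CLAIM       ( tag blk -- )        BLOCK !  UPDATE ;
: PAYLOAD     ( blk -- addr len )   BLOCK /HEADER +  1024 /HEADER - ;
: FREE-BLOCK  ( first last -- blk | -1 )   \ scan a range for an unowned block
   1+ SWAP ?DO  I OWNER@ 0= IF  I UNLOOP EXIT  THEN  LOOP  -1 ;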

Any other ideas?

Regards

Howerd   8^)





"Jeff Fox" <···@ultratechnology.com> wrote in message
·····················@ultratechnology.com...
> Christopher Browne wrote:
>
> >
> > No, the _BIG_ problem with Forth was the BLOCKS controversy.
>
>
> Your right about all they symptoms.  My most recent essay on Forth
> was on this subject, http://www.ultratechnology.com/essence.htm
>
> best wishes,
> Jeff Fox
>
From: John Passaniti
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <umctcqtefdac84@corp.supernews.com>
"Howerd Oakford" <··············@ntlworld.com> wrote in message
·························@newsfep1-win.server.ntli.net...
> I would take the arguments you make one step further -
> that the use of files is just one example of the total
> reliance on the "Operating System" as axiomatic and
> indispensible.

Maybe in a general sense there is some merit to this view.  But again in the
embedded realm it doesn't hold.  There is no "total reliance" on operating
systems for embedded systems programmers targeting systems that don't have
operating systems.

I've asked Jeff this pointed question but he refuses to answer it.  Maybe
you will.  In situations where the development environment is different from
the target environment, how does the use of operating systems on the
development side impact the target side?

I'll use a real-world example-- the project I'm now working on.  The target
is a Motorola HC08 processor.  It has about 32k of flash, around 512 bytes
of RAM.  There is no operating system on the target.  The development
envrionment is Cygwin under Windows XP and a cross-compiler for the HC08.
Please explain to me how the reliance on an operating system on the
development side affects my "mindset" as I write code for the target.

> I am very aware of the fact that the whole of the
> modern software industry is based on the "OS",
> and that even the suggestion that there may be
> alternatives is either derided or taken as heresy.

Really?  I don't see that at all in the embedded realm.
From: Howerd Oakford
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <8My99.2161$BV5.130442@newsfep1-gui.server.ntli.net>
Hi John,

"John Passaniti" <····@JapanIsShinto.com> wrote in message
···················@corp.supernews.com...
>
> "Howerd Oakford" <··············@ntlworld.com> wrote in message
> ·························@newsfep1-win.server.ntli.net...
> > I would take the arguments you make one step further -
> > that the use of files is just one example of the total
> > reliance on the "Operating System" as axiomatic and
> > indispensible.
>
> Maybe in a general sense there is some merit to this view.  But again in
the
> embedded realm it doesn't hold.  There is no "total reliance" on operating
> systems for embedded systems programmers targetting systems that don't
have
> operating systems.
>
> I've asked Jeff this pointed question but he refuses to answer it.  Maybe
> you will.  In situations where the development environment is different
from
> the target environment, how does the use of operating systems on the
> development side impact on the target side.
>
> I'll use a real-world example-- the project I'm now working on.  The
target
> is a Motorola HC08 processor.  It has about 32k of flash, around 512 bytes
> of RAM.  There is no operating system on the target.  The development
> envrionment is Cygwin under Windows XP and a cross-compiler for the HC08.
> Please explain to me how the reliance on an operating system on the
> development side affects my "mindset" as I write code for the target.

Firstly, most embedded work is done using C, a language specifically
designed for writing operating systems.
C encourages "programming by data structure", and makes efficient factoring
of the problem difficult ( IMHO ).
There is an implicit assumption that, because the big boys write operating
systems and use C ( or C++ or Java etc ), this is the best way to
program something even if it is not an operating system.

Secondly, the same reasoning that leads to the very existence of operating
systems is also applied to embedded systems : try to write general purpose
code that will solve every potential problem. This is a very difficult
mindset to dispel, because it is so obviously true that standards are good.
Experience in mechanical engineering proves beyond doubt that it is better
to specify an M3 bolt than a non-standard one. It is completely logical to
apply this experience to software engineering. The difference is that
standard M3 bolts are real objects, whereas standard interfaces are symbolic
references to real "objects".
[ As an aside, this is why I think that the charging of interest on loans is
wrong ( meaning not efficient ) - because it is treating a symbolic
representation of real goods as if it is real itself. ]
Many people have found that the standardised, general purpose "operating
system" approach is far from ideal.

Thirdly, the term "embedded" has grown to cover quite complex systems. A
mobile phone, for example, may have 4 MB of Flash and 256K of RAM, and uses
an operating system which looks remarkably like Windows. This is why it is
very slow, full of bugs, and occasionally crashes - but these disadvantages
are outweighed by the fact that it has kept hundreds of programmers well fed
for many years - I'm one of them!
Today's embedded systems are often based on their own operating systems. This
is taken as the Holy Grail of embedded programming - to write the whole
operating system so well that a new phone ( with a bigger LCD, new keypad
etc ) can be developed by changing a few parameters in the make file...
Maybe one day it will happen, but I suspect not.

Fourthly, the concept of the "cross compiler" has many of the features of an
operating system. I changed over from compiling on the same processor to
compiling on a PC around 1986, using chipForth. This was a definite
improvement - the cost/power ratio of the PC meant that compilation became
much quicker. There are clear advantages to this approach - change your
target processor, then change your cross compiler accordingly - simple. The
disadvantage is an increase in the complexity of the development
environment - there are now two systems : Host and Target, and each may have
different system variables. My jury is still out on cross compilation, until
a better alternative comes along. Forth is, after all,  platform independent
( or can be ), so it seems wrong that cross compilation is necessary. BTW, I
am still not completely convinced about the value of compilation of any
sort, but that is another issue.

> > I am very aware of the fact that the whole of the
> > modern software industry is based on the "OS",
> > and that even the suggestion that there may be
> > alternatives is either derided or taken as heresy.
>
> Really?  I don't see that at all in the embedded realm.
You will just have to trust me on this one ;^)

Regards

Howerd
From: John Passaniti
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <kjH99.119782$vg.20216178@twister.nyroc.rr.com>
"Howerd Oakford" <··············@ntlworld.com> wrote in message
··························@newsfep1-gui.server.ntli.net...
> Firstly, most embedded work is done using C, a language
> specifically designed for writing operating systems.

Yesterday, I had to tighten a screw that had come loose.  I couldn't find my
screwdrivers (my partner probably loaned them out again), and ended up
instead using a butterknife.  The butterknife was clearly not designed to
tighten loose screws, but it worked perfectly fine.  The reason is that the
only requirement of a tool that tightened this screw was that it have a flat
edge.  The screwdriver might have been better-- a better handle to grip,
stronger metal allowing for turning larger screws, etc.  But for this task,
the butterknife was more than adequate.

C may not have been specifically designed for embedded work, but if it fits
the needs of the programmer, does that matter?  More important, there is
plenty of intersection between the needs of many embedded systems and the
needs of porting an operating system.

Be careful of arguments structured around "but language X was specifically
designed for Y."  Because that then applies to Forth as well.  Forth, we are
told, was specifically designed for embedded systems work.  If I take your
argument at face value, then doesn't that mean that Forth is inappropriate
for non-embedded systems work?  Tread lightly here.

> C encourages "programming by data structure", and
> makes efficient factoring of the problem difficult
> ( IMHO ).

Programs = Data Structures + Algorithms

I have yet to find a language (including Forth) where that equation doesn't
hold.  *All* programming is about finding appropriate representations of
data, and combining them with algorithms.  If you disagree, then please show
me any non-trivial program (presumably in Forth) that doesn't define a data
structure of some kind, and that doesn't define algorithms on that data
structure.

The limits of C regarding being able to factor code have nothing to do with
"programming by data structure" as you put it.  There are other limitations
in C (such as the call/return sequences of functions being fixed by the language).

> There is an implicit assumption that because the big boys
> write operating systems and use C ( or C++ or Java etc ),
> that this is the best way to program something even if it is
> not an operating system.

Opinion noted, but you still haven't answered my simple and direct question.
I'll ask again in a different way:

I have a development system.  This system has an operating system, uses
files, and has layer upon layer of abstraction.  On this system, I am
writing code that will target a system that doesn't have any operating
system.  How does my development system influence the code I write and run
on the target?  Don't avoid the question.

> Secondly, the same reasoning that leads to the very existance
> of operating systems, is also applied to embedded systems : try
> to write general purpose code that will solve every potential
> problem.

In my 15 years of experience, there has only been *one* general purpose
abstraction that I have carried forward.  Nearly everything product I (and
the others I work on) is composed of original code tuned to the specifics of
the application.  The only exceptions are in products that are follow-ons to
other products.

> Todays embedded systems are often based on
> their own operating systems.

While the embedded realm has certainly grown over time to include larger
systems that use operating systems, the vast majority of applications that
are still being designed are for smaller targets.  Cell phones and PDAs and
home Internet routers and everything else that use standardized operating
systems are certainly the most visible and "sexy" applications.  But that's
not where the majority of embedded development is occurring.  The most sold
processors in the world are still 8-bit microcontrollers.  You've got dozens
lurking in your home, more than you might expect in your car, one in your
keyboard, your mouse, your monitor, your microwave oven, every digital clock
in your home, many of your kid's toys, etc.

The vast majority of embedded development is not based on targets that can
support the kind of operating systems you're talking about.

> Fourthly, the concept of the "cross compiler" has many of the
> features of an operating system.

Huh?  You'll have to explain that statement in more detail, please.  What
specifically are these features?
From: Gary Chanson
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <ak8ebp$119$1@pcls4.std.com>
"John Passaniti" <····@JapanIsShinto.com> wrote in message
·····························@twister.nyroc.rr.com...
>
> In my 15 years of experience, there has only been *one* general purpose
> abstraction that I have carried forward.  Nearly everything product I (and
> the others I work on) is composed of original code tuned to the specifics
of
> the application.  The only exceptions are in products that are follow-ons
to
> other products.

    It's getting to the point where everything I'm given to work on is some
kind of follow-on product (even when it's not appropriate).  No one seems to
want to believe that it really will take less time to start from scratch.
It's getting very depressing!

--

-Gary Chanson (MVP for Windows SDK)
-Software Consultant (Embedded systems and Real Time Controls)
·········@mvps.org

-Abolish public schools
From: Howerd Oakford
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <14O99.59285$IU4.1770982@newsfep2-win.server.ntli.net>
Hi John,

"John Passaniti" <····@JapanIsShinto.com> wrote in message
·····························@twister.nyroc.rr.com...
>
> "Howerd Oakford" <··············@ntlworld.com> wrote in message
> ··························@newsfep1-gui.server.ntli.net...
> > Firstly, most embedded work is done using C, a language
> > specifically designed for writing operating systems.
[snip]
> C may not have been specifically designed for embedded work, but if it
fits
> the needs of the programmer, that does it matter?
It matters to me. I cannot speak for "the programmer", who evidently finds
that C suits his or her needs.

> More important, there is
> plenty of intersection between the needs of many embedded systems and the
> needs of porting an operating system.
Yes.

> Be careful of arguments structured around "but language X was specifically
> designed for Y."  Because that then applies to Forth as well.  Forth, we
are
> told, was specifically designed for embedded systems work.  If I take your
> argument at face value, then doesn't that mean that Forth is inappropriate
> for non-embedded systems work?  Tread lightly here.
I cannot speak for Chuck, but I certainly did not design Forth - I
discovered it, in the form of microForth.
Forth was found to be good for solving embedded software problems, because
at the time that was the only type of software around.

> > C encourages "programming by data structure", and
> > makes efficient factoring of the problem difficult
> > ( IMHO ).
>
> Programs = Data Structures + Algorithms
>
> I have yet to find a language (including Forth) where that equation
doesn't
> hold.  *All* programming is about finding appropriate representations of
> data, and combining them with algorithms.  If you disagree, then please
show
> me any non-trivial program (presumably in Forth) that doesn't define a
data
> structure of some kind, and that doesn't define algorithms on that data
> structure.
I agree that "Programs = Data Structures + Algorithms", since programs
consist of data structures and algorithms.
Programming ( the creation of Programs ) can be achieved by many different
methods. In my experience C encourages a particular style of programming
which, for me, is not as efficient as the way I program in Forth.

> The limits of C regarding being able to factor code have nothing to do
with
> "programming by data structure" as you put it.  There are other
limitations
> in C (such call/return sequences of functions being fixed by the
language).
Yes.

> > There is an implicit assumption that because the big boys
> > write operating systems and use C ( or C++ or Java etc ),
> > that this is the best way to program something even if it is
> > not an operating system.
>
> Opinion noted, but you still haven't answered my simple and direct
question.
> I'll ask again in a different way:

> I have a development system.  This system has an operating system, uses
> files, and has layer upon layer of abstraction.  On this system, I am
> writing code that will target a system that doesn't have any operating
> system.  How does my development system influence the code I write and run
> on the target.  Don't avoid the question.
I must apologise for having failed to answer the question to your
satisfaction.
There is clearly nothing in the development system that directly influences
the code that you write and run on the target. However, if your development
system has the following features:
1) it is separate from the target, and has a different processor,
2) it uses a file-based operating system,
3) it supports C or a similar batch-processing language,
then I would say that the development system has been heavily influenced by
the software design mindset which is prevalent today.  So there is an
implicit pressure to write your target code in this way too.

> > Secondly, the same reasoning that leads to the very existance
> > of operating systems, is also applied to embedded systems : try
> > to write general purpose code that will solve every potential
> > problem.
>
> In my 15 years of experience, there has only been *one* general purpose
> abstraction that I have carried forward.  Nearly every product I (and
> the others I work with) work on is composed of original code tuned to the
> specifics of the application.  The only exceptions are in products that
> are follow-ons to other products.
Excellent! This has not been my experience with the majority of embedded C
projects.

> > Todays embedded systems are often based on
> > their own operating systems.
>
> While the embedded realm has certainly grown over time to include larger
> systems that use operating systems, the vast majority of applications that
> are still being designed are for smaller targets.  Cell phones and PDAs
> and home Internet routers and everything else that uses standardized
> operating systems are certainly the most visible and "sexy" applications.
> But that's not where the majority of embedded development is occurring.
> The most sold processors in the world are still 8-bit microcontrollers.
> You've got dozens lurking in your home, more than you might expect in your
> car, one in your keyboard, your mouse, your monitor, your microwave oven,
> every digital clock in your home, many of your kid's toys, etc.
>
> The vast majority of embedded development is not based on targets that can
> support the kind of operating systems you're talking about.
True. This argument only applies to the larger embedded systems. Having said
that, you can get a C-based RTOS in a couple of K these days, so even your
microwave might have one...

> > Fourthly, the concept of the "cross compiler" has many of the
> > features of an operating system.
>
> Huh?  You'll have to explain that statement in more detail, please.  What
> specifically are these features?
I was thinking here of the "one cross compiler supports all" design, where
when a new processor comes along, a new module is written that supports it.
Another feature is the "batch processing" mode, where source code is
entirely separate from both the cross compiler and target system.

On a slightly different tack, I see from a couple of your postings that you
like to learn as many languages as possible. I have had the good fortune to
discover Forth first in my programming career - since then I have had to
learn C, Perl, VBscript and JavaScript. I would just like to point out that,
from my admittedly biased point of view, there is absolutely no comparison
between Forth and these other languages.
I can understand C ( for example ) as fully as I need to. I can see why
design decisions were made, and if I felt the need I could write a C
compiler.
I still do not understand Forth. I can understand why design decisions were
made, and I could write a Forth if I felt the need, but I still cannot
understand why Forth works. When I read Jeff's essays, or Chuck's talks, I
sometimes catch a glimpse of one of the many facets of Forth, and this helps
me to understand more.
What I am trying to do here is to take an overview of this and other
postings, and point out that while the discussion of details and techniques
is fascinating, and may well be applicable to other languages, there is
something synergistic about Forth which is very real.
Any discussion of this aspect of Forth is going to be "woolly", partly
because reductionist analysis is not helpful, partly because everything
depends on the context, and partly because I am still trying to find the
right way to express these ideas.
But I am an optimist, and just because something is difficult doesn't mean
that we shouldn't try!
Forth is something special, and I believe that it should be possible to
explain why...

Regards

Howerd
From: Greg Menke
Subject: Re: Comments on http://www.ultratechnology.com/essence.htm
Date: 
Message-ID: <m3fzx49w2o.fsf@europa.pienet>
"Howerd Oakford" <··············@ntlworld.com> writes:

 
> Firstly, most embedded work is done using C, a language specifically
> designed for writing operating systems.
> C encourages "programming by data structure", and makes efficient factoring
> of the problem difficult ( IMHO ).
> There is an implicit assumption that because the big boys write operating
> systems and use C ( or C++ or Java etc ), that this is the best way to
> program something even if it is not an operating system.

C is often a pretty good choice, C++ isn't too bad either- just some
additional setup and games with constructors, and some fun & games
with link scripts.  C/C++ are nice because at minimum all you need is
a reasonably complete library plus some kind of bootstrap routine and
you're pretty well set.

 
> Secondly, the same reasoning that leads to the very existance of operating
> systems, is also applied to embedded systems : try to write general purpose
> code that will solve every potential problem. This is a very difficult
> mindset to dispel, because it is so obviously true that standards are good.
> Experience in mechanical engineering proves beyond doubt that it is better
> to specify an M3 bolt than a non-standard one. It is completely logical to
> apply this experience to software engineering. The difference is that
> standard M3 bolts are real objects, whereas standard interfaces are symbolic
> references to real "objects".
> [ As an aside, this is why I think that the charging of interest on loans is
> wrong ( meaning not efficient ) - because it is treating a symbolic
> representation of real goods as if it is real itself. ]
> Many people have found that the standardised, general purpose "operating
> system" approach is far from ideal.

If you want to homebrew all the device drivers, tasking,
interrupt/exception management and filesystem stuff for every target
you build for, you're welcome to it.  I'd rather use a decent &
efficient RTOS for that kind of stuff and spend the rest of the time
sailing.



> Todays embedded systems are often based on their own operating systems. This
> is taken as the Holy Grail of embedded programming - to write the whole
> operating system so well that a new phone ( with a bigger LCD, new keypad
> etc ) can be developed by changing a few parameters in the make file...
> Maybe one day it will happen, but I suspect not.

I don't think it has anything to do with "Holy Grails"; it's a matter
of simplifying the problem.  The projects I work on don't have keypads
or LCDs -- they don't even have any LEDs to blink.  If you want to
debug it, there's an oscilloscope, logic analyzer, a serial console or
two & a remote gdb stub for userspace bugs.  I rely on an OS to boot
up, set up a reasonable C library, give me timer interrupts,
multithreading, serial I/O, and ultimately to call my main() so I can
get on with the project.

 
> Fourthly, the concept of the "cross compiler" has many of the features of an
> operating system. I changed over from compiling on the same processor to
> compiling on a PC around 1986, using chipForth. This was a definite
> improvement - the cost/power ratio of the PC meant that compilation became
> much quicker. There are clear advantages to this approach - change your
> target processor, then change your cross compiler accordingly - simple. The
> disadvantage is an increase in the complexity of the development
> environment - there are now two systems : Host and Target, and each may have
> different system variables. My jury is still out on cross compilation, until
> a better alternative comes along. Forth is, after all,  platform independent
> ( or can be ) so it seems wrong that cross compilation is neccessary. BTW, I
> am still not completely convinced about the value of compilation of any
> sort, but that is another issue.

Cross compilers by themselves have nothing much to do with operating
systems.  They are simply a compiler hosted on a given machine that
generates code for a given target which may be an entirely different
architecture.  But all that gets generated is compiled code, you have
to supply whatever operating system infrastructure you want.  That can
range from nothing at all, to some kind of C library or an RTOS, your
choice.  Any dependencies on the target must be dealt with either by
your code or by whatever auxiliary code you link in- there is no
correspondence to the host you're building on at all.  Perhaps you're
talking about how the Forth implementation you're using does
cross-compiling- but such techniques are not universal.

The systems I work on don't have the capacity to host their own
compiler suites and development environment, so I must have a cross
compiler to do much of anything at all.

Gregm
From: Bruce Hoult
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <bruce-A00CF3.12093316082002@copper.ipg.tsnz.net>
In article <···············@ID-125932.news.dfncis.de>,
 Christopher Browne <········@acm.org> wrote:

> The storage models for the respective langauges are _totally_
> different, where Postscript appears to be pretty much "heap-based," so
> that you can readily redefine functions and sling objects in and out
> of memory without really worrying about where they come and go.  Which
> is rather like garbage-collected Lisp.

Original PostScript was not garbage-collected.  It used a mark/release 
"stack" model of heap allocation, where everything allocated after a 
given point in time was deallocated in one fell swoop.  This was a good 
match to printing, on several levels.  First, no state needs to be 
retained between print jobs.  This was frequently sufficient to ensure 
not running out of memory on simple print jobs.  Bigger documents would 
load a set of utility functions and data initially and then manually 
bracket each page with save/restore pairs.
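
The idea, sketched as a toy in Common Lisp (invented names; nothing to do
with how a real PostScript interpreter is actually built):

  ;; A "pointer-bump" heap: SAVE records the high-water mark, RESTORE
  ;; throws away everything allocated since that mark in one step.
  (defparameter *heap* (make-array 4096 :fill-pointer 0))

  (defun alloc (object)            ; allocate = append to the heap
    (vector-push object *heap*))

  (defun heap-save ()              ; like PostScript's save
    (fill-pointer *heap*))

  (defun heap-restore (mark)       ; like PostScript's restore
    (setf (fill-pointer *heap*) mark))

  ;; A print job brackets each page:
  ;;   (let ((m (heap-save)))
  ;;     ... image one page, calling ALLOC freely ...
  ;;     (heap-restore m))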

Later PostScript versions added true GC.

-- Bruce
From: Anton Ertl
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <2002Aug17.093022@a0.complang.tuwien.ac.at>
Christopher Browne <········@acm.org> writes:
>"David Van Camp" <····@davidvancamp.com> wrote:
>> However, Forth's big problem was it's reliance on Reverse Polish
>> Notation (Lisp uses Polish Notation -- eg. not reversed), which is
>> nearly unreadable by humans. 

Marc de Groot once made a witty comment about confusing unreadability
and illiteracy, but unfortunately I don't find it at the moment.

I am a human, and I have no problem reading prefix, postfix, and even
infix and I can also read Forth (which goes beyond RPN thanks to stack
manipulation words, e.g., DUP and SWAP).

If you write unreadable Forth code, then you have probably misused
those features that go beyond RPN (stack manipulation words).  You do
not need to use such words in present-day Forth, you could use just
locals and RPN.  However, many experienced Forth programmers advocate
trying to do without locals when you learn Forth, in order to help you
learn how to properly factor Forth code (good style Forth code is
factored more finely than in most other languages): if your Forth code
comes out unreadable, your factoring is wrong, and you should try to
change it.

>No, the _BIG_ problem with Forth was the BLOCKS controversy.  

How can a controversy be a problem?  In any case, the past tense is
the right tense for that, although it seems that there are still some
embers glowing.

>On the other hand, modern OSes have this newfangled thing called
>_FILES_, which tend to be much more efficient at storing programs
>(because whitespace can simply evaporate away), and people expect
>well-behaved applications to be able to interoperate with other
>applications _at the OS level_, so that for Forth to pretend that it
>_is_ the OS is totally unfriendly.
>
>This "war" had a lot to do with the lack of advancement of Forth onto
>"modern" platforms in any big way.  

How so?  All hosted Forths I have used supported files, even the '83
vintage fig-Forth variant I used on the C64.

>> Postscript's underlying macro language was based on Forth.

A recent discussion of this claim led to Section 5.8 of the clf
General/Misc FAQ
<http://www.complang.tuwien.ac.at/forth/faq/faq-general-5.html#ss5.8>

>The storage models for the respective langauges are _totally_
>different, where Postscript appears to be pretty much "heap-based," so
>that you can readily redefine functions and sling objects in and out
>of memory without really worrying about where they come and go.  Which
>is rather like garbage-collected Lisp.
>
>In contrast, Forth's memory model is lots more 'anal retentive' about
>the programmer getting into the precise details of where data is
>stored than C is, with malloc().  (If you're using BLOCKS as VM, you
>really _need_ to be that anal retentive. :-))

Actually the old Postscript and the old Forth storage model are the
same: In both you can allocate memory from a stack, and reclaim memory
by throwing away everything allocated since some point in time (in
Postscript with save/restore, in Forth with FORGET or markers).

If you experience one as different from the other, it has to do with
the mindset with which you approach programming in these languages.

In the meantime, Postscript has acquired garbage collection, and Forth
has acquired C-style ALLOCATE/FREE, and there's also a conservative
garbage collector for Forth.

>Various implementors have headed in various directions.  The Forth
>world more resembles the Lisp world _before_ Common Lisp was
>standardized, with quite a number of very different kinds of
>implementations.
>
>There's an ANSI standard which is much smaller than CL which has
>probably been less influential than CL.

All popular Forth systems implement ANS Forth; in contrast, the most
popular Lisp implementation (Elisp) does not implement Common Lisp
(and let's not get into Scheme), so I do not understand your comment
about the influence of the standards, nor the one about the Lisp world
before Common Lisp.

Concerning the size, there are a number of things I would like to see
in ANS Forth, and a few things I would rather not have that are in it
(other people would place the balance differently); I don't know if
this is similar or different from the Lisp Community's view of Common
Lisp.

- anton
-- 
M. Anton Ertl  http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
EuroForth 2002: http://www.complang.tuwien.ac.at/anton/euroforth2002/
From: Christopher Browne
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ajlmle$1cdu0m$1@ID-125932.news.dfncis.de>
In an attempt to throw the authorities off his trail, ·····@mips.complang.tuwien.ac.at (Anton Ertl) transmitted:
> Christopher Browne <········@acm.org> writes:
>>"David Van Camp" <····@davidvancamp.com> wrote:
>>> However, Forth's big problem was it's reliance on Reverse Polish
>>> Notation (Lisp uses Polish Notation -- eg. not reversed), which is
>>> nearly unreadable by humans. 
>
> Marc de Groot once made a witty comment about confusing
> unreadability and illiteracy, but unfortunately I don't find it at
> the moment.
>
> I am a human, and I have no problem reading prefix, postfix, and
> even infix and I can also read Forth (which goes beyond RPN thanks
> to stack manipulation words, e.g., DUP and SWAP).
>
> If you write unreadable Forth code, then you have probably misused
> those features that go beyond RPN (stack manipulation words).  You
> do not need to use such words in present-day Forth, you could use
> just locals and RPN.  However, many experienced Forth programmers
> advocate trying to do without locals when you learn Forth, in order
> to help you learn how to properly factor Forth code (good style
> Forth code is factored more finely than in most other languages): if
> your Forth code comes out unreadable, your factoring is wrong, and
> you should try to change it.

Lisp has almost exactly the same problem, of "cadaverous" code that is
rife with CAR/CDR, with the really egregious examples involving NTH in
much the way that really horrible Forth code might contain lots of
ROLLs.

The experienced programmer writes code that doesn't _need_ to have a
lot of that.

You don't need to  eschew those constructs altogether, in the way
people interpret Dijkstra's "GOTO Considered Harmful"; the point is
that well-written code doesn't need them a lot.
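
For instance (a made-up Lisp fragment):

  ;; Records of the form (NAME (STREET CITY) AGE) -- invented data.
  ;; "Cadaverous" style makes the reader decode the shape by hand:
  (defun city-of (person)
    (car (cdr (car (cdr person)))))          ; i.e. (cadr (cadr person))

  ;; The same accessor, rewritten so the structure is named once:
  (defun city-of (person)
    (destructuring-bind (name (street city) age) person
      (declare (ignore name street age))
      city))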

I'll poke at Perl, too: I finally figured out the real "irritation" of
it, which is that it makes SO MUCH use of stupid punctuation.  In
human language, you have accents and punctuation here and there in
order to point people's attention at _important_ changes of
expectations.  In effect, Perl code heavily accents _everything_, thus
making it difficult to tell which 'accents' are the truly important
ones.  That's not _exactly_ the same problem, but I think it's a
problem with Perl.

>>No, the _BIG_ problem with Forth was the BLOCKS controversy.  

> How can a controversy be a problem?  In any case, the past tense is
> the right tense for that, although it seems that there are still
> some embers glowing.

A controversy that seriously divides a community, making it difficult
for them to work together, causes a problem...

>>On the other hand, modern OSes have this newfangled thing called
>>_FILES_, which tend to be much more efficient at storing programs
>>(because whitespace can simply evaporate away), and people expect
>>well-behaved applications to be able to interoperate with other
>>applications _at the OS level_, so that for Forth to pretend that it
>>_is_ the OS is totally unfriendly.

>>This "war" had a lot to do with the lack of advancement of Forth
>>onto "modern" platforms in any big way.

> How so?  All hosted Forths I have used supported files, even the '83
> vintage fig-Forth variant I used on the C64.

I'm not sure there ever was a 'hosted' Forth on the Atari 8 bit
platform, where I mainly used Forth; the question of "take the box
over over" versus "run atop OS" was something of an issue when the IBM
PC started becoming popular, and the common use of Forth for embedded
systems has left it something of an issue, as embedded needs are quite
different from the needs of those expecting apps to play with others
atop the OS.

>>> Postscript's underlying macro language was based on Forth.
>
> A recent discussion of this claim led to Section 5.8 of the clf
> General/Misc FAQ
> <http://www.complang.tuwien.ac.at/forth/faq/faq-general-5.html#ss5.8>
>
>>The storage models for the respective langauges are _totally_
>>different, where Postscript appears to be pretty much "heap-based," so
>>that you can readily redefine functions and sling objects in and out
>>of memory without really worrying about where they come and go.  Which
>>is rather like garbage-collected Lisp.
>>
>>In contrast, Forth's memory model is lots more 'anal retentive' about
>>the programmer getting into the precise details of where data is
>>stored than C is, with malloc().  (If you're using BLOCKS as VM, you
>>really _need_ to be that anal retentive. :-))
>
> Actually the old Postscript and the old Forth storage model are the
> same: In both you can allocate memory from a stack, and reclaim memory
> by throwing away everything allocated since some point in time (in
> Postscript with save/restore, in Forth with FORGET or markers).

I guess I never did work with Postscript in the oldest days...
> If you experience one as different from the other, it has to do with
> the mindset with which you approach programming in these languages.

Certainly the fact that your goal, in PS, is to set up a _few_
functions to help construct a set of pages, generate those pages, and
then clear _everything_ out, makes for a different model.  I wouldn't
expect that people would be redefining words terribly often within a
print job...

> In the meantime, Postscript has acquired garbage collection, and
> Forth has acquired C-style ALLOCATE/FREE, and there's also a
> conservative garbage collector for Forth.

The latter must be recent...
-- 
(reverse (concatenate 'string ····················@" "454aa"))
http://cbbrowne.com/info/advocacy.html
"Nightmares - Ha!  The way my life's been going lately,
 Who'd notice?"  -- Londo Mollari
From: Anton Ertl
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <2002Aug19.104055@a0.complang.tuwien.ac.at>
Christopher Browne <········@acm.org> writes:
>In an attempt to throw the authorities off his trail, ·····@mips.complang.tuwien.ac.at (Anton Ertl) transmitted:
>> Forth has acquired C-style ALLOCATE/FREE, and there's also a
>> conservative garbage collector for Forth.
>
>The latter must be recent...

May '98.  Before that, you could use Boehm's conservative garbage
collector on many Forth systems.

- anton
-- 
M. Anton Ertl  http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html
EuroForth 2002: http://www.complang.tuwien.ac.at/anton/euroforth2002/
From: Chris Double
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ur8h077ou.fsf@double.co.nz>
"David Van Camp" <····@davidvancamp.com> writes:

> One of the most mind-expanding books on software development I ever read was
> "Thinking in Forth" by I cannot remember who. 

Most likely "Thinking Forth" by Leo Brodie:

http://c2.com/cgi/wiki?ThinkingForth

I use Forth on a Nokia 9210 for software development. Its very low
memory requirements suit it very well. In the long term, though, I'd
like to bootstrap a Lisp and use that.

Chris.
-- 
http://radio.weblogs.com/0102385
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <aji2la$nu2$5@luna.vcn.bc.ca>
In article <·······················@news1.news.adelphia.net>, David Van
Camp wrote:
> Since no one else seems to have mentioned it, the language system FORTH
> provided all this an more way back around the same time frame as Lisp
> (70's). Indeed, while Lisp was critisized as slow and unfriendly, Forth was
> rapidly gaining popularity as an interpreted system with speeds approaching
> native C.
> 
> However, Forth's big problem was it's reliance on Reverse Polish Notation
> (Lisp uses Polish Notation -- eg. not reversed), which is nearly unreadable

Lisp does *not* use Polish notation.  In Polish notation, it is *implicit* how
many operands go with an operator. Lisp's notation makes the syntax tree
structure explicit using parentheses. 

It is not only the direction of reverse Polish that makes it unreadable, but
the lack of any explicit indication of grouping.  A symbol is either a value
that is pushed onto a stack, or an operator that implicitly knows how much to
pop off. 

The lack of grouping is silly for all kinds of reasons. What if you want, say,
a multiplication operator that takes a variable number of arguments?  You then
push some marker onto the stack, which tells the function to stop popping. 

In other words, parsing the program's syntax is conflated into evaluation. The
evaluation stack doubles as a parsing stack.

A sane version of Forth would include parenthesis symbols, which are actually
pushed onto the stack. Thus ( 3 4 5 * ) would mean, push the symbols 
( 3 4 5 and * onto the stack, and the ) would mean: pop the stack until 
the ( symbol is encountered, pushing the symbols in between the
parentheses onto a list, and then push the resulting list onto the stack.
If the stack becomes empty upon popping the ( parenthesis, you have
finished constructing your nested list.  Pass that to your local neighborhood
Lisp evaluator. 

So for instance

  ((a b +)
    (a b) add defun)

would define a binary function called add that adds two numbers. The advantage
is immediately obvious: we can actually tell how many parameters this will
consume; no explicit pops are needed. Because the expression is evaluated as
a whole, the defun operator can parse everything and *write the code* that makes
the symbols a and b refer to the two top values on the stack, ensure that
they are popped, and that there is a result value on the stack no matter how
the function terminates, so that the human being doesn't have to waste
its time looking for underflows and overflows, having to only ensure that
parentheses balance in the source code.
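
The back half is trivial once the parentheses have built nested lists; a
sketch (ignoring special forms like defun, purely to show the rewrite):

  (defun postfix->prefix (form)
    "Rotate the trailing operator of every list to the front."
    (if (atom form)
        form
        (cons (first (last form))
              (mapcar #'postfix->prefix (butlast form)))))

  ;; (postfix->prefix '(3 4 5 *))            => (* 3 4 5)
  ;; (eval (postfix->prefix '(3 (4 5 *) +))) => 23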

:) :) :)
From: William Tanksley Google
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <de3fc1ef.0208201142.61d65d63@posting.google.com>
Kaz Kylheku <···@ashi.footprints.net> wrote:
> David Van Camp wrote:
> > Since no one else seems to have mentioned it, the language system FORTH
> > provided all this an more way back around the same time frame as Lisp
> > (70's). Indeed, while Lisp was critisized as slow and unfriendly, Forth was
> > rapidly gaining popularity as an interpreted system with speeds approaching
> > native C.

Just for the record, Forth's inventor knew of Lisp -- they weren't
really contemporaries, although I don't know which one 'invented' the
more interactive system first.

> > However, Forth's big problem was it's reliance on Reverse Polish Notation
> > (Lisp uses Polish Notation -- eg. not reversed), which is nearly unreadable

> Lisp does *not* use Polish notation.  In Polish notation, it is *implicit* 
> how many operands go with an operator. Lisp's notation makes the syntax 
> tree structure explicit using parentheses. 

Forth also doesn't use Reverse Polish notation. It's visually similar
in its simpler expressions, but, unlike RPN, Forth is based on
semantics, not notation. The expression "3 3" is not RPN, but it's
perfectly valid Forth.

> It is not only the direction of reverse Polish that makes it unreadable,

Correct, of course.

> but the lack of any explicit indication of grouping. 

Incorrect -- it's the reader's lack of experience. Thousands of people
read RPN-like notations every day. It's accepted mathematical notation
in many fields; HP's calculators use it; and of course Postscript and
Forth use it.

> A symbol is either a value
> that is pushed onto a stack, or an operator that implicitly knows how much 
> to pop off. 

No. A symbol is a function which takes a stack and returns a stack.
The function '3', unless defined otherwise, will return a stack
containing one more element than the input had, the new element's
value being 3.

There's a huge difference.
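
To make that concrete, a toy rendering in Lisp (stacks as lists; the names
are made up, nothing standard):

  (defun lit (n)                      ; the word "3" is (lit 3)
    (lambda (stack) (cons n stack)))

  (defun plus ()                      ; the word "+"
    (lambda (stack)
      (cons (+ (first stack) (second stack)) (cddr stack))))

  (defun run (words &optional (stack '()))
    "A program is just the composition of its words, in textual order."
    (reduce (lambda (s word) (funcall word s)) words :initial-value stack))

  ;; "3 4 +"  ~  (run (list (lit 3) (lit 4) (plus)))  => (7)
  ;; "3 +" is meaningful too: it maps any stack (x ...) to (x+3 ...).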

> The lack of grouping is silly for all kinds of reasons. What if you want, 
> say,a multiplication operator that takes a variable number of arguments?
> You then push some marker onto the stack, which tells the function to 
> stop popping. 

Correct. Or you push a count on the stack after all the arguments. I'm
missing the part where you give one of the "all kinds of reasons why
this is silly."

> In other words, parsing the program's syntax is conflated into evaluation. 
> The evaluation stack doubles as a parsing stack.

Here's the part you're missing. Because you're used to Lisp (and C,
and Algol, and Fortran, they're all the same in this respect) you
assume that calling a function means that you have to apply parameters
to the function as part of the parsing.

Lisp (and the other languages mentioned above) have a tree-structured
grammar to reflect the fact that parameters get applied to functions,
one function application for every parameter. Forth has a flat
list-structured grammar to reflect the fact that functions are
composed with each other: given the code "f g", Lisp-like notation
would be (f (g)).

The Forth code "3 4 +" doesn't _really_ translate to "(+  3 4)"; it
translates to something more like "(+ (3 (4 (?))))", where the ?
represents the state of the stack before the snippet executes.

This all goes to show that some pretty Lisp code translates to ugly
Forth code; some pretty Forth code translates to ugly Lisp code (as
the example above shows). There's no other moral; this doesn't prove
that either language is superior.

> A sane version of Forth would include parenthesis symbols, which are actually
> pushed onto the stack. Thus ( 3 4 5 * ) would mean, push the symbols 
> ( 3 4 5 and * onto the stack, and the ) would mean: pop the stack until 
> the ( symbol is encountered, pushing the symbols in between the
> parenthes onto a list, and then push the resulting list onto the stack.
> If the stack becomes empty upon popping the ( parentheses, you have
> finished constructing your nested list.  Pass that to your local neighborhood
> Lisp evaluator. 

How would this sane Forth handle "drop nip"? (Hint: stack effect is
abc--b.) That's simple -- it gets really complex if I ask you to
handle something with a SWAP.

> consume; no explicit pops are needed. Because the expression is evaluated as
> whole, the defun operator can parse everything and *write the code* that makes
> the symbols a and b refer to the two top values on the stack, ensure that
> they are popped, and that there is a result value on the stack no matter how
> the function terminates, 

All that is possible in Forth as well; strongForth is one example of
such a system, and similar things are done in the optimising Forth
compilers. Interestingly, unlike Lisp and other applicative languages,
doing this simple typechecking is VERY simple; it's done at the same
time all the other compilation is done, in a single pass.

> so that the human being doesn't have to waste
> its time looking for underflows and overflows, having to only ensure that
> parentheses balance in the source code.

> :) :) :)

:-)

What? I don't have a sense of humor. I can't -- I know both Lisp _and_
Forth.

-Billy
From: Kaz Kylheku
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <ak88oe$9sv$1@luna.vcn.bc.ca>
In article <····························@posting.google.com>, William Tanksley Google wrote:
> Kaz Kylheku <···@ashi.footprints.net> wrote:
>> Lisp does *not* use Polish notation.  In Polish notation, it is *implicit* 
>> how many operands go with an operator. Lisp's notation makes the syntax 
>> tree structure explicit using parentheses. 
> 
> Forth also doesn't use Reverse Polish notation. It's visually similar
> in its simpler expressions, but, unlike RPN, Forth is based on
> semantics, not notation. The expression "3 3" is not RPN, but it's
> perfectly valid Forth.

That's right, because RPN is not a programming language. It has operators which
statically determine how many parameters they consume, and anything which would
underflow, or leave extra stuff on the stack, were it interpreted that way, is
simply not a well-formed RPN formula.

>> It is not only the direction of reverse Polish that makes it unreadable,
> 
> Correct, of course.
> 
>> but the lack of any explicit indication of grouping. 
> 
> Incorrect -- it's the reader's lack of experience. Thousands of people

I fundamentally reject the blaming of the user. Understanding INTERCAL
programs also grows with the reader's experience.  Understanding anything grows
with experience. You can learn to read 80x86 executables for Windows if you
really apply yourself.

> read RPN-like notations every day. It's accepted mathematical notation
> in many fields; HP's calculators use it; and of course Postscript and
> Forth use it.

You can't decipher the structure implied by the notation without unravelling
the semantics of every function that appears in it. For some Forth programs,
parsing them is equivalent to running them. 

>> A symbol is either a value
>> that is pushed onto a stack, or an operator that implicitly knows how much 
>> to pop off. 
> 
> No. A symbol is a function which takes a stack and returns a stack.
> The function '3', unless defined otherwise, will return a stack
> containing one more element than the input had, the new element's
> value being 3.

Another way of looking at it is that the program is a sequence of terminal
symbols. And how that program is *parsed* is entwined into the semantics of its
*computation*. So that 3 4 + actually means shift 3, shift 4, and reduce using
the rule for +.  Besides performing addition, the function also processes a
grammar production, effectively recognizing the suffix of a sentential form,
and performing the rightmost reduction.  So the stack doubles as a parsing
stack as well as an evaluation stack.  This is a gross language design
compromise, one that is acceptable only because it allows interpreters to fit
onto small firmware.

> There's a huge difference.
> 
>> The lack of grouping is silly for all kinds of reasons. What if you want, 
>> say,a multiplication operator that takes a variable number of arguments?
>> You then push some marker onto the stack, which tells the function to 
>> stop popping. 
> 
> Correct. Or you push a count on the stack after all the arguments. I'm
> missing the part where you give one of the "all kinds of reasons why
> this is silly."

Here is a test. Take some large Forth program, and introduce a new parameter to
one of its functions. Make it optional, so that existing calls which do not
supply the new argument are preserved as they are.

People make this kind of change all the time in Lisp programs, or C++ for
that matter.

For that to be possible, there has to be abstraction from the details of the
argument passing convention; the language implementation has to be free to
adjust that convention when the definition of the function changes.
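
In Common Lisp, for instance, the whole change is one edit to the lambda
list (AREA is an invented example), and every existing call keeps working:

  ;; AREA used to take one argument; HEIGHT is new and optional,
  ;; defaulting so that old call sites are unaffected.
  (defun area (width &optional (height width))
    (* width height))

  ;; (area 3)    => 9     ; existing calls, unchanged
  ;; (area 3 4)  => 12    ; new calls supply the extra argument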

>> In other words, parsing the program's syntax is conflated into evaluation. 
>> The evaluation stack doubles as a parsing stack.
> 
> Here's the part you're missing. Because you're used to Lisp (and C,
> and Algol, and Fortran, they're all the same in this respect) you
> assume that calling a function means that you have to apply parameters
> to the function as part of the parsing.

Not at all; merely that as a user of higher level languages, I don't expect to
fudge around with the low level stack mechanism for parameter passing.  I can
trust the machine to generate code which handles that. The technology for this
has been around for 45 years already, so there is no need to shy away from it,
unless you really need a complete interpreter in 500 bytes of machine code for
a microcontroller that costs pennies in high volume production.

> The Forth code "3 4 +" doesn't _really_ translate to "(+  3 4)"; it
> translates to something more like "(+ (3 (4 (?))))", where the ?
> represents the state of the stack before the snippet executes.

No, it really translates to (+ 3 4). Because applying the + function
to two operands is done that way in Lisp, and is done as 3 4 + in
Forth. It's more useful to match idiom for idiom, than to try to
emulate the semantics of one in the other. 

Otherwise you can make completely ridiculous claims, like comparing
the verbosity of this

  (let ((stack))
    (push 4 stack)
    (push 3 stack)
    (+ (pop stack) (pop stack)))

to the conciseness of this:

  3 4 +

Hey, look how easy the Forth is compared to what you have to do in Lisp. You
have to create your own stack from nothing, and then use these clumsy pushes
and pops.  Compare that to writing three simple symbols in Forth to do exactly
the same thing!

But in Lisp I can write a translator which will take the terse stack notation
and write the verbose code for me at compile time, so I can actually pretend to
take such a silly argument seriously and then defeat it on its own turf,
without even bothering to explain that, oh by the way, I would actually add
the two numbers using (+ 3 4).
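
Such a translator is a few lines; a sketch (it only knows binary operators,
which is enough to make the point):

  (defmacro stack-form (&rest words)
    "Translate postfix WORDS into an ordinary Lisp form at compile time."
    (let (stack)
      (dolist (w words (first stack))
        (if (member w '(+ - * /))
            (let ((b (pop stack)) (a (pop stack)))
              (push (list w a b) stack))
            (push w stack)))))

  ;; (macroexpand-1 '(stack-form 3 4 +))  => (+ 3 4)
  ;; (stack-form 3 4 + 5 *)               => 35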

> There's no other moral; this doesn't prove
> that either language is superior.

What proves that language A is superior to B is that the user of A
can avail herself of the notation and semantics of B in the form of a 
sublanguage that is seamlessly embedded into A, whereas the user of B
cannot do the reverse to obtain A within B. 

(Writing an A program that interprets B programs does not count as a
sublanguage within A; the B notation must be first-class within the A
environment. Thus for instance in a compiled implementation of A, the B
notation must also compile alongside all other notations.)

Of course this superiority is in the area of expressive power; B may still be
superior in other areas, like allowing for complete implementations that fit
into tiny systems.

>> A sane version of Forth would include parenthesis symbols, which are actually
>> pushed onto the stack. Thus ( 3 4 5 * ) would mean, push the symbols 
>> ( 3 4 5 and * onto the stack, and the ) would mean: pop the stack until 
>> the ( symbol is encountered, pushing the symbols in between the
>> parenthes onto a list, and then push the resulting list onto the stack.
>> If the stack becomes empty upon popping the ( parentheses, you have
>> finished constructing your nested list.  Pass that to your local neighborhood
>> Lisp evaluator. 
> 
> How would this sane Forth handle "drop nip"? (Hint: stack effect is
> abc--b.) That's simple -- it gets really complex if I ask you to
> handle something with a SWAP.

Your question is rather like an assembly language programmer challenging a C
programmer ``that's all very nice, but how do you swap the two halves of a 32
bit register in your language using one instruction?'' The answer is that you
don't think about programming in those terms, and that the assembly language
programmer grossly over-estimates the value of swapping two halves of a
register in one instruction.

Notice that in this ``sane Forth'', the stack is used only during parsing. It
produces a structured representation of the program, which is then subject
to an evaluator which can have whatever semantics we choose to impose
on that structured representation.

Stack manipulation is only a tool; you want to use it only when you need to
implement some algorithm that is best expressed on a stack machine. You
shouldn't have to bend your mind into conforming to some paradigm that is
inappropriate to the problem domain; rather, the language should be capable of
adjusting.  You shouldn't be *prevented* from using the semantics of
a stack machine, nor *required* to use it.
From: William Tanksley Google
Subject: Re: Lisp's unique feature: compiler available at run-time
Date: 
Message-ID: <de3fc1ef.0208281117.7fc8bb19@posting.google.com>
Kaz Kylheku <···@ashi.footprints.net> wrote:
> William Tanksley Google wrote:
> > Kaz Kylheku <···@ashi.footprints.net> wrote:
> >> It is not only the direction of reverse Polish that makes it unreadable,
> >> but the lack of any explicit indication of grouping. 

> > Incorrect -- it's the reader's lack of experience. Thousands of people

> I fundamentally reject the blaming of the user. Understanding INTERCAL
> programs also grows with the reader's experience.  Understanding 
> anything grows
> with experience. You can learn to read 80x86 executables for Windows if 
> you really apply yourself.

I repeat: thousands of people DO read Forthlike languages, such as
Postscript and HP-RPN. I fundamentally accept the blaming of the
user, especially when the user explicitly does not know the language 
in question.

Forth is hard for most people to read because most people don't know
Forth. This isn't a feature; it's a drawback. But it's not intrinsic
to Forth. If you can identify an intrinsic feature of Forth which
causes or contributes to that I'd be pleased to know.

> > read RPN-like notations every day. It's accepted mathematical notation
> > in many fields; HP's calculators use it; and of course Postscript and
> > Forth use it.

> You can't decipher the structure implied by the notation without unravelling
> the semantics of every function that appears in it. For some Forth programs,
> parsing them is equivalent to running them. 

This statement contains a very important misunderstanding. The problem
is that you're expecting two different notations to imply the same
structure. You want the tree-structure of an applicative program to
appear in all notations. The fact is, that's not going to happen;
Forth may be the first notation you've run into where the notation
does NOT have a tree structure.

Extracting a tree structure is possible; but in order to do that, you
have to convert from concatenative notation (which is what Forth uses)
into applicative notation (which is what most other computer languages
use).

Parsing concatenative notation is a very simple operation, because
programs in concatenative notation consist of a flat list of
functions. Because text is sequences of characters, we perform the
obvious mapping of space-delimited words representing functions.

That's all. Parsing is nothing more complex than tokenising; that
isn't an oversimplification.
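
A sketch of the entire "parser" in Lisp, just to underline the point
(spelled out by hand only because there's no standard split function):

  (defun parse-forth (text)
    "Split TEXT on spaces; each token is one word of the program."
    (let (words (start 0))
      (loop
        (let ((end (or (position #\Space text :start start) (length text))))
          (unless (= start end)
            (push (subseq text start end) words))
          (when (>= end (length text))
            (return (nreverse words)))
          (setf start (1+ end))))))

  ;; (parse-forth "3 4 + dup *")  => ("3" "4" "+" "dup" "*")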

> >> A symbol is either a value
> >> that is pushed onto a stack, or an operator that implicitly knows how much 
> >> to pop off. 

> > No. A symbol is a function which takes a stack and returns a stack.
> > The function '3', unless defined otherwise, will return a stack
> > containing one more element than the input had, the new element's
> > value being 3.

> Another way of looking at it is that the program is a sequence of terminal
> symbols.

Yes. That's actually a very useful way of looking at it, so long as
you don't conclude that there's a tree somewhere on which the terminal
symbols are leaves.

> And how that program is *parsed* is entwined into the semantics of its
> *computation*.

Nope. Parsing is handled just like any other language, before
computation has a chance to start.

Because the ordering of the text is the same as the ordering of the
functions' executions (assuming no optimizations), compilation is
linear in the size of the text: emitted code (or interpreted actions)
depend only on what's been read, not what's ahead.

> So that 3 4 + actually means shift 3, shift 4, and reduce using
> the rule for +.

No. It means compose three functions from a stack onto a stack. "3 +",
for example, is interpretable only under this view; if you try to read
it as a grammar, you'll fail.

> Besides performing addition, the function also processes a
> grammar production, effectively recognizing the suffix of a sentential form,
> and performing the rightmost reduction.  So the stack doubles as a parsing
> stack as well as an evaluation stack.  This is a gross language design
> compromise, one that is acceptable only because it allows interpreters to fit
> onto small firmware.

No, it's a totally different language design, which results in an
almost nonexistent parser (thereby allowing interpreters to fit onto
small firmware), an elimination of the need for lambda calculus,
totally different rewriting rules, and many other results, most of
them useful in some areas and harmful in others.

For example, consider the process of extracting a function from the
body of an existing function. Let's assume no local variables, only
parameters (local variables add the same minor complication to both
cases). In Forth, this is a matter of cutting and pasting; the only
concern is that you don't cut a 'token' in half (where a token is a
space delimited word, comment, or other junk). In Lisp you can't do it
trivially; you have to analyse the code to find what variables are
used, add them to an argument list, then build the new function. You
can't even use cut and paste unless you're exactly on syntactic
boundaries (i.e. all your parentheses are balanced inside the text
you're thinking of extracting).
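
A made-up before/after shows the analysis involved: the lifted expression
mentions X and Y, so they have to become parameters of the new function.

  ;; Before:
  (defun price (x y)
    (+ (* x y 1.2) 5))

  ;; After extracting the taxed subtotal -- X and Y were identified as the
  ;; free variables of the lifted expression and turned into parameters:
  (defun taxed-subtotal (x y)
    (* x y 1.2))

  (defun price (x y)
    (+ (taxed-subtotal x y) 5))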

As another example, consider typechecking. I will assert (using only
this link, http://home.t-online.de/home/s.becher/forth/, for evidence)
that concatenative languages in general, and a version of Forth in
particular, can be statically typechecked in linear time (in fact, in
the same linear pass as parsing). Typechecking an applicative language
is much more complex, as anyone who's attended a Compilers class can
testify. This is no trivial typechecking code, either; the page I
point to implements static polymorphism on all parameters.
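
As a toy illustration of why a single pass suffices (this one only counts
stack depth; the page above does full polymorphic typechecking):

  ;; Each word declares how many cells it consumes and produces;
  ;; checking is a single left-to-right fold over the program.
  (defparameter *effects* '((dup 1 . 2) (drop 1 . 0) (swap 2 . 2) (+ 2 . 1)))

  (defun check-depth (words &optional (depth 0))
    (dolist (w words depth)
      (destructuring-bind (in . out)
          (if (numberp w)
              '(0 . 1)                      ; literals just push
              (or (cdr (assoc w *effects*))
                  (error "unknown word ~A" w)))
        (when (< depth in)
          (error "stack underflow at ~A" w))
        (setf depth (+ (- depth in) out)))))

  ;; (check-depth '(3 4 +))  => 1
  ;; (check-depth '(+))      => error: stack underflow at +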

> >> The lack of grouping is silly for all kinds of reasons. What if you want, 
> >> say,a multiplication operator that takes a variable number of arguments?
> >> You then push some marker onto the stack, which tells the function to 
> >> stop popping. 

> > Correct. Or you push a count on the stack after all the arguments. I'm
> > missing the part where you give one of the "all kinds of reasons why
> > this is silly."

> Here is a test. Take some large Forth program, and introduce a 
> new parameter to
> one of its functions. Make it optional, so that existing calls which 
> do not supply the new argument are preserved as they are.

Now, take a large program, and assume that every call to a particular
function is followed by a call to another, possibly with different
parameters. Implement a function which does the pair of calls, and
replace every such pair of calls with the new function. This is a
fairly simple search-and-replace in Forth; care to implement the EMACS
macro to handle it for Lisp?

> People make this kind of change all the time in Lisp programs, or C++ for
> that matter.

In most programming principles books, this kind of change is
considered bad, except in a self-contained program under the complete
control of the author. Otherwise you're changing a published API. Far
better to follow the solution most Forth programmers would use: write
a new function (with a new name) to handle the default argument.
Optionally, reimplement the old function as a call to the new one,
passing the "default" value for the argument.

If the code IS completely under your control, of course, a simple
search and replace suffices to replace "function_name" with
"default_argument function_name" throughout your program.

A comment: in the strongly typed Forth I mention above, the missing
argument would be noticed by the typechecking, allowing, in most
cases, for the new function to have the same name as the old -- giving
the same semantics as your Lisp/C++ solution.

> For that to be possible, there has to be abstraction from the details of the
> argument passing convention; the language implementation has to be free to
> adjust that convention when the definition of the function changes.

As you can see, this isn't the case.

However, there ARE disadvantages to Forth's model. Consider, for
example, that because functions are laid out as an ordered list, a
total chronological ordering is required; the programmer must
completely specify the order of execution, even if the program happens
to be purely functional and largely order-independent. In a
tree-structured program, you can implicitly leave the ordering of the
branches up to the compiler; in a list-structured one, the compiler
has to defeat your ordering if it wishes to impose its own.

I can't claim that the concatenative model (used by Forth) is perfect;
I only claim that it's a worthy peer of the applicative model in
language development.

> > The Forth code "3 4 +" doesn't _really_ translate to "(+  3 4)"; it
> > translates to something more like "(+ (3 (4 (?))))", where the ?
> > represents the state of the stack before the snippet executes.

> No, it really translates to (+ 3 4). Because applying the + function
> to two operands is done that way in Lisp, and is done as 3 4 + in
> Forth. It's more useful to match idiom for idiom, than to try to
> emulate the semantics of one in the other. 

Okay, match the idiom: "3 4". Or the idiom: "3 +". Or the idiom "foo
bar".

> Otherwise you can make completely ridiculous claims, like comparing
> the verbosity of this
>   (let ((stack))
>     (push 4 stack)
>     (push 3 stack)
>     (+ (pop stack) (pop stack)))
> to the concisensess of this:
>   3 4 +

A far better example in Lisp would be to use the features of the
language to say something like:

(stack-form 3 4 +)

All the conciseness, AND all of the advantages. And all the
disadvantages too, but that's okay.

> Hey, look how easy the Forth is compared to what you have to do in Lisp. You
> have to create your own stack from nothing, and then use these clumsy pushes
> and pops.  Compare that to writing three simple symbols in Forth to do exactly
> the same thing!

Of course, I make no such claim.

> But in Lisp I can write a translator which will take the terse stack 
> notation and write the verbose code for me at compile time, so I can 
> actually pretend to
> take such a silly argument seriously and then defeat it on its own turf,
> without even bothering to explain that, oh by the way, I would actually add
> the two numbers using (+ 3 4).

And I would add them using 7. My point? Your workaround to my
challenge is trivial; it only works for this one case, and at that it
isn't the optimal workaround.

> > There's no other moral; this doesn't prove
> > that either language is superior.

> What proves that language A is superior to B is that the user of A
> can avail herself of the notation and semantics of B in the form of a 
> sublanguage that is seamlessly embedded into A, whereas the user of B
> cannot do the reverse to obtain A within B. 

Interesting -- I don't mind that definition.

> (Writing an A program that interprets B programs does not count as a
> sublanguage within A; the B notation must be first-class within the A
> environment. Thus for instance in a compiled implementation of A, the B
> notation must also compile alongside all other notations.)

Great. So Forth and Lisp are coequals; both can seamlessly be
expressed within the other.

> Of course this superiority is in the area of expressive power; B may still be
> superior in other areas, like allowing for complete implementations that fit
> into tiny systems.

Yes, I'd agree -- and, of course, A may be superior in other ways,
such as having a larger standard library with more runtime
conveniences such as GC.

> >> A sane version of Forth would include parenthesis symbols, 
> >> which are actually
> >> pushed onto the stack. Thus ( 3 4 5 * ) would mean, push the symbols 
> >> ( 3 4 5 and * onto the stack, and the ) would mean: pop the stack until 
> >> the ( symbol is encountered, pushing the symbols in between the
> >> parenthes onto a list, and then push the resulting list onto the stack.
> >> If the stack becomes empty upon popping the ( parentheses, you have
> >> finished constructing your nested list.  Pass that to your 
> >> local neighborhood Lisp evaluator. 

> > How would this sane Forth handle "drop nip"? (Hint: stack effect is
> > abc--b.) That's simple -- it gets really complex if I ask you to
> > handle something with a SWAP.

> Your question is rather like an assembly language programmer challenging a C
> programmer ``that's all very nice, but how do you swap the two halves of a 32
> bit register in your language using one instruction?'' The answer is that you
> don't think about programming in those terms, and that the assembly language
> programmer grossly over-estimates the value of swapping two halves of a
> register in one instruction.

No, it's not like that. Assembly language is a low-level notation;
concatenative notation is a high-level one. I'm not mixing my levels,
like the assembly language programmer is.

> Notice that in this ``sane Forth'', the stack is used only during parsing. It
> produces a structured representation of the the program, which is then subject
> to an evaluator which can have whatever semantics we choose to impose
> on that structured representation.

But the Forth source code is already a structured representation of
the program, and the manipulations on it are already well-defined.

> Stack manipulation is only a tool;

Correct. So's concatenative notation (which happens to use stack
notation); but what you don't seem to realise is that so's applicative
notation.

> you want to use it only when you need to
> implement some algorithm that is best expressed on a stack machine.

No. Concatenative notation has only the stack in common with a stack
machine, and it uses stack manipulation. (That is, not all
concatenative languages implement a virtual machine. Even modern Forth
doesn't. The little concatenative language you mentioned which is
implemented in Lisp certainly wouldn't.)

> You
> shouldn't have to bend your mind into conforming to some paradigm that is
> inappropriate to the problem domain;

Definitely.

> rather, the language should be capable of adjusting.

The programmer should be capable of adjusting. The adjustability of
the language is only relevant if you HAVE to use that language.

> You shouldn't be *prevented* from using the semantics of a
> a stack machine, nor *required* to use it.

I can't be prevented or required. I'll try to choose the right
language for the job, whether or not it's embedded in another
language. I'm free to do that, because I understand the different
languages. You were a little less free, because you didn't know of the
existence of one very different language model.

Now you know.

-Billy