From: Marty Hall
Subject: Inner Classes in Java
Date: 
Message-ID: <x5n2x4utt2.fsf@rsi.jhuapl.edu>
The Java people at Sun have announced a proposal for "Inner Classes"
in Java 1.1. They've already announced an OO "funcall", whereby you can
take an object and a method name and invoke it. This new proposal
appears to add first-class function (object) and closure
capability by introducing the ability to define classes within other
classes (and return instances of them to outside methods), define
anonymous classes at runtime, etc. The proposal is apparently in limbo,
as many people in the Java crowd don't see its benefits.

I'm not certain of the impact of having to wrap functions in
objects instead of manipulating them directly, but this still appears
to be an area in which the Lisp and Dylan communities are more
experienced than the Java community. If anyone is interested in commenting:

(A) The spec is available at 
<http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>

(B) Sun is accepting comments at
<···················@lukasiewicz.Eng.Sun.COM>

(C) There is some discussion of this going on in
comp.lang.java.advocacy. Search for "Inner Classes" in the Subject
line.

Cheers-
					- Marty

Lisp Programming Resources: <http://www.apl.jhu.edu/~hall/lisp.html>
Java Programming Resources: <http://www.apl.jhu.edu/~hall/java/>

From: Lyman S. Taylor
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <55ali0$il4@pravda.cc.gatech.edu>
In article <··············@rsi.jhuapl.edu>,
Marty Hall  <····@apl.jhu.edu> wrote:
>The Java people at Sun have announced a proposal for "Inner Classes"
>for Java 1.1. They've already announced an OO "funcall", whereby you can
...
>
>(A) The spec is available at 
><http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>

  Haven't finished reading all of it yet, but

  <http://www.javasoft.com/products/JDK/1.1/designspecs/>

  outlines that this feature has now been removed from JDK 1.1.

  [ In an initial scan, I'm not convinced that this is going to give you
    "closures" per se. In Pascal you can define a procedure within a
    procedure. While a nested lexical scope, it isn't a "closure":
    there is still no way to dynamically create a new method and return
    it. For Java, the VM creates all classes/methods. In Pascal it is
    illegal, period. ]
-- 
Lyman S. Taylor           Comment by a professor observing two students 
(·····@cc.gatech.edu)     unconscious at their keyboards:
				"That's the trouble with graduate students.
				 Every couple of days, they fall asleep."
From: Marty Hall
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <x5pw1zt0qp.fsf@rsi.jhuapl.edu>
·····@cc.gatech.edu (Lyman S. Taylor) writes:

> 
> In article <··············@rsi.jhuapl.edu>,
> Marty Hall  <····@apl.jhu.edu> wrote:
> >The Java people at Sun have announced a proposal for "Inner Classes"
> >for Java 1.1. They've already announced an OO "funcall", whereby you can
> ...
> >
> >(A) The spec is available at 
> ><http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>
>   Haven't finished reading all of it yet but
> 
>   <http://www.javasoft.com/products/JDK/1.1/designspecs/>
> 
>   outlines that this feature is now removed from the JDK 1.1

According to the Sun people, it is actually undecided. Some of them
have said that it is more likely than not that Inner Classes will be
included. But it is certainly not guaranteed. My impression was that
the comments they receive at ···················@lukasiewicz.Eng.Sun.COM
would be likely to influence the decision. Perhaps not, though.

>   [ In an initial scan, I'm not convinced that this is going to give you
>     "closures" per se. In Pascal you can define a procedure within a
>     procedure. While a nested lexical scope, it isn't a "closure":
>     there is still no way to dynamically create a new method and return
>     it. ...]

I'm not following this. You can define a nested class with methods
that can access local variables and methods in the enclosing class,
and then return an instance of that local class to people *outside*
the enclosing class.

I'm *certainly* not arguing that the Java way is better, just that it
looks like a closure to me -- just wrapped up in an object. E.g., the
following seem pretty similar (sorry for not doing it in Dylan too;
it's been a while):

(defun return-closure ()
  (let ((a (foo))
        (b (bar)))
    #'(lambda (<args>) (baz a b))))

(setq test (return-closure))
(funcall test <values>)

===================================================================

public Object returnClosure() {
  SomeClass a = foo();
  AnotherClass b = bar();
  return
    new Object() {
      public Class doIt(<args>) { return(baz(a, b)); }
    };
}

test = returnClosure();
test.doIt(<values>);

===================================================================
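For readers who want something they can actually run, here is a
compilable version of the Java half of the comparison. The names
(`Adder`, `makeAdder`) are illustrative, not from the spec; note that
the captured local must be declared `final` for the anonymous class to
close over it.

```java
// Illustrative sketch of the closure-in-an-object idiom, with an
// interface standing in for the "function type".
interface Adder {
    int add(int x);
}

public class ClosureDemo {
    // Returns an object whose method is closed over n, analogous to
    // the Lisp closure returned by RETURN-CLOSURE above.
    static Adder makeAdder(final int n) {
        return new Adder() {
            public int add(int x) { return x + n; }
        };
    }

    public static void main(String[] args) {
        Adder add5 = makeAdder(5);
        System.out.println(add5.add(2));  // prints 7
    }
}
```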

Cheers-
					- Marty

Lisp Resources: <http://www.apl.jhu.edu/~hall/lisp.html>
From: Oleg Moroz
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <32791bb6.14446503@news-win.rinet.ru>
On 31 Oct 1996 11:53:20 -0500, ·····@cc.gatech.edu (Lyman S. Taylor) wrote:

>  [ In an initial scan, I'm not convinced that this is going to give you
>    "closures" per se. In Pascal you can define a procedure within a
>    procedure. While a nested lexical scope, it isn't a "closure":
>    there is still no way to dynamically create a new method and return
>    it. For Java, the VM creates all classes/methods. In Pascal it is
>    illegal, period. ]

The main difference between a Pascal-style inner procedure and a real
closure is that the closure is closed over its free variables. The
closure can outlive the invocation of the enclosing block, and more than
one closure can share the same code at the same time while each has its
own private data (from the parent's invocation). Java inner classes
(those defined inside a method) have this property, so they are more
like closures, not just lexically-scoped procedures.
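The two properties Oleg names -- outliving the enclosing invocation, and
multiple instances sharing code while keeping private state -- can be
sketched like this (names such as `makeCounter` are illustrative, not
from the thread):

```java
interface Counter {
    int next();
}

public class CounterDemo {
    static Counter makeCounter(final int start) {
        // The anonymous class instance outlives this invocation and
        // carries its own count, independent of other instances made
        // from the same code.
        return new Counter() {
            private int count = start;
            public int next() { return count++; }
        };
    }

    public static void main(String[] args) {
        Counter a = makeCounter(0);
        Counter b = makeCounter(100);
        a.next();
        a.next();
        System.out.println(a.next());  // prints 2
        System.out.println(b.next());  // prints 100: b has its own state
    }
}
```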

Oleg
From: Per Bothner
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <55baon$5hn@rtl.cygnus.com>
In article <··········@pravda.cc.gatech.edu>,
Lyman S. Taylor <·····@cc.gatech.edu> wrote:
>  [ In an initial scan, I'm not convinced that this is going to give you
>    "closures" per se. In Pascal you can define a procedure within a
>    procedure. While a nested lexical scope, it isn't a "closure":
>    there is still no way to dynamically create a new method and return
>    it. For Java, the VM creates all classes/methods. In Pascal it is
>    illegal, period. ]

I have read the InnerClasses spec.
I have also implemented a compiler from Scheme to Java bytecodes.
(See ftp://ftp.cygnus.com:pub/bothner/kawa-1.0.tar.gz.)
The way the InnerClasses spec would implement nested classes is
essentially the way I implement Scheme closures in Java.
In other words: InnerClasses does give you closures.

-- 
	--Per Bothner
Cygnus Support     ·······@cygnus.com
From: Bill St.Clair
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <327A043E.35F3@digitool.com>
Lyman S. Taylor wrote:
> 
> In article <··············@rsi.jhuapl.edu>,
> Marty Hall  <····@apl.jhu.edu> wrote:
> >The Java people at Sun have announced a proposal for "Inner Classes"
> >for Java 1.1. They've already announced an OO "funcall", whereby you can
> ...
> >
> >(A) The spec is available at
> ><http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>
> 
>   Haven't finished reading all of it yet but
> 
>   <http://www.javasoft.com/products/JDK/1.1/designspecs/>
> 
>   outlines that this feature is now removed from the JDK 1.1

Just today it was returned.
From: Jim Kazoun
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <3278FF2A.7C6A@digitalideas.com>
>The Java people at Sun have announced a proposal for "Inner Classes"
>for Java 1.1. They've already announced an OO "funcall", whereby you can
>take an object and a method name and invoke it. This new proposal
>appears to add first class function (object) and closure
>capability by introducing the ability to make classes within other
>classes but return them to outside methods, define anonymous classes
>at runtime, etc. This proposal is apparently in limbo, as many people
>in the Java crowd don't see the benefits of this.

>I'm not certain of the impact of having to wrap the functions in
>objects instead of manipulating them directly, but this still appears
>to be an area that the Lisp and Dylan communities are more experienced
>than the Java community. If anyone is interested in commenting,

>(A) The spec is available at 
>  <http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>

>(B) Sun is accepting comments at
>  <···················@lukasiewicz.Eng.Sun.COM>

>(C) There is some discussion of this going on in
>comp.lang.java.advocacy. Search for "Inner Classes" in the Subject
>line.

I wish all those concerned would get off their duffs and make some noise
to let Sun know that this is important. The problem is that _most_
programmers have no appreciation for Inner Classes and Parameterized
Types because:
1. They don't know what these terms mean.
2. They've never programmed in languages that have such features, so
   they don't appreciate the functionality and have never used it
   effectively.
3. They enjoy writing switch statements that go on for miles.
4. They love writing the same list of procedures for every class.
5. They hate higher abstraction capabilities, because those allow for
   more efficient code maintenance.

But seriously, I just think that if you've never used these tools in an
effective way, you're not likely to appreciate the functionality.

What surprises me is that Guy Steele would allow such a thing to happen,
unless they have a better solution up their sleeves. Then again, he's
only one person in a commercial entity, and that is too much burden to
put on one person.

Jim Kazoun
From: Marty Hall
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <x5zq12rl5j.fsf@rsi.jhuapl.edu>
Jim Kazoun <···@digitalideas.com> writes:

> ····@apl.jhu.edu wrote:
> >The Java people at Sun have announced a proposal for "Inner Classes"
> >for Java 1.1. They've already announced an OO "funcall", whereby you can
> >take an object and a method name and invoke it. This new proposal
> >appears to add first class function (object) and closure
> >capability by introducing the ability to make classes within other
> >classes but return them to outside methods, define anonymous classes
> >at runtime, etc. This proposal is apparently in limbo, as many people
> >in the Java crowd don't see the benefits of this.
[...]
> >(A) The spec is available at 
> >  <http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html>
> 
> >(B) Sun is accepting comments at
> >  <···················@lukasiewicz.Eng.Sun.COM>
[...]
> I wish all those concerned would get off their duff and make some noise
> and let SUN
> know that this is important. 

The ···················@lukasiewicz.Eng.Sun.COM address is the place
to do this, IMHO.
						- Marty
From: John Brewer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <jbrewer-0511961338320001@news.mcs.com>
In article <·············@digitalideas.com>, ···@digitalideas.com wrote:

> I wish all those concerned would get off their duffs and make some noise
> to let Sun know that this is important. The problem is that _most_
> programmers have no appreciation for Inner Classes and Parameterized
> Types because:
> 1. They don't know what these terms mean.
> 2. They've never programmed in languages that have such features, so
>    they don't appreciate the functionality and have never used it
>    effectively.
> 3. They enjoy writing switch statements that go on for miles.
> 4. They love writing the same list of procedures for every class.
> 5. They hate higher abstraction capabilities, because those allow for
>    more efficient code maintenance.
> 
> 

Or maybe when they try to learn about these things by reading the
newsgroups about dynamic languages, they see themselves being ridiculed,
and go back to a language that'll pay the bills.

Seriously, I would hate to see the Java Language or the Java VM ruined by
an attack of creeping elegance.  The Java VM's got to run on cellular
phones and GameBoy-type devices.  I hope the Java VM architects in
particular keep in mind that ease of porting needs to be a primary goal,
not lots of extra features.

-- 
John Brewer             Senior Software Engineer             Spyglass, Inc.
    Opinions expressed do not necessarily reflect those of my employer.
   Heck, I'm old enough to remember the _first_ time Apple was "doomed".
From: ···@digitalideas.com
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <32808216.4BB@digitalideas.com>
> Or maybe when they try to learn about these things by reading the
> newsgroups about dynamic languages, they see themselves being ridiculed,
> and go back to a language that'll pay the bills.

That message was NOT in reply to anyone seeking help; it was a _call
for action_ when Marty raised the issue of Inner Classes being removed
from the main index of JDK 1.1.
I guess I am sensitive about that, because I think of myself as a kind
and helpful person :)

> Seriously, I would hate to see the Java Language or the Java VM ruined by
> an attack of creeping elegance.  The Java VM's got to run on cellular
> phones and GameBoy-type devices.  I hope the Java VM architects in
> particular keep in mind that ease of porting needs to be a primary goal,
> not lots of extra features.

Well, I am not worried about that. Inner Classes and Parameterized Types
are being implemented as source code transformations that _do not
affect_ the Java VM. It is simply a way to make it _practical_ to use
the more advanced features of JAVA. This will not affect portability.
Besides, this is the best time to make additions to rectify any
shortcomings of the language, before there is too much at stake.
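The source-level transformation Jim alludes to can be sketched roughly
as follows. The `DesugaredInner` name is hypothetical, and the real
compiler scheme differs in details (e.g. it uses synthetic accessors to
reach private members), but the key point holds: the VM only ever sees
ordinary top-level classes.

```java
// An enclosing class with some state an inner class would want.
class Outer {
    private int secret = 42;
    int reveal() { return secret; }
}

// Roughly what "class Inner { int get() { return secret; } }" nested
// inside Outer compiles down to: an ordinary class carrying an
// explicit enclosing-instance reference (the synthesized this$0 link).
class DesugaredInner {
    private final Outer outer;
    DesugaredInner(Outer outer) { this.outer = outer; }
    int get() { return outer.reveal(); }
}

public class DesugarDemo {
    public static void main(String[] args) {
        Outer o = new Outer();
        DesugaredInner inner = new DesugaredInner(o);
        System.out.println(inner.get());  // prints 42
    }
}
```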

JAVA is destined, IMHO, to become the reference language for computer
programming, and it should be done right. Elegance is not some fancy
feature, but the expression of well-abstracted ideas put in code.
Programming is about abstraction, at least that is what I think, and it
is difficult to encapsulate code without such tools. What is so
attractive about JAVA is that with such capabilities it becomes possible
to enhance productivity, as so much programming productivity has been
wasted as we stumbled our way to JAVA.

I wept for the death of the Lisp machines, and predicted way back then
that the concept of a machine designed from the ground up (processor,
operating system, and top language) around a powerful language was the
correct one and would be the future of computing. I was off by about 13
years. The only missing part was that I thought this yet-to-be-born
language would itself be a target language. It may still come to pass.

I am delighted to see JAVA flower as it has, and I do share your
concern; portability is the _primary_ cause for its adoption, according
to a recent InfoWorld survey.

Jim Kazoun
From: William Paul Vrotney
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <vrotneyE0Hr4t.LM4@netcom.com>
In article <············@digitalideas.com> ···@digitalideas.com writes:

> 
> JAVA is destined, IMHO, to become the reference language for computer
> programming, and it should be done right. Elegance is not some fancy
> feature, but the expression of well-abstracted ideas put in code.
> Programming is about abstraction, at least that is what I think, and it
> is difficult to encapsulate code without such tools. What is so
> attractive about JAVA is that with such capabilities it becomes possible
> to enhance productivity, as so much programming productivity has been
> wasted as we stumbled our way to JAVA.
> 
>  I wept for the death of the Lisp machines, and predicted way back then
> that the concept of a machine designed from the ground up (processor,
> operating system, and top language) around a powerful language was the
> correct one and would be the future of computing. I was off by about 13
> years. The only missing part was that I thought this yet-to-be-born
> language would itself be a target language. It may still come to pass.
> 

Right on!  First, I must say that I appreciate someone standing up for
"elegance".  People trapped into using inelegant programming tools tend
to throw this word around the way conservatives say the "L word" for
"liberal" (sorry, no offense to conservatives).  It is as if they intend
"elegance" to imply "Man, you are so pie-in-the-sky, the code your kind
produces must be wrong".  Who knows what they are actually thinking when
they say the "E word".  In fact, it turns out (fortunately) that elegant
code is usually more efficient and stable.  Einstein's quote on
"simplicity" applies to programming as well as physics.  The hard part
is teaching people to appreciate the "and no simpler" part, and how
hard, but not impossible, that is to achieve.

Second (and I will get flamed for this, since I did years ago when
posting the same, but I don't care): the reason the Lisp Machine died in
the first place, *IMHO*, is that it was *not* a Lisp Machine.  What I
mean by that is that on the Lisp Machine you still had to compile a
program.  Imagine a new Lisp Machine where you only needed to do a
"read" on an expression and the machine interpreted the translated
expression codes.  Could you imagine advertising a machine where "No
Compiles are Necessary, yet still Faster than any RISC or CISC"?  This
would give Lisp a new image, and elegance would become a respected word
again.  I know that people are going to say "You left out the 'no
simpler' part" and "good luck".  Think of it this way: Kennedy didn't
let that bother him when he was crazy enough to proclaim that Americans
were going to put a man on the moon.  Of course, he was just one of
those crazy "L words".  And then again, there was the "C word" guy who
proclaimed Star Wars.  Oh well.

Sorry for mixing in some politics, but if you think about it, politics has a
lot to do with why Lisp is going through some tough times.  Some of the "D
word" people can tell you that.  And good luck with the JAVA machine, I hope
it doesn't wind up making espresso for the "J word" people.  Ha Ha, no one
can say the "L word" people, that is already reserved by the conservatives.
:-)

-- 

William P. Vrotney - ·······@netcom.com
From: Rainer Joswig
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <joswig-ya023180000711961303200001@news.lavielle.com>
In article <·················@netcom.com>, ·······@netcom.com (William Paul
Vrotney) wrote:

> Second (and I will get flamed for this since I did years ago when posting
> same, but don't care), the reason the Lisp Machine died in the first place,
> *IMHO*, is that it was *not* a Lisp Machine.  What I mean by that is that on
> the Lisp Machine you still had to compile a program.  Imagine a new Lisp
> Machine where you only needed to do a "read" on an expression and the
> machine interpreted the translated expression codes.  Could you imagine
> advertising a machine where "No Compiles are Necessary yet still Faster than
> any RISC or CISC"?  This would give Lisp a new image and elegance would
> become a respected word again.

Unless you describe in more detail what you mean, the
above paragraph sounds very unclear to me (I would
like to argue against it ;-) ).

Rainer Joswig
From: M'Isr
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <563q42$egb$1@mhafc.production.compuserve.com>
>>a Lisp Machine with no compilers
Where can I get one, and/or is it possible
to build one? How 'bout a special program that
translates all C/C++ code into Lisp?
It should be easy to do in Lisp.
I've played with self-reproducing cellular automata (virii, for the
 non tech-heads) but need help with the net details.
In Lisp:
  Data = Program
  Everything is first class
  Lisp has a uniform syntax: () ;the parens
"Such a project should be easy and will
show how elegant Common Lisp is"

P.S. If anyone is interested in a worm written
in ANSI Common Lisp that runs in interpreted mode,
email:
 ··········@Orion.Crc.Monroecc.Edu
or
 reply to this post
by December (or sooner); the sooner the better.

-- 
We make the software that powers tomorrow!
From: Richard A. O'Keefe
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <56bukd$l8p$1@goanna.cs.rmit.edu.au>
············@wildcard.demon.co.uk (Cyber Surfer) writes:

>My own opinion is that Lisp machines "died" because not enough people
>wanted them. It's that simple. I like the idea of Lisp machines, though
>I've never used one, and probably never will (unless I'm lucky). That's
>because such a machine will be seen as a machine that only runs code
>written in one language. However fine that language may be, many will
>see that "limitation" as a defect.

The Xerox Lisp machines (1108, 1109, 1185, 1186) ran
 - MESA ("industrial strength Pascal", it's been called)
 - Cedar (I think)
 - Smalltalk
 - Lisp
 - Prolog
They had full support for 16-bit characters back when most UNIX systems
still used 7-bit characters (i.e. were not 8-bit clean).  They also ran
whatever the "Star" software was written in; I've no idea what that was.

The Symbolics Lisp machines ran
 - Lisp
 - Prolog 
 - Fortran
 - C
 - Pascal (I think)
All of these could run together in the same program.

>Ironically, Lisp machines
>tend to be rather expensive, which makes it very unlikely that many
>people will use them.

Economics.  The Xerox Dorados (very good in their day) were once
described to me as "hand-built by leprechauns".  They never got the
customer base to switch to volume production.  (They were getting a
lot better at the hardware side.)  A lot of it was marketing.  I used
to work at Quintus, who wrote the Prolog system for the D-machines.
Xerox wouldn't let us have their customer list; they insisted that
only _their_ salesmen would be allowed to sell our product (which
Xerox had paid several $million to have on their hardware).  Their
salesmen only sold about 30 copies!  (It never occurred to them that
they might have been able to sell Lisp machines to people who wanted
Prolog machines...)

>Before you can sell Lisp to people, you must sell the idea that Lisp
>is something they want. Issues like "interpretered vs compiled" are
>irrelevant when most people have a bias based on the _language_ itself.

Symbolics are still in business, and you can still buy Xerox Lisp.
Maybe they should port PERL to their systems.

>How do we get Lisp memes to kick ass? Make Lisp look like Java.

It's called Dylan.  Hasn't helped so far.

>As for Lisp machines, well. Doesn't Open Genera show that you can
>do it using a relatively "conventional" machine? OSF/1 on an Alpha
>can run Open Genera, but it can also run apps not written in Lisp.

Xerox Lisp (now called Medley) has been running on SPARC/UNIX/X
systems for _years_.

-- 
Mixed Member Proportional---a *great* way to vote!
Richard A. O'Keefe; http://www.cs.rmit.edu.au/%7Eok; RMIT Comp.Sci.
From: Rainer Joswig
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <joswig-ya023180001311961121520001@news.lavielle.com>
In article <············@goanna.cs.rmit.edu.au>, ··@goanna.cs.rmit.edu.au
(Richard A. O'Keefe) wrote:

> The Symbolics Lisp machines ran
>  - Lisp
>  - Prolog 
>  - Fortran
>  - C
>  - Pascal (I think)

Don't forget Ada. ;-)

> >As for Lisp machines, well. Doesn't Open Genera show that you can
> >do it using a relatively "conventional" machine? OSF/1 on an Alpha
> >can run Open Genera, but it can also run apps not written in Lisp.
> 
> Xerox Lisp (now called Medley) has been running on SPARC/UNIX/X
> systems for _years_.

Don't forget the PC version. ;-)

Rainer Joswig
From: Cyber Surfer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <847960466snz@wildcard.demon.co.uk>
In article <············@goanna.cs.rmit.edu.au>
           ··@goanna.cs.rmit.edu.au "Richard A. O'Keefe" writes:

> ············@wildcard.demon.co.uk (Cyber Surfer) writes:
> 
> >My own opinion is that Lisp machines "died" because not enough people
> >wanted them. It's that simple. I like the idea of Lisp machines, though
> >I've never used one, and probably never will (unless I'm lucky). That's
> >because such a machine will be seen as a machine that only runs code
> >written in one language. However fine that language may be, many will
> >see that "limitation" as a defect.
> 
> The Xerox Lisp machines (1108, 1109, 1185, 1186) ran
>  - MESA ("industrial strength Pascal", it's been called)
>  - Cedar (I think)
>  - Smalltalk
>  - Lisp
>  - Prolog

Please note that I said that Lisp machines will be "seen as"
limited to one language, not that they are limited to one
language. It's unpleasant, but sometimes perceptions can be
more powerful than the reality.

> >Ironically, Lisp machines
> >tend to be rather expensive, which makes it very unlikely that many
> >people will use them.
> 
> Economics.  The Xerox Dorados (very good in their day) were once
> described to me as "hand-built by leprechauns".  They never got the
> customer base to switch to volume production.  (They were getting a
> lot better at the hardware side.)  A lot of it was marketing.  I used
> to work at Quintus, who wrote the Prolog system for the D-machines.
> Xerox wouldn't let us have their customer list; they insisted that
> only _their_ salesmen would be allowed to sell our product (which
> Xerox had paid several $million to have on their hardware).  Their
> salesmen only sold about 30 copies!  (It never occurred to them they
> they might have been able to sell Lisp machines to people who wanted
> Prolog machines...)

Xerox shot themselves in the feet, many times. <sigh> Things
might be very different today if they'd got the sales and marketing
side of things right. Meanwhile, Apple and MS did get it right,
but took a very different approach. The rest is history.

In the early 80s, I'd have loved to have used an Alto or a
Dorado. Today, I can still admire them, but I feel a great
sadness.
 
> >Before you can sell Lisp to people, you must sell the idea that Lisp
> >is something they want. Issues like "interpretered vs compiled" are
> >irrelevant when most people have a bias based on the _language_ itself.
> 
> Symbolics are still in business, and you can still buy Xerox Lisp.
> Maybe they should port PERL to their systems.

Maybe they should! Or a VB clone... However, their prices may
alienate most of the people likely to use VB. Hmmm, Java?
 
> >How do we get Lisp memes to kick ass? Make Lisp look like Java.
> 
> It's called Dylan.  Hasn't helped so far.

It's early yet. I'm still hopeful. On the other hand, I write
code for Windows, so I need a lot of hope! This means I have
big expectations (i.e. _full_ Win32 support). I'd like to someday
stop using C/C++, and still have a job.
 
> >As for Lisp machines, well. Doesn't Open Genera show that you can
> >do it using a relatively "conventional" machine? OSF/1 on an Alpha
> >can run Open Genera, but it can also run apps not written in Lisp.
> 
> Xerox Lisp (now called Medley) has been running on SPARC/UNIX/X
> systems for _years_.
 
I'd rather have an Alpha. ;-) It's just as unlikely, of course.
What've you got for Win32 and costs less than $2000? If all I
wanted was Lisp, I'd be using Linux and CMU CL, but I'd be happy
with ACL Lite if people didn't want stand-alone apps.

Most people prefer not to use Unix. That seriously cuts the size
of your market, but it's still larger than the market for Lisp
machines. It's all relative. If you don't have to worry about
such things, great, but you're an exception.

Wasn't this thread about inner classes in Java, a long time ago? ;)
-- 
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Future generations are relying on us
It's a world we've made - Incubus
We're living on a knife edge, looking for the ground -- Hawkwind
From: Arun Welch
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <AWELCH.96Nov23214321@awelch.inhouse.compuserve.com>
>>>>> " " == Cyber Surfer <············@wildcard.demon.co.uk> writes:

    >> Symbolics are still in business, and you can still buy Xerox
    >> Lisp.  Maybe they should port PERL to their systems.

     > Maybe they should! Or a VB clone... However, their prices may
     > alienate most of the people likely to use VB. Hmmm, Java?
 
$500 is too expensive for the VB crowd? Wow, there must be a lot of
pirate copies out there.

...arun
---------------------------------------------------------------------------
Arun Welch					5000 Arlington Centre Blvd
Lead Engineer, Internet R&D			Columbus, OH 43220
CompuServe      				······@compuserve.net
From: Cyber Surfer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <848861313snz@wildcard.demon.co.uk>
In article <····················@awelch.inhouse.compuserve.com>
           ······@awelch.inhouse.compuserve.com "Arun Welch" writes:

> $500 is too expensive for the VB crowd? Wow, there must be a lot of
> pirate copies out there.

ACL for Windows costs $2500. That's more than 8 times as much as
VC++. If there's a Lisp (we were discussing Lisp, at some point)
available commercially for $500 that can compile to an .EXE or .DLL,
support callbacks, and put the runtime code in a DLL, then it'll
be _very_ popular. Once that's available, then we can ask for
support for stuff like OCX, OLE, etc. Maybe even multi-threading.

Smalltalk MT can do this stuff _now_, and it costs $295.

I really feel that we should change the subject for this thread,
tho I can't right now think of an appropriate alternative. "Where
is Lisp going?" might be good, or possibly even, "Where should
Lisp be going?", as that distinguishes between what we'd _like_,
and what we'll actually get.

My feeling is that I may write more Smalltalk code than Lisp code
next year. As long as I write more code using ST _and_ Lisp than
C++, I'll be happy. ST and Lisp are closer to each other than either
is to C++, in the sense that they can be a joy to use, while C++
is rarely better than painful to use.
-- 
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Have you seen the Browser Bunny? | Glingleglinglebloodyglingle
From: Ian Joyner
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <32867934.848@syacus.acus.oz.au>
Genericity is essential, but inner classes are superfluous.

Genericity gives you an important way of building new types out of
existing types. Since it does something different from inheritance,
it is orthogonal to inheritance, and for a truly powerful type-based
language it is essential.

Inner classes, however, add nothing to the modelling capability of
the language. There are no new types you can build with them that you
can't otherwise build with the common has-a relation. The need for
inner classes has only to do with project organisation, and the
benefits there are rather dubious, and more likely outweighed
by the disadvantages: the inner class has visibility into the outer
class, without the good-OO-design benefit of accessing it only
through interfaces. This in turn reduces reusability.

So, genericity yes, inner classes no.
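Java had no genericity when this was posted; as a sketch of what Ian is
asking for, here is the parameterized-type style Java eventually adopted
(Java 5 generics; the `Stack` class name is illustrative). One
definition yields a new, statically checked type for every element
type, which is the "building new types out of existing types" point,
orthogonal to inheritance.

```java
import java.util.ArrayList;
import java.util.List;

// A parameterized container: Stack<String>, Stack<Integer>, etc. are
// all distinct, type-safe instantiations of this one definition.
class Stack<T> {
    private final List<T> items = new ArrayList<T>();
    void push(T item) { items.add(item); }
    T pop() { return items.remove(items.size() - 1); }
    boolean isEmpty() { return items.isEmpty(); }
}

public class GenericityDemo {
    public static void main(String[] args) {
        Stack<String> s = new Stack<String>();
        s.push("a");
        s.push("b");
        System.out.println(s.pop());  // prints b
        // s.push(42);  would be rejected at compile time
    }
}
```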
-- 
Ian Joyner           |"for when lenity and cruelty play   |All opinions are
Unisys (ACUS)        | for a kingdom, the gentler gamester|personal and are not
···@syacus.acus.oz.au| is the soonest winner" W.S. Henry V|Unisys official comment
From: Marty Hall
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <x5vibj6l12.fsf@rsi.jhuapl.edu>
·······@spyglass.com (John Brewer) writes:
> 
> In article <·············@digitalideas.com>, ···@digitalideas.com wrote:
> 
> > I wish all those concerned would get off their duff and make some noise
> > and let SUN know that this [Inner Classes] is important. 
[...]
> Or maybe when they try to learn about these things by reading the
> newsgroups about dynamic languages, they see themselves being ridiculed,
> and go back to a language that'll pay the bills.

I certainly agree that you don't learn how to really use these
things from reading the newsgroups. Are you also suggesting that
closures have no use in real-life software development projects? If
so, you might want to see a recent thread in comp.lang.java.advocacy
that gave some examples of how Inner Classes let you more
easily/quickly do some mundane, ordinary, easy, low-level (is that
enough to count as paying the bills :-) tasks like writing callbacks
for button events and avoiding repeating code when doing repetitive
graphics tasks. Besides, although many programmers never do the
high-level tasks that are most commonly associated with benefiting
from closures and first-class functions, *some* people do (me for
instance :-). 
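For concreteness, here is a minimal sketch of the closure-like capability involved (the `Callback` interface and `makeCounter` helper are my own stand-ins, not the AWT API): an anonymous inner class implements a one-method interface while capturing a final local variable. The real-world button case has the same shape, e.g. `button.addActionListener(new ActionListener() {...})`.

```java
// Hypothetical Callback interface and makeCounter helper, for illustration only.
public class InnerDemo {
    interface Callback {            // stand-in for a listener type like ActionListener
        int run(int event);
    }

    static Callback makeCounter() {
        final int[] count = {0};    // captured local state; must be final per the spec
        return new Callback() {     // anonymous inner class acting like a closure
            public int run(int event) {
                count[0] += event;  // mutates state shared across invocations
                return count[0];
            }
        };
    }

    public static void main(String[] args) {
        Callback c = makeCounter();
        System.out.println(c.run(2)); // prints 2
        System.out.println(c.run(3)); // prints 5
    }
}
```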

> Seriously, I would hate to see the Java Language or the Java VM ruined by
> an attack of creeping elegance.  The Java VM's got to run on cellular
> phones and GameBoy-type devices.  I hope the Java VM architects in
> particular keep in mind that ease of porting needs to be a primary goal,
> not lots of extra features.

The Inner Classes spec very specifically says that it will not require
*any* changes in the Java VM.

To recapitulate, the spec is available at
http://www.javasoft.com/products/JDK/1.1/designspecs/innerclasses/index.html
and the email address for comments on the potential value (and whether
it should be included in Java 1.1) is
···················@lukasiewicz.Eng.Sun.COM. 

					- Marty

Lisp Programming Resources: <http://www.apl.jhu.edu/~hall/lisp.html>
Java Programming Resources: <http://www.apl.jhu.edu/~hall/java/>
From: Cyber Surfer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <847786545snz@wildcard.demon.co.uk>
In article <···············@best.com>
           ···@intentionally.blank "Thomas Breuel" writes:

> ············@wildcard.demon.co.uk (Cyber Surfer) writes:
> > My own opinion is that Lisp machines "died" because not enough people
> > wanted them. It's that simple.
> 
> Actually, it's probably even simpler: for the same budget, you could
> either buy Lisp machines for a fraction of your students/researchers,
> or you could buy Sun workstations with similar raw compute power and
> better screens for all of them.  The Lisp machines were true single
> user machines, making it even difficult for a new user to log
> in without rebooting.  The Suns, on the other hand, could 
> support multiple simultaneous users and terminals if need be.

That's just an explanation of why people don't want them. ;-) Yes,
I agree. Very few people want an all-Lisp machine. Most people
want single-user machines, and don't want Unix. I'm not talking
about "most Lisp people", but the kind of people that buy software
in large enough numbers to make an impact.

Obviously, there are a number of people who will want Lisp machines,
but are they enough to keep Lisp machines alive? Apparently not.
Will there be enough Open Genera users to keep Symbolics alive?
We'll see. The fact that very few people use Lisp machines today
doesn't help support the idea (err, meme?) that this would be a
good thing.
 
> Even if individual productivity was lower on the Suns, aggregate
> productivity for a department may well have been higher, since people
> didn't have to spend their time twiddling their thumbs waiting for
> machines.  If Lisp machines had cost only a little more than Suns for
> similar performance, they'd probably still be very popular in the
> research and academic community.

Is Open Genera cheap enough to compete with Suns? I dunno. I'm
not that kind of a user, nor are any of the people I know. Even
a Sun is expensive, and so the popularity of Lisp on Suns isn't
that significant, except to people who use Suns.

Most people don't even use Unix...
-- 
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Future generations are relying on us
It's a world we've made - Incubus
We're living on a knife edge, looking for the ground -- Hawkwind
From: William Paul Vrotney
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <vrotneyE0wBzz.D5A@netcom.com>
In article <··············@lutefisk.jetcafe.org> ···@lutefisk.jetcafe.org (James Stephen Larson) writes:
In article <·················@netcom.com> ·······@netcom.com (William Paul Vrotney) writes:
        ... read type Lisp machine stuff ...
> 
> That's an interesting analysis of the fate of Lisp Machines.  I always
> thought that it had to do with the advances in price/performance
> ratios of general-purpose hardware compared to special-purpose
> hardware.
> 
> If you don't compile then you forsake compile-time optimizations.  If
> you don't use an existing OS you have to spend all of your time
> writing device drivers.  If you don't use general-purpose hardware
> then you spend all of your money building soon-to-be-obsolete chips.

What makes you think that a Lisp interpreter machine has to be a special
purpose machine?  Why is an Intel 486 interpreter any more general than a
Lisp interpreter?  You can say that the Lisp interpreter machine is "higher
level", but that should make the hardware faster.  (I don't know what to say
about the RISC.  Perhaps just a temporary hardware trick, but a good one at
that :-)) The reason for building floating point functions into hardware is
to make the processor faster.  In some sense a Lisp interpreter machine is
*more* general than a 486 interpreter machine.  The sense being that it is
implemented using higher level abstractions leaving details to specific
implementations.  Implementing a Lisp *language* by contrast into hardware
and/or OS combination *would* make the machine a special purpose machine as
was done in the past.

My original analysis of the fate of Lisp Machines suggested that maybe they
were *not* true Lisp machines (whatever that means, it still needs to be
discovered, I defined it roughly by attempting to eliminate a compiler for
Lisp code).  They were great machines, but they were shall we say more like
"Lisp Assisted" machines.  For example consider the Symbolics machine
language and FEP (Front End Processor).  Another aspect of Lisp machines
that is orthogonal but sometimes confused with the hardware is the Lisp OS.
There is no reason why a new Lisp machine could not run a Unix kernel for
example.  The fact that the Symbolics had no Unix implementations (at least
no well known ones that I know of) was another aspect that gave it the
unique beast image.  Another ill-fated aspect is that except for the TI Lisp
chip (and other kinds of architectures) the Lisp machines were old fashioned
hardware technology, now better suited to heating your home.

> 
> Avoiding general-purpose hardware and software can only doom efforts
> to use high-level languages.
> 

Exactly!  But surprisingly this could be a good thing.  Imagine if we could
program in either a Lisp assembly language, if you will, or macro expanded
language derivations.  It might be possible to eliminate many of the
problems associated with high level language compilers.  Compilers are some
of the most difficult programs to do well.  If I recall correctly I think
that Knuth left compilers for his last chapter or thereabouts.

-- 

William P. Vrotney - ·······@netcom.com
From: Chris Page
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <AEB1790A-ACB8A@205.149.164.180>
William Paul Vrotney <·······@netcom.com> wrote:
> In article <··············@lutefisk.jetcafe.org> ···@lutefisk.jetcafe.org
> (James Stephen Larson) writes:
> In article <·················@netcom.com> ·······@netcom.com (William Paul
> Vrotney) writes:
[...]
> 
> What makes you think that a Lisp interpreter machine has to be a special
> purpose machine?  Why is an Intel 486 interpreter any more general than a
> Lisp interpreter?  You can say that the Lisp interpreter machine is
> "higher level", but that should make the hardware faster.
[...]
> In some sense a Lisp interpreter machine is *more* general than a 486
> interpreter machine.

I can see your point, and it reminds me of a comment a friend made about
how these architectures are, in a sense, C-machines. But I'm only familiar
with 680x0, PowerPC, and 80x86 hardware. I'd like to see an example or two
of the higher-level features of your proposed Lisp-machine that could be
executed more efficiently than, say, an equivalent series of PowerPC
instructions. In addition, I'd like to see how these might enhance
execution of languages other than Lisp, or at least not slow them down or
waste valuable logic gates.

> Imagine if we could
> program in either a Lisp assembly language, if you will, or macro expanded
> language derivations.  It might be possible to eliminate many of the
> problems associated with high level language compilers.  Compilers are some
> of the most difficult programs to do well.  If I recall correctly I think
> that Knuth left compilers for his last chapter or thereabouts.

Perhaps that's because decent compilers spend their lives proving things
about your code in order to optimize it, which is a harder problem than
parsing and generating brain-dead "correct" code. But I'm not sure this
will ever go away, because code is (usually) executed more than once, and
there is always some gain to be had by optimizing the code up front (global
optimizations take time). Then again, perhaps this only argues for
optimizers, not compilers?

...........................................................................
Chris Page                                       define module usenet-post
Dylan Hacker                                       use dylan;
Harlequin, Inc.                                    use standard-disclaimer;
<http://www.best.com/~page/>                     end module usenet-post;
From: Andreas Bogk
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <y8ad8xf9yjr.fsf@hertie.artcom.de>
>>>>> "Chris" == "Chris Page" <····@best.com> writes:

    Chris> I can see your point, and it reminds me of a comment a
    Chris> friend made about how these architectures are, in a sense,
    Chris> C-machines. But I'm only familiar with 680x0, PowerPC, and

This is literally true. Modern CPU design is basically a cycle of
specifying a CPU, writing an emulator, porting a C compiler, and
seeing how SPICE, TeX and GCC perform on the emulator. This is
repeated until they perform well.

    Chris> 80x86 hardware. I'd like to see an example or two of the
    Chris> higher-level features of your proposed Lisp-machine that
    Chris> could be executed more efficiently than, say, an
    Chris> equivalent series of PowerPC instructions. In addition,

I'm not following the thread well enough, but hardware-assisted
garbage collection is a feature that is badly missing from modern
CPUs.

    Chris> I'd like to see how these might enhance execution of
    Chris> languages other than Lisp, or at least not slow them down
    Chris> or waste valuable logic gates.

Dylan would benefit, too.

Andreas

-- 
The truth about COM: No inheritance!
From: William Paul Vrotney
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <vrotneyE14A76.EEx@netcom.com>
In article <··············@205.149.164.180> "Chris Page" <····@best.com> writes:

> 
> William Paul Vrotney <·······@netcom.com> wrote:
> > In article <··············@lutefisk.jetcafe.org> ···@lutefisk.jetcafe.org
> > (James Stephen Larson) writes:
> > In article <·················@netcom.com> ·······@netcom.com (William Paul
> > Vrotney) writes:
> [...]
> > 
> > What makes you think that a Lisp interpreter machine has to be a special
> > purpose machine?  Why is an Intel 486 interpreter any more general than a
> > Lisp interpreter?  You can say that the Lisp interpreter machine is
> > "higher level", but that should make the hardware faster.
> [...]
> > In some sense a Lisp interpreter machine is *more* general than a 486
> > interpreter machine.
> 
> I can see your point, and it reminds me of a comment a friend made about
> how these architectures are, in a sense, C-machines. But I'm only familiar
> with 680x0, PowerPC, and 80x86 hardware. I'd like to see an example or two
> of the higher-level features of your proposed Lisp-machine that could be
> executed more efficiently than, say, an equivalent series of PowerPC
> instructions. 

Fair enough.  But first I have to say that my proposal stressed the idea of
eliminating the need to compile (at least first for Lisp like languages).
Let's say that we could build a machine that executes Lisp like languages
with no compilation needed and that it could run as fast as, but maybe no
faster than, a PowerPC.  Would we have achieved something, by any metric,
that is worth spending any money on?  Also, I am not qualified to comment
authoritatively on such a machine. Someone like Steele could certainly
comment more substantially on such machine architectures.  But I will try
some ideas below, just don't expect too much.  In what I show below it is
assumed that CDR coding would be used on such a machine.  CDR coding is a
2-bit (or similarly small) field in a cell that allows list expressions to be
stored efficiently as arrays.  It is very similar to the extension bits used
in conventional processor instructions.

First some background.  Note that the system would still have to do in
effect a "read" and macro expansions.  The macro expansion idea is
fundamentally important.  To a Common Lisp interpreter there are basically
only two kinds of instruction control schemas if you will.  One is function
expression evaluation, called "apply" in Common Lisp and the other is macro
expansion.  An implementation is allowed to implement any macro expansion as
a special form.  A special form is like a hard wired macro after it has been
expanded and optimized.  In Common Lisp there are a few "macros" that must
be implemented as special forms.  They are not called macros since some
special forms need to exist primitively.  Some of these that have
conventional processor counterparts are

        special form                    Conventional CPU
        ------------------------------------------------        
        go                              unconditional branch instructions
        if                              branch on condition instructions
        return-from                     Return from jump to subroutine
        setq                            load and store registers

An example of a macro that is not required to be a special form is the
"cond" macro because it can be expanded into a series of "if" special forms.

And then there are some that do not have conventional processor counterparts
but are generally important such as
        
        special form                    Meaning
        ------------------------------------------------        
        let                             bind some variables and execute
        lambda                          similar to let but can be "applied"

There are a few other special forms that have special importance, such as
handling conditions by unwinding stack frames, but not important for a
surface general discussion.

The "lambda" in Common Lisp is not specified as a special form but only as a
type of expression since lambdas are translated away by the compiler.
However in the Read Lisp machine a lambda would be considered more like a
macro or special form since it would not need to be compiled.  The lambda in
Common Lisp describes all of the semantics of functions.  Some conventional
processors have special instructions for calling functions and some have
special instructions to support when the function is called such as saving
and restoring some registers specified by a set of bits.  The lambda would
come closest to those kinds of instructions but with many other important
high level abstractions.  Also the lambda itself expresses the very notion of
computation in its simplest form in a realistic way.  Unlike the simplistic
and impossible to implement (requires an infinite tape) Turing machine the
lambda is quite attractive as a possible hardware implementation idea.

Lambda and "let" are both very important and similar in that implicit in
what the interpreter does with them involves the binding of variables to
values.  The closest thing to this in a conventional processor is loading
registers and stack offsets with values and then having a block of code
refer to those as variables.  This is one of the difficult tasks that is
left to the compiler to figure out.  That is, how to optimize, since the
conventional processor is not designed to do this implicitly.  A large
number of instructions have to be "compiled in" to the machine's memory to
make this happen successfully.  And even then there is no guarantee that the
instructions to do this will not break with an exception.  We just have to
have faith that the compiler writer and compiler user (programmer) are both
diligent.

The binding of variables could be a huge high level savings in execution
speed of the Read Lisp machine.  Especially if on chip caches could be used
in some clever way in conjunction with some design for a hardware binding
mechanism.  The safety of the Read Lisp machine would also be an improvement
because the reliance on the compiler and programmer to arrange all of the
instructions to do all this on the conventional processor would be
eliminated.

However, one of the biggest design tasks of a Read Lisp machine would be in
choosing how to do variable binding.  There have been some attempts at doing
this.  The problem on one hand is that there are so many choices.  Should
the variables be bound using a "deep" or "shallow" binding strategy?  Should
variables have dynamic and local extent?  How would function closures, a
special kind of binding, come into the picture?  How could passing of
environments and continuations help in the mix?  For this surface level
discussion it is not necessary to understand these terms, only that they
present a number of choices for a hardware implementation.  On the other
hand, all of these choices offer more ways that they could be combined to
provide a more efficient hardware implementation.  That is to say that our
choices are not limited.

One big challenge of such a Read Lisp machine would be the efficient
representation and execution of symbolic functions and variable binding and
assignment.  For example if we had three sequential cells in memory

        foo
        arg1
        arg2

and this was to be interpreted as the function call

        (foo arg1 arg2)

the cell "foo" would be a pointer to some kind of object.  Ideally it would
be, as in Lisp software interpreters, a pointer to a symbol object, and the
machine would extract the symbol's function object and apply it to the
following arguments.  This process would ideally be guided by the lambda
form mentioned earlier.  This also suggests some sort of hardware assist for
the lambda form itself.  If designed cleverly this could possibly be faster
than loading some registers, jumping to a subroutine and then saving some
registers.  However it seems that we could do better by having special
registers for such symbolic objects to accelerate these common operations
and avoid as much as possible any indirection usually suffered for software
interpreters on conventional processors.
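In software terms, the dispatch through a symbol's function cell looks like this (a toy sketch of my own, not the hardware design described above):

```java
import java.util.*;
import java.util.function.*;

// Toy symbol table: each symbol name maps to a function object (its "function cell");
// applying (foo arg1 arg2) means fetching foo's function cell and calling it.
public class TinyApply {
    static final Map<String, BiFunction<Integer, Integer, Integer>> functionCell =
        new HashMap<>();

    static int apply(String symbol, int arg1, int arg2) {
        return functionCell.get(symbol).apply(arg1, arg2); // indirect through the cell
    }

    public static void main(String[] args) {
        functionCell.put("foo", (a, b) -> a + b); // install foo's function object
        System.out.println(apply("foo", 2, 3));   // prints 5
    }
}
```

The hardware idea above is to accelerate exactly this indirection with special symbol registers instead of a hash lookup.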

Almost exactly the same principle applies to symbolic variable assignment, for
example the three sequential cells in memory

        setq
        symbol
        value

This time instead of the register facilitating the function call it would
facilitate grabbing the symbol's value cell and setting it to the value.  Binding
variables is similar

        let
        var1
        value1
        var2
        value2
        expression
 
In essence the "var1" and "value1" are facilitated by the same kind of
register, however in the case of a variable binding a special kind of
assignment has to take place.  There could be a clever acceleration here
compared to conventional processors.  This is because the binding mechanism
is closely akin to the necessary extra instructions needed on the
conventional processor to save and restore registers.  That is, one has to
save a register before setting it to a new value, then restore it to the old
value before returning to the block of code from whence it sprang.  In
a similar way a symbol's value can change depending on its environment.
Another difference is that on a conventional processor, and even a RISC,
there is only a small, fixed number of such registers.
On the Read Lisp machine there would be an unlimited number of such bindable
symbols.

In recursion, the stack on a conventional processor grows quickly with new
bindings each time a function calls itself.  There may be possible ways to
improve on this with such a symbolic processor ("symbolic" not implying a
Symbolics-like machine).  For example there could be a push stack list
associated with each symbol, eliminating extra instructions to mark stack
frames on a conventional processor and possibly eliminating the infamous
stack overflow problem.  Such "mini-stacks" could automatically be allocated
to on chip caches to accelerate the binding mechanism.  It is important to
notice that such binding environments of variables are typically, per lambda
or "let" block, a relatively small block of stuff that goes away after the
exit of such forms.
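As a software analogue (again a toy sketch of my own, not a hardware proposal), the per-symbol "mini-stack" amounts to shallow binding: binding pushes the old value aside, unbinding pops it back, just like saving and restoring a register:

```java
import java.util.*;

// Shallow binding sketch: each symbol carries its own stack of values.
// bind() pushes a new value; unbind() restores the previous one on exit.
public class ShallowBind {
    static final Map<String, Deque<Integer>> stacks = new HashMap<>();

    static void bind(String sym, int value) {
        stacks.computeIfAbsent(sym, k -> new ArrayDeque<>()).push(value);
    }
    static void unbind(String sym) { stacks.get(sym).pop(); }
    static int value(String sym)   { return stacks.get(sym).peek(); }

    public static void main(String[] args) {
        bind("x", 1);                   // outer binding, e.g. an enclosing let
        bind("x", 2);                   // inner binding shadows it
        System.out.println(value("x")); // prints 2
        unbind("x");                    // leaving the inner let restores the old value
        System.out.println(value("x")); // prints 1
    }
}
```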

The design of such symbol registers would be a critical aspect of the
overall architecture.  In a Lisp world, symbols
are just objects like any other kind of object, however in the hardware they
would need to be treated as special objects.  All other objects could take
advantage of an on chip cache, where the interpretation of various
properties of the object would be done in parallel and quite fast.  One such
important property would be fast type checking and dispatching.  Another
important property would be the reference tree hardware to support garbage
collection.  Hardware assisted garbage collection alone
would be a huge acceleration factor for such a machine.  Another advanced
opportunity for efficiency is in the area of parallel evaluation of
expression arguments.

A casual observer might say that this kind of Read Lisp machine is more
complicated than a CISC.  But once again, you could view this kind of
machine as not only more general but also as a reduced instruction set,
similar to a RISC.  For example a clever hardware implementation of a lambda
could offer reduction over the variety of instructions that would be needed
to implement it on a conventional processor and at the same time more bang
for the bits.


> instructions. In addition, I'd like to see how these might enhance
> execution of languages other than Lisp, or at least not slow them down or
> waste valuable logic gates.

If such computer architectures were to eventually flourish, it is difficult
to see how non-Lisp-like languages would be enhanced; perhaps they would,
perhaps they wouldn't.  Perhaps such languages would
ultimately some day become obsolete, with the history that they were
designed too specifically for the old fashioned processor architecture.  One
might guess that processors 100 years from now would not have similar kinds
of architectures as those of today.


-- 

William P. Vrotney - ·······@netcom.com
From: Doug Landauer
Subject: Threads in Fantasyland (was Re: Inner Classes in Java)
Date: 
Message-ID: <landauer-1911961159410001@news.apple.com>
> > > > > > >   [... Reams upon reams of futile Lisp-boosting ...]

A small request:  if you continue this thread, could you folks
please repair the subject line so that the rest of us will know
that the postings contain nothing of interest, nothing related
either to Java (contrary to the offending subject line) or dylan
(despite the charter of one of the newsgroups this stuff is being
posted to)?

Please?
-- 
    Doug Landauer        ········@apple.com (work)
                         ········@scruznet.com (not-work)
From: Martin Cracauer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <1996Nov19.123629.16124@cons.org>
"Chris Page" <····@best.com> writes:

>William Paul Vrotney <·······@netcom.com> wrote:

>> What makes you think that a Lisp interpreter machine has to be a special
>> purpose machine?  Why is an Intel 486 interpreter any more general than a
>> Lisp interpreter?  You can say that the Lisp interpreter machine is
>> "higher level", but that should make the hardware faster.
>[...]
>> In some sense a Lisp interpreter machine is *more* general than a 486
>> interpreter machine.

>I can see your point, and it reminds me of a comment a friend made about
>how these architectures are, in a sense, C-machines. But I'm only familiar
>with 680x0, PowerPC, and 80x86 hardware. I'd like to see an example or two
>of the higher-level features of your proposed Lisp-machine that could be
>executed more efficiently than, say, an equivalent series of PowerPC
>instructions. In addition, I'd like to see how these might enhance
>execution of languages other than Lisp, or at least not slow them down or
>waste valuable logic gates.

The Self group at Sun published various papers about how the output of
their Self compiler looks like. According to them, there's very little
difference between C code and highly optimized code for 'dynamic'
languages. And that is not because the target machine is a "C"
machine, they claim the requirement to run other languages fast is to
be like C output anyway, so the optimal machine is the same for both.

Martin
-- 
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Martin Cracauer <········@cons.org> http://cracauer.cons.org Fax +49405228536
"As far as I'm concerned,  if something is so complicated that you can't ex-
 plain it in 10 seconds, then it's probably not worth knowing anyway"- Calvin
From: Cyber Surfer
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <848162633snz@wildcard.demon.co.uk>
In article <··············@lutefisk.jetcafe.org>
           ···@lutefisk.jetcafe.org "James Stephen Larson" writes:

> Wanting to take over the OS is the first stage of "Language Megalomania";
> wanting to take over the hardware is the second stage.  If there's a third
> stage, I don't want to know about it.

I'm inclined to agree. However nice such machines may be for
programmers, ordinary users _don't want them_. They don't want
any OS, in fact. What they want are apps.

It's the same with TV sets. Few of us know what's inside these
boxes, and most of us couldn't care. All we need to know is how
to switch them on, pick a channel, and possibly adjust the volume
from time to time. Few of us really want a video machine. What
we want is to be able to watch a show or a film at a time that
we can choose, and perhaps more than once.

Only the serious techies will want to take the back off the TV,
or open up their video machine. However, that's how programmers
look to what we may, unfortunately, think of as "mere users".
 
> Avoiding general-purpose hardware and software can only doom efforts
> to use high-level languages.

Too right. You can marginalise yourself to the point where few
people understand what you're doing, or care. Then one day they
stop your money, claiming that it can be better spent on other
things, like C++ or Java.
 
> This is not to say that existing OS's are exactly friendly to non-C
> languages and runtime systems.  I'm working on it...
 
Yes, and you may have noted that I made a distinction between
apps and the OS, not between one OS and another. While some people,
e.g Xerox PARC, have suggested that an OS is a collection of
things that don't fit into a language, there's still a strongly
held belief that an OS is the "right way" of running a machine.

Someday the user interfaces used by OSs may become totally
transparent, but I don't think that'll happen by insisting that
apps should run without an OS. There's an assumption that by
removing the OS, you'll remove the language barriers that make
computers hard to use. That may have been easier to believe in
the 70s, when the GUI idea was relatively new, but MS and Apple
have shown that you don't need a special language, like Smalltalk,
in order to do this, nor do you need to abandon the OS idea.

While it may indeed be _better_ to abandon the OS, nobody has proved
that this is necessary. MS, Apple, and others continue to use
OSs _because they can_. Until you can show that they can't do
this, there's no way they'll stop! Running vertical market apps
on Lisp machines will have no effect at all, precisely because
vertical market apps will, by definition, be ignored.

As I've said before, and I'll continue saying: the way to get
Lisp - and the ideas that come with it - accepted is to target
the software market where the most attention is focused. This
is why Java is likely to have a much greater impact on software
development than Lisp, in the next few years.
-- 
<URL:http://www.enrapture.com/cybes/> You can never browse enough
Future generations are relying on us
It's a world we've made - Incubus
We're living on a knife edge, looking for the ground -- Hawkwind
From: Pete Su
Subject: Re: Inner Classes in Java
Date: 
Message-ID: <usp67qysc.fsf@jprc.com>
·······@netcom.com (William Paul Vrotney) writes:
> 
> What makes you think that a Lisp interpreter machine has to be a special
> purpose machine?  Why is an Intel 486 interpreter any more general than a
> Lisp interpreter?  

I believe that you should go back and read bits of the history of
computer architecture.  Computer architecture is a curious mix of
design principles and a terribly Darwinian economy of ideas.  For
better or worse, there is more to building a machine that will perform
well and be useful than just putting together a nice instruction set.

Assuming that you even *could* build a piece of hardware that runs
lisp directly faster than a 486 runs it with a compiler (which I
seriously DOUBT that you can... see below) the result would be
practically useless in the marketplace.  First, you need an OS.
Second, the OS has to have applications, which would have to be
developed from scratch.  Third, the OS needs support for hardware
(device drivers) which would need to be developed from scratch.  We
are quickly approaching a time when the cost of developing these
things is much higher than the gain that you obtain by having them.
Microsoft and Apple (say) have a 15 year head start on you in this
arena, why would anyone buy such a machine?  The 486 in the PC already
has all of this infrastructure surrounding it.  It is more general
purpose not in any technical sense, but by economic fiat.  I can do
more with a 486 PC than anyone ever could with any LISP
machine.  Period, end of story... more applications, more hardware
support, more everything... but, this is only half the story.

For better or worse, and much to the chagrin of lots of language
design types, the "von Neumann machine", as primitive as we like to
think it is, is the dominant architecture today, and has been for as
long as anyone remembers. Load/store, state based, memory hierarchy,
the whole thing.  In this environment, you don't need a compiler
merely because you can't build a high level instruction set.
With enough money, you *can* build a piece of hardware with lots of
microcode for implementing some subset of lisp, say.  The problem is,
it will run really really slowly.  Why?

The interpreter will not be able to do global code analysis to
try and figure out how to best use all the pieces of the memory
hierarchy and execution pipeline; this means things like register
allocation, scheduling memory read and writes to overlap with other
instructions, walking loops to best use the cache, and that sort of
nonsense. This is extremely important.  Modern processors run an order
of magnitude faster than the main memory they are hooked to... this
makes fetches and stores to main memory very expensive... which means
you never want to do that... which means you have to make sure your
code uses registers and cache well.  So, how will this magical lisp
machine achieve this?  Are you going to do this optimization by hand on
the LISP code?

No.. that's what the compiler is for.  The compiler is NOT just a box that
translates some high level code to assembly.  It translates high level
gibberish into low level gibberish in such a way as to make the low
level gibberish run faster than low level gibberish that I could have
written by hand in a comparable amount of time.  If you toss the
compiler away, you lose this factor of 2 to 4 or more in performance
gain... so would *you* buy machine without a compiler that runs at
half the speed of your current PC?  I think not.

Another point to make is that putting LISP into hardware is a very
different thing than putting FP into hardware.  FP is a comparatively
small set of relatively simple primitives that are provably very
heavily used.  LISP is by comparison huge.  It would violate good
design principles to put the whole thing into hardware... you really
only want to put the lowest level bits in, and implement the rest at a
high level so you can re-use code and hardware efficiently... but at
this point, you may as well just emulate or compile those low level
bits to hardware, since it will be faster for, among other things,
the reasons above... so the lisp machine dies again.

One has to realize that the current state of computer architecture is
the result of a long iterative process of gaining engineering
experience and making engineering tradeoffs.  This stuff isn't done in
a vacuum.  This stuff is done because, to one extent or another, this
is the best scheme that we have worked out to date when you take all
the variables (performance, cost, OS support, inertia, compatibility,
etc) into account.

So, go back to school and study your Patterson and Hennessy.  Learn
something, then sit back and think about whether your lisp machine
will work.

Pete