From: Will Duquette
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <wbybagw59h.fsf@peanut.jpl.nasa.gov>
In article <···············@aalh02.alcatel.com.au> ·············@alcatel.com.au (Chris Bitmead uid(x22068)) writes:

   The only benefit you claim for Tcl that doesn't also apply to scheme
   is #3. You would like to type...

   func arg1 arg2
   instead of
   (func arg1 arg2)

   Ok, a minor but perhaps valid point if you want dumb users to use it
   like a shell.

And then, he goes on to suggest ways to make Scheme do that.

What I was expressing with my point #3 was a psychological effect, and
it extends to functions written in the language as well as individual
commands, though I notice it most when typing commands interactively.
I also prefer

        if {$x < 3} {
             foo arg1 arg2 
        } else {
             bar arg3 arg4
        }

to something like

        (if (< x 3)
         (foo arg1 arg2)
         (bar arg3 arg4))

I confess, though I've toyed with Lisp and Scheme in the past, I'm
primarily a C programmer.  Most of the people I work with are C
or C++ programmers.  The Tcl version gives us warm fuzzies, and
the Scheme version doesn't.

I did a web search yesterday, and found a couple of Scheme
implementations which would probably work as well for my purposes as
Tcl, except for the psychological issues I mention.  If I were
starting over, though, I'd still pick Tcl.  It's plenty good enough,
and it "feels" better to me.  Frankly, that's what's important.


-- 
--------------------------------------------------------------------------
Will Duquette, JPL  | ··················@jpl.nasa.gov
But I speak only    | ····@bean.jpl.nasa.gov
for myself.         | It's amazing what you can do with the right tools.

From: Paul Wilson
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <5ja4a2$o1u@roar.cs.utexas.edu>
In article <··············@peanut.jpl.nasa.gov>,
Will Duquette <····@peanut.jpl.nasa.gov> wrote:
>In article <···············@aalh02.alcatel.com.au> ·············@alcatel.com.au (Chris Bitmead uid(x22068)) writes:
>
>   The only benefit you claim for Tcl that doesn't also apply to scheme
>   is #3. You would like to type...
>
>   func arg1 arg2
>   instead of
>   (func arg1 arg2)
>
>   Ok, a minor but perhaps valid point if you want dumb users to use it
>   like a shell.
>
>And then, he goes on to suggest ways to make Scheme do that.
>
>What I was expressing with my point #3 was a psychological effect, and
>it extends to functions written in the language as well as individual
>commands, though I notice it most when typing commands interactively.

This is understandable.  How do you feel about scripting languages
that have a more C-like syntax, so that you'd write

     func(arg1, arg2)

except when emitting a command to the operating system?

(I'd think that for tiny programs which mostly communicate with the UNIX
shell, the shell-like syntax might be preferable, but for non-tiny
scripts which mostly invoke other scripts, you'd want a more familiar
programming-language-like notation.  The called scripts might be
written in a foreign language, but wouldn't require a call out to the
OS to start a new process.)

>I also prefer
>
>        if {$x < 3} {
>             foo arg1 arg2 
>        } else {
>             bar arg3 arg4
>        }
>
>to something like
>
>        (if (< x 3)
>         (foo arg1 arg2)
>         (bar arg3 arg4))

Disregarding the parenthesis issue and the infix vs. prefix issue
(which can be solved with a simple parser on top of something like
Scheme), how do you feel about the $x substitution thing?

Would you rather be able to write something like

          if (x < 3)
            foo(arg1,arg2)
          else
            bar(arg3,arg4);

(maybe without the commas, but where you don't have to remember to
force the evaluation of x in the condition?)

>I confess, though I've toyed with Lisp and Scheme in the past, I'm
>primarily a C programmer.  Most of the people I work with are C
>or C++ programmers.  The Tcl version gives us warm fuzzies, and
>the Scheme version doesn't.

Understandable.  Familiarity can be significant, especially in
the beginning.  (Personally, I found Tcl and Perl's evaluation
rules mystifying in the beginning.)

>I did a web search yesterday, and found a couple of Scheme
>implementations which would probably work as well for my purposes as
>Tcl, except for the psychological issues I mention.  If I were
>starting over, though, I'd still pick Tcl.  

Is this because you like programming in a shell style (e.g., don't
mind having to force evaluation by controlling substitution) or
because Scheme syntax is unfamiliar in other ways?

>It's plenty good enough,
>and it "feels" better to me.  Frankly, that's what's important.

That's understandable, too.  What I'm trying to figure out is which issues
are the most important in overcoming resistance to things like Scheme
and Smalltalk.

-- 
| Paul R. Wilson, Comp. Sci. Dept., U of Texas @ Austin (······@cs.utexas.edu)
| Papers on memory allocators, garbage collection, memory hierarchies,
| persistence and  Scheme interpreters and compilers available via ftp from 
| ftp.cs.utexas.edu, in pub/garbage (or http://www.cs.utexas.edu/users/wilson/)      
From: Mark A Harrison
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <5jgodk$nd7$1@news.utdallas.edu>
Paul Wilson (······@cs.utexas.edu) wrote:
: That's understandable, too.  What I'm trying to figure out is which issues
: are the most important in overcoming resistance to things like Scheme
: and Smalltalk.

I can't address Smalltalk, but with regard to Scheme and Lisp
in general:

1.  People think lisp is weird.
2.  People think lisp programmers are weirdos.
3.  All attempts by lisp programmers to refute #1 seem to confirm #2.

I have to agree with Karl Lehenbauer's points regarding this... I
enjoyed lisp, but there's lots of people who just don't seem
to relate to it.

In another post, you mention that the computer language community
should be embarrassed by Tcl (I can't quite remember the exact wording).

I think that an equally valid question is:

Why are so many of the popular languages created not by language
theorists but by people trying to accomplish some other task?

(I think JO raised this last point in his MIT lecture.)

Mark.
From: Henry Baker
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <hbaker-2104972329440001@10.0.2.1>
In article <············@news.utdallas.edu>, ········@utdallas.edu (Mark A
Harrison) wrote:

> Why are so many of the popular languages created not by language
> theorists but by people trying to accomplish some other task?
> 
> (I think JO raised this last point in his MIT lecture.)

I'll take a shot at this one.

Most 'popular' languages started life as highly specialized (i.e., 'limited
scope') languages that had access to some peculiar library -- e.g., graphics,
type-setting, wimp, linear algebra (matlab), symbolic algebra (macsyma, etc.).
People then found out that any language with sufficient power (i.e., not
brain dead) was Turing complete, and the converts to these new languages
then discovered that they would do _more_ than 'just' graphics, type-setting,
etc., etc.  Of course, many of these converts had been exposed to only one
language, and this language was so much better than that silly Fortran,
Pascal, (insert your favorite dog to kick here) language that had been forced
upon them in engineering/math/physics/.... school, that they touted it as the
next best thing to sliced bread.

The truth is that there isn't more than an ounce of spit in the differences
among most of these languages _with the exception of the specialized libraries
that they are hooked up to_, so most of this variation is non-productive.

There are major exceptions to this assessment, to be sure.  The appearance
of soft/dynamic typing, garbage collection, EVAL, sophisticated data structures,
dynamically compiled/linked code, etc., etc., quickly separate the sheep
from the goats.

So the language theorists focus on the _library-independent_ issues such
as typing, data structuring, control structuring, etc., instead of on
building large numbers of specialized libraries.  The ability of a language
to continue growing out of its original niche depends critically on this
more balanced view, but this view is seldom to be found in the initial
enthusiasm of creating the first Turing-capable interpreter for one's new
graphics/type-setting/... library.

The few languages that do manage to leave the primordial slime and move
onto dry land (Lisp, Prolog, Smalltalk, ML, etc.) are laughed at by those
still in the slime for sloughing off their specialized libraries, even
though by doing so, they can now do all of the specialized things from
not only their original area of expertise, but many other areas as well.
From: Steven D. Majewski
Subject: Designed vs. Jes' Grew [was: Ousterhout and Tcl lost the plot with  latest paper]
Date: 
Message-ID: <AF832279-16B9A1@128.143.7.189>
On Tue, Apr 22, 1997 3:29 AM, Henry Baker <·············@netcom.com> wrote:
>In article <············@news.utdallas.edu>, ········@utdallas.edu (Mark A
>Harrison) wrote:
>
>> Why are so many of the popular languages created not by language
>> theorists but by people trying to accomplish some other task?
>> 
>> (I think JO raised this last point in his MIT lecture.)
>
>I'll take a shot at this one.
>
>Most 'popular' languages started life as highly specialized (i.e., 'limited
>scope') languages that had access to some peculiar library -- e.g., graphics,
>type-setting, wimp, linear algebra (matlab), symbolic algebra (macsyma, etc.).
>People then found out that any language with sufficient power (i.e., not
>brain dead) was Turing complete, and the converts to these new languages
>then discovered that they would do _more_ than 'just' graphics, type-setting,
>etc., etc. 

[ ... ]


Another binary classification of computer languages is 'academic languages'
vs. 'toolkit languages' : 
   Academic languages are typically developed to prove or demonstrate
  something. 
  Toolkit languages are invented to help someone get some work done. 
 
Of course this is a very fluid boundary.

  Academic languages that survive long enough tend to become tools:

     Fortran was written both as a tool, and to prove that the whole idea
      of automatic compilation of an algebraic language was practical
      ( There were quite a few sceptics at the time. ) 

    Smalltalk was written to demonstrate the advantages of a form of
      purely object oriented programming, but it's become a tool for
      business computing. 

   But I think it's reasonable to say that Scheme, ML and Haskell --
good picks for archetypal academic languages --  were all developed to 
demonstrate the wide utility of a few powerful concepts.  ( Maybe I 
should include N.Wirth's creations here: Pascal, Modula-2, et al. - but
I've never been quite sure what he was trying to prove. )

   Lisp, on the other hand, was one of the original toolkit languages:
It was invented so that John McCarthy and others would have an easier
tool for their AI research. 
   The HOPL-II article on the evolution of Forth, probably the 
archetypal toolkit language, describes how Chuck Moore traveled 
around with his card deck, porting Forth to one system after another
as the first step in whatever application he was working on at the time.
   Perl started out as Larry Wall's personal Unix programming toolkit.
The first paper that mentioned Python was about how it was being used
to test network servers for the Amoeba OS project. And Tcl was certainly
designed as a tool.

Another "bi-chotomy" that I've used -- similar to but distinct from the
previous one -- is: designed languages vs. "Jes' Grew" languages. 
The thing that characterizes "Jes' Grew" languages is piecemeal growth. 
One of the things I've always praised about Python is that it's somewhere 
in the golden mean between those two poles. After all, if growth doesn't start
from a good initial design, it quickly turns into a collection of hacks. 
We can admire clever hacks, but sometimes they only postpone the 
inevitable Complete ReWrite. 

Jes' Grew languages tend, over time, toward an excess of often
non-productive features and redundancy as they gather layers
of not-quite-compatible invention. ( CAR, first, head, elt, aref, ... ) 

 I've never doubted the utility of C++, Perl or Tcl as tools, but they 
 all seem limited by their initial design.   To be fair, I've also bumped 
into limitations in Lisp and Python, but they don't seem quite as 
deeply rooted or severe to me. Also, pointing out that I see a design 
flaw is not a criticism of the designer: in the case of C++, one of the 
design constraints -- that it be compatible with C -- is the source of
most of its "flaws". Sometimes a tool or language may be very well 
designed given a certain niche or application, but when it's pushed
into other realms, problems appear. ( As Henry Baker noted
above:  there is a tendency for all special purpose languages to grow
towards being more general purpose. ) 


If John Ousterhout is saying that academic comp. sci. has been 
biased against the toolkit languages and the Jes' Grew languages,
and hasn't paid enough attention to practical issues like "glue 
languages", then I agree. However, I think most of us, including
quite a few of the "academics" agree with that notion now.
 It is other issues -- distortion of the truth, assertion without
evidence, failing to give credit to, if not his sources, then at least
his predecessors -- that are the cause of the controversy.  I agree
strongly with much of
what I took to be the intent of that paper, but I disagree strongly 
with the specific case he makes. As someone else in this thread said:
"It's not even wrong!" 



Henry Baker (again):
>
>The truth is that there isn't more than an ounce of spit in the
>differences among most of these languages _with the exception of the
>specialized libraries that they are hooked up to_, so most of this
>variation is non-productive.
>

Another source of irritation at that paper is that it is yet another 
demonstration of the non-productive reinvention of, rather than
learning from, History. And typically, without learning from the
past, it's not any better the second time around. J.O. is claiming 
progress in the fact that we now have something nearly as good 
as we had ten or fifteen years ago. Non-productive variation! 


A better guide than Ousterhout is Dick Gabriel.

When I first read his "worse is better" paper, I wasn't sure 
what, if anything, he was advocating. Was he really recommending
"worse"? I wasn't quite sure -- that paper seemed like a sort of
left-handed compliment. 

 His latest book "Patterns of Software" ( mostly collected from
articles from JOOP ) expands on that paper with his notions of
"habitability" -- how to design and plan for systems that will
grow and evolve. I think that's what he was aiming at with that
first paper, and it's what I was aiming at when I first started calling
Python and Perl "Jes' Grew" languages a couple of years ago. 



- Steve Majewski
<·····@Virginia.EDU> 



BTW: The phrase "Jes' Grew" is from Ishmael Reed. I don't 
remember which book. I do almost remember the line:
 "Where did Jazz come from?"
 "No Come From, it Jes Grew!" 

P.S. I hate to see people spread lies and rumors of the "failure"
of Lisp, when lisp has been one of the greatest, most successful 
Jes' Grew Toolkit languages ever. Lisp is going to be around when
it turns 40.  It's obvious it could use another rewrite. If J.O. had
wasted less time knocking Lisp and spent a bit more time learning
from it, maybe Tcl would not be showing signs of tired old age already. 
From: Thant Tessman
Subject: Re: Designed vs. Jes' Grew [was: Ousterhout and Tcl lost the plot with         latest paper]
Date: 
Message-ID: <335E38AD.41C6@nospam.acm.org>
Steven D. Majewski wrote:

[...]

> Another binary classification of computer languages is 'academic languages'
> vs. 'toolkit languages' :

[...]

>    But I think it's reasonable to say that Scheme, ML and Haskell --
> good picks for archetypal academic languages --  were all developed to
> demonstrate the wide utility of a few powerful concepts.  [...]

I think ML was originally created specifically for writing theorem provers.  
So ML really started out as an academician's toolkit language.

[...followups severely trimmed...]

-thant
From: SERRANO Manuel
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <5jkbne$sji@uni2f.unige.ch>
Bigloo is a compiler for an extended version of Scheme.
Maybe you should have a look.

Bigloo contains:
  - a foreign interface (not restricted to a foreign _function_ interface)
  - a module language (mostly to allow batch compilation)
  - several libraries (such as parsing libraries, pattern matching, ...)
  - a native object system based on generic functions
  - a macro system

To taste all this, you can pick the Bigloo documentation at: 
http://cuiwww.unige.ch/~serrano/bigloo.html

I hope it will help.

  --Manuel Serrano--
From: Will Duquette
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <wbwwpww7y3.fsf@peanut.jpl.nasa.gov>
In article <··········@roar.cs.utexas.edu> ······@cs.utexas.edu (Paul Wilson) writes:

    This is understandable.  How do you feel about scripting languages
   that have a more C-like syntax, so that you'd write

        func(arg1, arg2)

   except when emitting a command to the operating system?

It depends on the project.  In the current case, I use my Tcl-based
language to write tools, but also to exercise APIs interactively.  I
prefer to type

        func arg1 arg2

when working interactively.  When writing tools, the 

        func(arg1, arg2)

notation is nice, but I'd rather use the same notation in both places.


   Disregarding the parenthesis issue and the infix vs. prefix issue
   (which can be solved with a simple parser on top of something like
   Scheme), how do you feel about the $x substitution thing?

The $x substitution thing took some getting used to, but I encountered
both shells and perl first, so it wasn't too odd.  I find that the 
ability to build up complex strings quickly using substitution is
really handy for quick-and-dirty tool development.  You can do much
the same thing in Lisp or Scheme, of course, using lists instead of 
strings...but I can pass a string to a native function written in C
and let it parse the string any way it likes.

So, I like $x for the most part.
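For instance, a made-up sketch (in Python notation rather than Tcl,
with invented host and path names) of the string-vs-list contrast I
mean:

```python
# Made-up names throughout.  String style: quick to type, and the whole
# string can be handed to a native C function to parse as it likes.
host = "peanut"
path = "/tmp/report.txt"
cmd_string = "scp %s %s:%s" % (path, host, path)

# List style (closer to what Lisp or Scheme encourages): each argument
# stays a separate element, so nothing has to be re-parsed later.
cmd_list = ["scp", path, "%s:%s" % (host, path)]
```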

   >I did a web search yesterday, and found a couple of Scheme
   >implementations which would probably work as well for my purposes as
   >Tcl, except for the psychological issues I mention.  If I were
   >starting over, though, I'd still pick Tcl.  

   Is this because you like programming in a shell style (e.g., don't
   mind having to force evaluation by controlling substitution) or
   because Scheme syntax is unfamiliar in other ways?

Mostly because all of the parentheses get to me, and because I think
I can more easily teach the rudiments of Tcl to people who want to use
my tools.


   >It's plenty good enough,
   >and it "feels" better to me.  Frankly, that's what's important.

   That's understandable, too.  What I'm trying to figure out is which issues
   are the most important in overcoming resistance to things like Scheme
   and Smalltalk.

Ahhh.  Smalltalk.  That's really another discussion, but here's why 
I don't like Smalltalk.  I don't think of Smalltalk as competing with
Tcl, by the way, but then, I'm not one of the people who develops 
large applications entirely in Tcl.

1. I like traditional expressions and control structures.
   A C programmer can get the gist of a Fortran, BASIC, Pascal,
   Ada, or Java program without too much effort.  The syntax isn't 
   identical, but the basic model is much the same.

2. (And this is the kicker) I really dislike having my application
   and the Smalltalk library classes live in the same "space".  In the
   Smalltalk system I've looked at (Smalltalk V), if I wanted to
   develop two separate applications they either needed to live in the
   same class tree, or I needed to maintain two entirely separate
   "images", which included all of the system classes.  This gives me
   chills, for some reason.

As someone on this thread has commented, OOP in Java is more like OOP
in Smalltalk than like OOP in C++, and I think this is true...but in
Java, there's a nice clean separation between my code and anybody
else's code.  I like that.  Again, this may be a purely psychological
issue, but then, I program better when I'm happy. :-)

Will

-- 
--------------------------------------------------------------------------
Will Duquette, JPL  | ··················@jpl.nasa.gov
But I speak only    | ····@bean.jpl.nasa.gov
for myself.         | It's amazing what you can do with the right tools.
From: Chris Bitmead uid(x22068)
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <s6yiv1emee8.fsf@aalh02.alcatel.com.au>
····@peanut.jpl.nasa.gov (Will Duquette) writes:

> 1. I like traditional expressions and control structures.
>    A C programmer can get the gist of a Fortran, BASIC, Pascal,
>    Ada, or Java program without too much effort.  The syntax isn't 
>    identical, but the basic model is much the same.

Don't know what you mean by traditional control structures, but
Smalltalk control structures don't seem too different from C in practice.
 
> 2. (And this is the kicker) I really dislike having my application
>    and the Smalltalk library classes live in the same "space".  In the
>    Smalltalk system I've looked at (Smalltalk V), if I wanted to
>    develop two separate applications they either needed to live in the
>    same class tree, or I needed to maintain two entirely separate
>    "images", which included all of the system classes.  This gives me
>    chills, for some reason.
> 
> As someone on this thread has commented, OOP in Java is more like OOP
> in Smalltalk than it is OOP in C++, and I think this is true...but in
> Java, there's a nice clean separation between my code and anybody
> else's code.  I like that.  Again, this may be a purely psychological
> issue, but then, I program better when I'm happy. :-)

I think it's purely psychological. :-) Think of the image as an
instant compilation of your code changes. To use the same code in two
images you need to export. The equivalent in C++ is building a
library, installing the library in a common place and then linking
with it. On the whole a lot more painful for C++.
From: Ulric Eriksson
Subject: Re: Ousterhout and Tcl lost the plot with latest paper
Date: 
Message-ID: <5jebsn$b2$1@news.edu.stockholm.se>
In article <··············@peanut.jpl.nasa.gov>,
Will Duquette <····@peanut.jpl.nasa.gov> wrote:
>
>I confess, though I've toyed with Lisp and Scheme in the past, I'm
>primarily a C programmer.  Most of the people I work with are C
>or C++ programmers.  The Tcl version gives us warm fuzzies, and
>the Scheme version doesn't.

I'll bite. I do not at all think that C and Tcl resemble each other,
other than superficial things like the squigglies, which don't even
mean the same thing in the two languages.

OTOH, C and Scheme are similar in several ways: call-by-value,
pointer-chasing, free-format.


Ulric Eriksson
-- 
I proactively leverage my synergies.