From: Hugh LaMaster
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44c8fh$6m3@onramp.arc.nasa.gov>
Most of the replies to the following posting discussed BASIC, 
a language I don't hold in high regard.  So, this is cross-posted
in the hope of finding some more interesting alternatives:

In article <··········@blackice.winternet.com>, Michael Bresnahan <····@winternet.com> writes:
|> I'm looking for a good programming language and documentation package
|> for my 9 yr old step-son who has no experience with computer programming.
|> It can be commercial or freeware.  It must run on an i286 machine running
|> some older version of MS DOS (3.3 maybe).  I'm looking for something that
|> packages a simple language with some good tutorial documentation geared
|> for his age range.  Something that makes heavy use of graphics, sound,
|> and animation would be nice.  
|> 
|> I just spent the day searching various computer dealers in the area and found
|> absolutely nothing even close.  I found some compiler and tutorial documentation
|> packages for C/C++ and Pascal, but all were obviously geared toward an adult.
|> What happened to the days of the "home computer"?  When I was about my son's
|> age I had a Texas Instruments 99/4a computer which came with a version of 
|> BASIC and a book which I found very easy to read.  I soon upgraded to 
|> "Extended BASIC".  It included support for "sprites" which made wriring 
|> animated graphics a snap and lots of fun.  In those days they had what was 
|> called Logo also.  Whatever happened to Logo?
|> 
|> I remember a magazine I had a subscription to.  It was
|> called something like "Home Computer Magazine".  It was full of articles
|> about programming games and other stuff using the TI's BASIC interpreter.
|> Where does one find stuff like this today?  Where does one look for such
|> stuff?  I'm posting to this group as a sort of shot in the dark.  I thought
|> that at least I might find out what is being used in elementary schools to 
|> teach the art of computer programming to children.
|> 
|> Please help.
|> 
|> MikeB
|> 

-- 
  Hugh LaMaster, M/S 233-18,    Email:       Please send ASCII documents to:
  NASA Ames Research Center     Internet:    ········@ames.arc.nasa.gov
  Moffett Field, CA 94035-1000  Or:          ········@george.arc.nasa.gov 
  Phone:  415/604-1056          Disclaimer:  Unofficial, personal *opinion*.

From: Patrick D. Logan
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44cplj$jj4@ornews.intel.com>
········@george.arc.nasa.gov (Hugh LaMaster) wrote:

>Most of the replies to the following posting discussed BASIC, 
>a language I don't hold in high regard.  So, this is cross-posted
>in the hope of finding some more interesting alternatives:

Scheme is good. Logo may be better for a nine year old.

Smalltalk may also be appropriate, and it was left out of
your list of newsgroups.

-- 
······················@ccm.jf.intel.com
(503) 264-9309, FAX: (503) 264-3375

"Poor design is a major culprit in the software crisis...
..Beyond the tenets of structured programming, few accepted...
standards stipulate what software systems should be like [in] detail..."
-IEEE Computer, August 1995, Bruce W. Weide
From: Simon Brooke
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44eu08$att@caleddon.dircon.co.uk>
"Patrick D. Logan" <···············@ccm.jf.intel.com> wrote:
>········@george.arc.nasa.gov (Hugh LaMaster) wrote:
>
>Scheme is good. Logo may be better for a nine year old.
>
>Smalltalk may also be appropriate, and it was left out of

I would start with LOGO. I've taught it to a number of kids (or, more to 
the point, watched while they taught themselves). It's simple, the turtle
metaphor is easy to grasp (and kids like it), and LOGO teaches good 
programming concepts and style. You can also do pretty sophisticated things 
with it (I've seen some very good grammar parsers and ELIZAs).

-- 
------- ·····@rheged.dircon.co.uk (Simon Brooke)

	How many pentium designers does it take to change a lightbulb?
		1.99904274017
From: Randy Brown
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44gpn4$k36@dsinc.myxa.com>
Simon Brooke <simon> writes:

>I would start with LOGO. I've taught it to a number of kids (or, more to 
>the point, watched while they taught themselves). It's simple, the turtle
>metaphor is easy to grasp (and kids like it), and LOGO teaches good 
>programming concepts and style. You can also do pretty sophisticated things 
>with it (I've seen some very good grammar parsers and ELIZAs).

A word of caution on LOGO:

Yes, the turtle is a wonderful teaching tool, yes LOGO was one of my
first languages (at the age of uh... 9?), but... only TCL of the languages
I've learned has a worse quoting syntax.  I discovered this while trying to
teach my little brother how to do something a little more complicated
than drawing a sunflower.

Next time, (+ scheme turtle-graphics)

	-Randy
From: Michael Dillon
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44dgra$v2g@felix.junction.net>
>In article <··········@blackice.winternet.com>, Michael Bresnahan <····@winternet.com> writes:
>|> I'm looking for a good programming language and documentation package
>|> for my 9 yr old step-son who has no experience with computer programming.
>|> It can be commercial or freeware.  It must run on an i286 machine running
>|> some older version of MS DOS (3.3 maybe).  I'm looking for something that
>|> packages a simple language with some good tutorial documentation geared
>|> for his age range.  Something that makes heavy use of graphics, sound,
>|> and animation would be nice.  

There isn't anything that I know of for DOS that makes HEAVY use of 
graphics, sound, and animation, but LOGO does make some use of graphics, and 
UCBLOGO for DOS is free from ftp://anarres.cs.berkeley.edu/pub/ucblogo/

Get BLOGO31.EXE and USERMANUAL. If you have access to compress and tar 
then get csls.tar.Z which are the companion files to Computer Science 
Logo Style. My 8-year-old has had a great time working through the book 
"Learning with LOGO" by Danial Watt that we picked up at a library 
discard sale. The fact that the LOGO syntax was slightly different from 
UCBlogo and the book talked about differences between Commodore LOGO. 
Apple LOGO and TI LOGO just made the learning experience more interesting 
for him.

For heavy duty graphics and animation and sound he will need to use 
something like Quickbasic or Turbo Pascal but that is best learned as a 
SECOND language, not the first.

LOGO can be a stepping stone to more complex stuff as well. If you buy a 
Macintosh there is Hyperstudio (a Hypercard-like multimedia program that 
uses LOGO for its scripting language) and Microworlds LOGO which 
introduces simulations and animations to LOGO.

-- 
Michael Dillon                                    Voice: +1-604-546-8022
Memra Software Inc.                                 Fax: +1-604-542-4130
http://www.memra.com                             E-mail: ·······@memra.com
From: Steve Weyer
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <weyer-2809950851140001@slip-24.netaxs.com>
>>In article <··········@blackice.winternet.com>, Michael Bresnahan
<····@winternet.com> writes:
>>|> I'm looking for a good programming language and documentation package
>>|> for my 9 yr old step-son who has no experience with computer programming.
>>|> It can be commercial or freeware.  It must run on an i286 machine running
>>|> some older version of MS DOS (3.3 maybe).  I'm looking for something that
>>|> packages a simple language with some good tutorial documentation geared
>>|> for his age range.  Something that makes heavy use of graphics, sound,
>>|> and animation would be nice.  


If you happen to have a Newton PDA, then NewtonScript is a nice
object-oriented language.  I've written a turtle graphics environment and
Word and Sentence extensions for it, so you can use NewtonScript in a
Logo-like fashion.  The original "Newt" (lizard cousin of amphibious
turtles) from 2 years ago has evolved into the Newt Development
Environment for building and saving regular applications also. I don't
have any 9-year old users, but several 12-year olds are using it.  For
more info about the Newt, NewtTurT (Newt Turtle Tutorial), NewtATut (Newt
Application Tutorial), etc. see
  www.netaxs.com/~weyer/newton/releases.html

Steve
From: Bruce Bon
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44eu7s$dmt@netline-fddi.jpl.nasa.gov>
Seems to me that Logo is what you want, and I'm sure you'll get
plenty of replies to that effect.

If you want to check another possibility, try Python -- check out
newsgroup  comp.lang.python  or Web address 

	http://www.cwi.nl/~guido/Python.html

It is not really targeted toward children, but it has good tutorial
documentation and is a very esthetic, modern, object-oriented language.

-- 
Bruce Bon
From: John Atwood
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44er8b$a2c@engr.orst.edu>
>|> I'm looking for a good programming language and documentation package
>|> for my 9 yr old step-son who has no experience with computer programming.
>|> It can be commercial or freeware.  It must run on an i286 machine running
>|> some older version of MS DOS (3.3 maybe).  I'm looking for something that
>|> packages a simple language with some good tutorial documentation geared
>|> for his age range.  Something that makes heavy use of graphics, sound,
>|> and animation would be nice.


Logo and Smalltalk have already been suggested; here are some others
(unless noted, they run on a 286, DOS 3.3, VGA).

The Incredible Machine:  share/free-ware, not exactly programming, but 
lets you build Rube Goldberg-type machines, adjust the gravity, air 
pressure, etc.

The Even More Incredible Machine, $40, Dynamix.  I haven't seen this one.

At Ease With Computer programming: share/free-ware.  I haven't seen this 
one, but I think I'll track it down.

Computer Works, $40, by Software Marketing.  Haven't seen this one, but the
ad indicates it's a tutorial on how a computer works, with a history of 
computers, and discussions of AI, VR, Multimedia, viruses, hacking, etc.

I'm getting this info from the catalogs of these companies:

Software Labs 800-569-7900
Media Magic   800-882-8284

Also, two soon-to-be-released packages look very interesting, though they
probably will require more than a 286:

KidSim, see the July 94 issue of Communications of the ACM.

ToonTalk, by Ken Kahn of Stanford; his company is Animated Programs
ftp://csli.stanford.edu/pub/Preprints/tt_jvlc.ps.Z or tt_jvlc.zip


And finally, I'm looking for these old Mac programs that are in a similar 
vein; if someone knows where I can get them, I'd appreciate it:
 Rocky's Boots
 Robot Odyssey 

HTH,

John
-- 
_________________________________________________________________
Office phone: 503-737-5583 (Batcheller 349);home: 503-757-8772
Office mail:  303 Dearborn Hall, OSU, Corvallis, OR  97331
_________________________________________________________________
From: Herman Rubin
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44h3fr$2f67@b.stat.purdue.edu>
In article <··········@engr.orst.edu>,
John Atwood <·······@ada.CS.ORST.EDU> wrote:
>>|> I'm looking for a good programming language and documentation package
>>|> for my 9 yr old step-son who has no experience with computer programming.
>>|> It can be commercial or freeware.  It must run on an i286 machine running
>>|> some older version of MS DOS (3.3 maybe).  I'm looking for something that
>>|> packages a simple language with some good tutorial documentation geared
>>|> for his age range.  Something that makes heavy use of graphics, sound,
>>|> and animation would be nice.

		[Very much deleted.]

There have been many replies to this, touting this or that system.

But I would like to raise another point, which is similar to the 
points I have raised about teaching mathematics, reading, etc.

None of these languages gives any indication as to how a computer
operates.  Just as learning how to do arithmetic gives no understanding
of mathematics, likewise learning to program with a restricted collection
of constructs gives no understanding of computers, or even how 
computer languages operate.

I do not know what is available at a low level.  It is even hard
to find much at higher levels now.  Possibly those in the field
could comment.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
······@stat.purdue.edu	 Phone: (317)494-6054	FAX: (317)494-0558
From: Hugh LaMaster
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44huai$304@onramp.arc.nasa.gov>
In article <···········@b.stat.purdue.edu>, 
······@b.stat.purdue.edu (Herman Rubin) writes:

|> But I would like to raise another point, which is similar to the 
|> points I have raised about teaching mathematics, reading, etc.
|> 
|> None of these languages gives any indication as to how a computer
|> operates.  Just as learning how to do arithmetic gives no understanding
|> of mathematics, likewise learning to program with a restricted collection
|> of constructs gives no understanding of computers, or even how 
|> computer languages operate.

Since I have not yet tried it, I don't know if I understand
Logo properly.  But, allegedly, it allows lisp programming
(without(parentheses)), and certainly, you can do "anything"
in lisp, so maybe you can in Logo, too.  I just don't know.
I happen to think lisp is a superior language to learn
early-on, precisely because of its conceptual generality.
Whether Logo retains that I don't know.

It is true that no such language will teach someone very
much about the hardware.  When the time comes, I suppose
the child could write an assembler in machine language 
and a lisp interpreter in assembler.  I'm not sure I 
would try that with an average 9 year old, though.

|> I do not know what is available at a low level.  It is even hard
|> to find much at higher levels now.  Possibly those in the field
|> could comment.


-- 
  Hugh LaMaster, M/S 233-18,    Email:       Please send ASCII documents to:
  NASA Ames Research Center     Internet:    ········@ames.arc.nasa.gov
  Moffett Field, CA 94035-1000  Or:          ········@george.arc.nasa.gov 
  Phone:  415/604-1056          Disclaimer:  Unofficial, personal *opinion*.
From: Brian Harvey
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44k23h$l7g@agate.berkeley.edu>
········@george.arc.nasa.gov (Hugh LaMaster) writes:
>Since I have not yet tried it, I don't know if I understand
>Logo properly.  But, allegedly, it allows lisp programming
>(without(parentheses)), and certainly, you can do "anything"
>in lisp, so maybe you can in Logo, too.  I just don't know.

Indeed, Logo is a full Turing-equivalent programming language
based on Lisp.  The only difference is that Logo doesn't have
first-class procedures (i.e., no lambda), so certain idioms of
Lisp style look a little different in Logo.  (The limitation is
less severe than it may sound, but I think most readers of this
thread don't care about the details.)

My favorite short example, written in the Berkeley Logo dialect:

to choose :menu [:sofar []]
if emptyp :menu [print :sofar stop]
foreach first :menu [(choose (butfirst :menu) (sentence :sofar ?))]
end

? choose [[small medium large] [vanilla [rum raisin] ginger pumpkin]
          [cone cup]]
SMALL VANILLA CONE
SMALL VANILLA CUP
SMALL RUM RAISIN CONE
...				(I'm leaving out some of the output!)
LARGE PUMPKIN CONE
LARGE PUMPKIN CUP
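[For readers who don't speak Logo, here is a rough Python translation of the same recursion. The names `choose` and `menu` mirror the Logo version above; this sketch is my addition, not part of the original post, and it returns the combinations as a list rather than printing as it recurses.]

```python
def choose(menu, so_far=()):
    """Return every way of picking one item from each sub-menu, in order.

    Mirrors the Logo procedure above: peel off the first sub-menu,
    recurse on the rest, accumulating the picks made so far.
    """
    if not menu:                      # nothing left to choose from:
        return [" ".join(so_far)]     # one finished combination
    combos = []
    for item in menu[0]:              # each option from the first sub-menu
        combos.extend(choose(menu[1:], so_far + (item,)))
    return combos

menu = [["small", "medium", "large"],
        ["vanilla", "rum raisin", "ginger", "pumpkin"],
        ["cone", "cup"]]
for line in choose(menu):
    print(line)    # small vanilla cone ... large pumpkin cup (24 lines)
```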

>It is true that no such language will teach someone very
>much about the hardware.  [...]  I'm not sure I 
>would try that with an average 9 year old, though.

Yeah, Herman doesn't seem to understand the difference between
a child and a math professor!  (Check him out on k12.ed.math :-)
From: Herman Rubin
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44k9nb$3dgq@b.stat.purdue.edu>
In article <··········@agate.berkeley.edu>,
Brian Harvey <··@anarres.CS.Berkeley.EDU> wrote:
>········@george.arc.nasa.gov (Hugh LaMaster) writes:

			.................

>>It is true that no such language will teach someone very
>>much about the hardware.  [...]  I'm not sure I 
>>would try that with an average 9 year old, though.

>Yeah, Herman doesn't seem to understand the difference between
>a child and a math professor!  (Check him out on k12.ed.math :-)

I suspect that a 9 year old, even with the present lack of teaching
of mathematical concepts, would be far more interested in learning
about the hardware than the math professor.  As for the AVERAGE 9
year old, I do not know, but this should not be the sole question.
Education should not be limited to the average; this attitude is
what has trashed the content of American education.

But I suspect that many 9 year olds would be interested in how an
error in a lookup table used to speed up division caused the peculiar 
Pentium divide errors.  That might get them interested in how computers
get more speedup than merely the switching speed of the units.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
······@stat.purdue.edu	 Phone: (317)494-6054	FAX: (317)494-0558
From: Shaun Flisakowski
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44qqgk$klu@spool.cs.wisc.edu>
In article <··········@aurns1.aur.alcatel.com>,
Wayne Throop <··················@dg-rtp.dg.com> wrote:
>
>: From: ······@b.stat.purdue.edu (Herman Rubin)
>: I suspect that a 9 year old, even with the present lack of teaching of
>: mathematical concepts, would be far more interested in learning about
>: the hardware than the math professor. 
>
>I must admit that I, when I was 9 years old, was indeed more interested
>in learning about computer hardware than I was in learning about 
>math professors.  This bias continues to this day.
>--

    Not me, but you should have seen my 3rd grade math teacher,
    what a fox!

-- 
    Shaun        ········@cs.wisc.edu

   "In your heart you know its flat."
                           -Flat Earth Society
From: Nathan Urban
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44h4qp$ic5@csugrad.cs.vt.edu>
In article <···········@b.stat.purdue.edu>, ······@b.stat.purdue.edu (Herman Rubin) wrote:

> There have been many replies to this, touting this or that system.
> 
> But I would like to raise another point, which is similar to the 
> points I have raised about teaching mathematics, reading, etc.
> 
> None of these languages gives any indication as to how a computer
> operates.  Just as learning how to do arithmetic gives no understanding
> of mathematics, likewise learning to program with a restricted collection
> of constructs gives no understanding of computers, or even how 
> computer languages operate.
> 
> I do not know what is available at a low level.  It is even hard
> to find much at higher levels now.  Possibly those in the field
> could comment.

Really, the only language that tells you how a computer really operates
is assembly.  I wouldn't recommend it as a first language, because it
takes so much work to get anything done.  I would definitely recommend
it as a second language, however, once you are familiar with a
high-level language.  It can be a lot of fun, and you will understand
better why and how the high-level language does things the way it does
(especially C).  Every programmer should be familiar with some form of
assembly.  It really helps to see how different ways of coding things
will get optimized.
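[A modern aside, not something Nathan mentions: you can get a gentler taste of the same look-under-the-hood exercise with Python's standard `dis` module, which disassembles a function into the interpreter's instruction stream.]

```python
import dis

def add(a, b):
    return a + b

# Print a human-readable disassembly of the one-line function.
dis.dis(add)

# The same instruction stream, inspected programmatically.  Opcode
# names vary between Python versions, but the load/add/return shape
# of the compiled function is stable.
ops = [ins.opname for ins in dis.Bytecode(add)]
print(ops)
```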
-- 
-----------------------------------------------------------------------------
Nathan Urban                              | e-mail: ······@mail.vt.edu
Undergraduate {CS,Physics}, Virginia Tech | WWW: http://nurban.campus.vt.edu/
-----------------------------------------------------------------------------
From: Ron Nicholson
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44hchu$fe5@murrow.corp.sgi.com>
······@csugrad.cs.vt.edu (Nathan Urban) wrote:

>Really, the only language that tells you how a computer really operates
>is assembly. 

The language that tells you even more about how a computer really operates
is schematic diagrams and gate level logic.  There are several "edutainment"
products that easily allow someone to play with logic gates.  "Rocky's Boots"
was the first of these, circa 1981 or so.

---
Ronald H. Nicholson, Jr.                ···@engr.sgi.com, ···@netcom.com
#include <canonical.disclaimer>         // I speak only for myself, etc.
From: Michael Dillon
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44l9ob$bfb@felix.junction.net>
In article <···········@b.stat.purdue.edu>,
Herman Rubin <······@b.stat.purdue.edu> wrote:

>But I would like to raise another point, which is similar to the 
>points I have raised about teaching mathematics, reading, etc.
>
>None of these languages gives any indication as to how a computer
>operates.  Just as learning how to do arithmetic gives no understanding
>of mathamtics, likewise learning to program with a restricted collection
>of constructs gives no understanding of computers, or even how 
>computer languages operate.
>
>I do not know what is available at a low level.  It is even hard
>to find much at higher levels now.  Possibly those in the field
>could comment.

At a low level you need to have a skilled person supervise or at least an 
adult who can translate the technical books to kid speak. But any old XT 
or 286 with MS-DOS comes with DEBUG which is as low level as you can get. 
Since the debug command has a rudimentary assembler it can be used to 
write machine code programs and utilities. Any book on 8086 assembler 
will get you started.

For the Macintosh world I recommend MOPS, which is an object-oriented 
FORTH. While MOPS itself is kind of high-level, it comes with a 68000 
assembler which is used to assemble the MOPS kernel from the source code 
and you can do all kinds of low-level stuff in MOPS. MOPS is freeware.
http://www.netaxs.com/~jayfar

Another possibility is to get the October 1995 BYTE magazine and to read 
about the small microcontroller systems built around Z-80 and 6809 etc. 
They can be used to build all kinds of small control systems and robots 
and work well in conjunction with LEGO Technics. There are a bunch of 
robot enthusiasts in comp.robotics.misc who will point you to books, 
magazines, WWW sites etc...


-- 
Michael Dillon                                    Voice: +1-604-546-8022
Memra Software Inc.                                 Fax: +1-604-542-4130
http://www.memra.com                             E-mail: ·······@memra.com
From: Kent Paul Dolan
Subject: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <xanthian-0210951246310001@kdolan-mac.qualcomm.com>
In article <···········@b.stat.purdue.edu>, ······@b.stat.purdue.edu
(Herman Rubin) wrote:

> None of these languages gives any indication as to how a computer
> operates.

And of course, the modern paradigm of abstracting away whatever is not
pertinent to the task at hand makes modern programming languages' habit of
making the underlying processor complexities invisible exactly the correct
choice.

By an accident of history, I happen to understand exactly how an adder
works, down to the transistor level.

Under no circumstances do I want to be bothered with that level of detail
when the task at hand is adding A to B to yield C, and then printing the
value contained in C.

It's a bit hard on us fogies who paid our dues in learning computers down
to the bit-processing level that the newbies don't value our wisdom and
experience or want to follow our course in learning computers from the
metal up, but for the _vast_ _majority_ of programmers, working at the
applications level, such knowledge today is worse than useless: it is
distracting.

Go read Robert Pirsig's _Lila_ enough times to internalize the concepts
there, recognize the ratchet in operation here for what it is, and get on
with life.  Times have changed, and what is valued as "quality" in
computer knowledge for the average programmer has changed to accommodate
present reality.

-- 
Xanthian.                  | MS-Windows versus OS/2, MacOS versus AmigaOS: | 
Kent, the man from xanth.  |       Gresham's Law applies to software!      |
Kent Paul Dolan            -------------------------------------------------
········@{well,qualcomm}.com   I speak for neither The Well nor Qualcomm.
From: Erann Gat
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44rpoa$qqt@lo-fan.jpl.nasa.gov>
Kent Dolan writes:

>By an accident of history, I happen to understand exactly how an adder
>works, down to the transistor level.
>
>Under no circumstances do I want to be bothered with that level of detail
>when the task at hand is adding A to B to yield C, and then printing the
>value contained in C.
>

Then I hope you never get a job writing a safety-critical piece of software.
High level languages provide the illusion that you can add two numbers without
having to worry about how it is done, but the illusion has cracks around the
edges.  Sometimes 30000 + 30000 results in an error.  Sometimes it results in a
negative number.  Sometimes 0.1+0.1+0.1+0.1+0.1+0.1+0.1+0.1 is 0.7999999.
Unless you understand the hardware these cracks in the illusion will eventually
cause your software to fail.  Certainly you can get a lot done without such
deep understanding, but to say that under no circumstances do you want to be
bothered is a dangerous and naive attitude.
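[Both failure modes are easy to reproduce. A small Python sketch of my own, not Erann's: it uses `ctypes.c_int16` to stand in for a machine with 16-bit integers, a word size his 30000 + 30000 example implies but does not name.]

```python
import ctypes

# 16-bit signed addition: 60000 does not fit in an int16, so the
# stored result wraps around to a negative number.
wrapped = ctypes.c_int16(30000 + 30000).value
print(wrapped)              # -5536, not 60000

# Binary floating point: eight 0.1s do not sum to exactly 0.8,
# because 0.1 has no exact base-2 representation.
total = 0.0
for _ in range(8):
    total += 0.1
print(total, total == 0.8)  # close to 0.8, but not equal to it
```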

E.

-- 
---
Erann Gat      ···@jpl.nasa.gov     (818) 306-6176
From: Kent Paul Dolan
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <xanthian-0910951354090001@kdolan-mac.qualcomm.com>
In article <··········@lo-fan.jpl.nasa.gov>, Erann Gat <gat> wrote:

> Kent Dolan writes:
> 
> >By an accident of history, I happen to understand exactly how an adder
> >works, down to the transistor level.
> >
> >Under no circumstances do I want to be bothered with that level of detail
> >when the task at hand is adding A to B to yield C, and then printing the
> >value contained in C.
> >
> 
> Then I hope you never get a job writing a safety-critical piece of software.
> High level languages provide the illusion that you can add two numbers without
> having to worry about how it is done, but the illusion has cracks around the
> edges.  Sometimes 30000 + 30000 results in an error.  Sometimes it results in
> a negative number.  Sometimes 0.1+0.1+0.1+0.1+0.1+0.1+0.1+0.1 is 0.7999999.
> Unless you understand the hardware these cracks in the illusion will
> eventually cause your software to fail.  Certainly you can get a lot done 
> without such deep understanding, but to say that under no circumstances do you 
> want to be bothered is a dangerous and naive attitude.

I think the correct answer has been implied in the responses.  If you are
writing a safety critical application in a language in which you must add
30000 plus 30000, but cannot trust the results due to inadequate
visibility in the language to the limitations of integer addition, then
the correct solution is not to learn the hardware in obsessive detail, but
to select a language in which you can directly specify that integer
results must be correct at least to sums of 60000.  Ada is one such
language.
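[The Ada idea of declaring the range your results must satisfy and letting the language enforce it can be imitated with a toy class. A rough Python sketch, purely illustrative: Python's own integers never overflow, so `RangedInt` only models the declared-range discipline, not real hardware behavior.]

```python
class RangedInt:
    """Toy imitation of an Ada-style ranged integer type: any value or
    result outside the declared range raises instead of wrapping."""

    def __init__(self, value, lo=0, hi=60000):
        if not lo <= value <= hi:
            raise OverflowError(f"{value} outside declared range {lo}..{hi}")
        self.value, self.lo, self.hi = value, lo, hi

    def __add__(self, other):
        # The sum is checked against the declared range the moment it
        # is produced, not discovered later as a corrupted result.
        return RangedInt(self.value + other.value, self.lo, self.hi)

a = RangedInt(30000)
b = RangedInt(30000)
print((a + b).value)        # 60000: within the declared range

try:
    RangedInt(30001) + b    # 60001 exceeds the declared range
except OverflowError as err:
    print("rejected:", err)
```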

More concisely, writing "safe" software in unsafe languages is unfruitful.

Certainly in the case of programming in the large, where a 100 to 1
variance in professional programmer skills and productivity is documented,
use of unsafe languages leads to, e.g., OSs that continually crash,
products widely available today.

To return to the original topic, a programmer's first exposure to the
computer is none too soon to begin teaching good programming practice,
including skills that will be useful for the programming in the large that
is the eventual destiny of most of us who earn our livings creating
software.

My goal, if I ever gain the skills, is to create a programming language
which binds comments tightly to programming constructs, in such a way that
_omitting_ internal documentation becomes a conscious act.  Were I so
talented, I'd add a tight binding to formal proof of the program, so that
the code and the proof that it works were written as a seamless whole.

"Come the revolution ..." the geezer said ...

-- 
Xanthian.                  | "..want the consequences of what you want.." | 
Kent, the man from xanth.  |        Neil A. Maxwell, LDS Apostle          |
Kent Paul Dolan            ------------------------------------------------
········@{well,qualcomm}.com     Jobhunting?  Check www.qualcomm.com!
From: ·····@imap1.asu.edu
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44lmqs$pap@news.asu.edu>
Try Prolog.  It utilizes a different paradigm for programming.  Although 
it probably isn't much use for graphics.

--
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-
"The road goes ever on and on."
	- Bilbo Baggins, as quoted in _The C++ Programming Language._
-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
From: Herman Rubin
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44rfv4$1qnq@b.stat.purdue.edu>
In article <··········@dsinc.myxa.com>, Randy Brown <···@banjo.dsi.com> wrote:
>·····@imap1.asu.edu writes:

>>Try Prolog.  It utilizes a different paradigm for programming.  Although 
>>it probably isn't much use for graphics.

>This is the second recommendation for Prolog in this thread.  I find this
>very disturbing in that it seems to suggest that predicate logic is
>somehow easier to understand than, say an imperative description, for our
>hypothetical nine-year-old.  I think this betrays a certain lack of
>empathy.  I've tried to show my little brother (13) a little predicate
>logic, and it's not basic.

><begin flame bait>
>In fact, predicate logic isn't more basic at all --- it's just a 
>restricted toy that we mathematicians find easy to play with.  As
>a formalization, it may or may not be vaguely close to the way
>we reason (watch developments in cognitave science) but it's usefulness
>is primarily in it's simplicity.  Mathematicians should not
>confuse simplicity with comprehensibility.
><end flame bait>

This attempt to sanctify how people actually jump to conclusions,
rather than finding out how the same problems can be done in a
logical manner, is deplorable.  We should not be concerned, in
teaching anything other than psychology, with how people "reason", nor
should we try to emulate this.  This is especially true if, as seems
to be the case especially in reasoning in the face of uncertainty,
the reasoning is found to be self-inconsistent.

Formal logic looks at a minimal set of processes, shows that these
have some desirable properties, and proceeds by using ONLY those.

As for being basic, formal logic is THE basic part of mathematics.
Now one cannot really teach a little predicate logic, any more than
one can teach reading starting with one letter.

Formal logic has been successfully taught to 10 year olds, and I
believe it should be taught a couple of years earlier.

>I think it's much easier to explain imperative programming to
>a nine year old, because they already have some idea of what it means
>to follow directions.  You can probably teach them an impure functional
>language like scheme (which still has some idea of step, then next step,
>without the necessity of destructive assignment).  But I think you'd
>have a hard time teaching them prolog, at least until they've had
>11th-grade geometry.

What is a proof but step, then next step, etc.?  The choice of what 
to do at any step may be less obvious.

I think that anyone who can learn rigorous geometry, especially that
late, can learn logic at age 9 if not earlier.  This does not address
the problem of the proper programming language, though.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
······@stat.purdue.edu	 Phone: (317)494-6054	FAX: (317)494-0558
From: Randy Brown
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <44rkqq$6ic@dsinc.myxa.com>
······@b.stat.purdue.edu (Herman Rubin) writes:

>In article <··········@dsinc.myxa.com>, Randy Brown <···@banjo.dsi.com> wrote:
>>·····@imap1.asu.edu writes:
>><begin flame bait>
>>In fact, predicate logic isn't more basic at all --- it's just a 
>>restricted toy that we mathematicians find easy to play with.  As
>>a formalization, it may or may not be vaguely close to the way
>>we reason (watch developments in cognitive science) but its usefulness
>>is primarily in its simplicity.  Mathematicians should not
>>confuse simplicity with comprehensibility.
>><end flame bait>

>This attempt to sanctify how people actually jump to conclusions,
>rather than finding out how the same problems can be done in a
>logical manner, is deplorable.  Outside of psychology, we should not be
>concerned with teaching how people "reason", nor should we try to
>emulate this.  This is especially true if, as seems to be the case in
>reasoning in the face of uncertainty, that reasoning is found to be
>self-inconsistent.

I dispute "sanctify" and I probably overstated what I meant to say.
Nor am I trying to say that how people reason is to be admired.  There
are some situations in which the vast majority of people will come
to the incorrect conclusion, and this is not to be admired.  What I
am saying is that logic is hard, for the very reason that it is a
formalization, and that people who have been working with it forget
that.

>As for being basic, formal logic is THE basic part of mathematics.

Well, the basis of mathematical systems.  And what I meant by
basic was basic to the human mind, not basic to the human construct
of mathematics.

>Now one cannot really teach a little predicate logic, any more than
>one can teach reading starting with one letter.

Right.  That's why it's hard.  The child would have to know formal
logic for Prolog to make any sense at all.  Now perhaps kids should
be taught logic really early --- young kids will sometimes catch
onto things fast.  Hmm, teaching reading is a good analogy:
it's a basic skill which is required to go further in a field.

>>I think it's much easier to explain imperative programming to
>>a nine year old, because they already have some idea of what it means
>>to follow directions.  You can probably teach them an impure functional
>>language like scheme (which still has some idea of step, then next step,
>>without the necessity of destructive assignment).  But I think you'd
>>have a hard time teaching them prolog, at least until they've had
>>11th-grade geometry.

>What is a proof but step, then next step, etc.?  The choice of what 
>to do at any step may be less obvious.

But the kind of step is very different.

>I think that anyone who can learn rigorous geometry, especially that
>late, can learn logic at age 9 if not earlier.  This does not address
>the problem of the proper programming language, though.

I was thinking of the 9 year old who wanted to learn about programming
computers, and who has the normal education of a 9 year old.  Not that
such a creature definitely exists.  But teaching Prolog means teaching
predicate logic, and then Prolog, and then you have to go back and
explain the operational behaviour of Prolog to explain the pretty
pictures that any but the most motivated 9 year old will want to see
drawn by their program.

	-Randy
From: Rayid Ghani
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <452cp7$1hv@cherub.sewanee.edu>
In my opinion, you should try LOGO.  It's not very powerful, but for a 9
year old it can prove to be a good learning experience.  It has a lot
of graphics stuff.
From: Mark McConnell
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <453usf$1e34@bubba.ucc.okstate.edu>
Try Logo.  You give commands which make a "turtle" (read: pointer) move
around on the screen.  It's also good for other kinds of graphics.  The main
point is that the language is well built.  It encourages recursion, writing
small functions and larger programs that call the functions, etc.  It is
a descendant of Lisp-- and please don't flame Lisp or make parentheses
jokes.  Because it comes from Lisp, it's good for building systems that
grow organically, with little modules being added on at each programming
session; before you know it, you've built up some very substantial software
systems.

Look on the Web under Yahoo, Programming Languages, Logo.  There was a long
article by someone about how his 7-year-old daughter and he learned Logo.

Now for myself, I may introduce my 7-year-old daughter to Lisp itself, next
year or so, if she's interested....
From: Alberto C Moreira
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <alberto.456.0008D1B9@moreira.mv.com>
In article <··········@dsinc.myxa.com> ···@banjo.dsi.com (Randy Brown) writes:


>This is the second recommendation for Prolog in this thread.  I find this
>very disturbing in that it seems to suggest that predicate logic is
>somehow easier to understand than, say an imperative description, for our
>hypothetical nine-year-old.  I think this betrays a certain lack of
>empathy.  I've tried to show my little brother (13) a little predicate
>logic, and it's not basic.

><begin flame bait>

       Bait taken!

       There are different ways to look at it.  While it's true that Prolog is 
       based on predicate logic, it is also true that predicate logic is
       one of the basic cornerstones of mathematical thought.  You have
       probably heard more than one person in this group advocate
       that predicate logic be taught to young children.

       But, more than predicate logic, Prolog is based on the splitting of
       one goal into subgoals, each of which is split into further subgoals,
       and so on, until the subgoals are elementary enough to be tackled
       by simple programming constructs.  Programming in this style 
       teaches top-down thinking and a style of programming which I call
       "logical" (as opposed to "imperative", "functional" or "object 
       oriented"). 
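
(A toy illustration, not from the original post: the goal-splitting Alberto
describes can be sketched in a few lines of Python.  The rule base is
hypothetical, and real Prolog adds unification over variables and
backtracking over bindings.)

```python
# A toy backward-chaining prover: a goal succeeds if some rule for it
# exists whose subgoals all succeed, recursively, down to plain facts.
rules = {
    "grandparent(tom,ann)": [["parent(tom,bob)", "parent(bob,ann)"]],
    "parent(tom,bob)": [[]],   # a fact: a rule with no subgoals
    "parent(bob,ann)": [[]],
}

def prove(goal):
    # try each body for this goal; the goal holds if every subgoal holds
    return any(all(prove(sub) for sub in body)
               for body in rules.get(goal, []))

assert prove("grandparent(tom,ann)")    # split into two subgoals
assert not prove("parent(ann,tom)")     # no rule, no fact: fails
```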

       One problem that's very apparent to many people who start with
       algorithmic languages is that those languages make it very cumbersome
       to program anything non-numeric.  The sort of things Prolog expresses
       would generate convoluted programs if written in C++ or Pascal.  For
       example, I once wrote an expert database system for plant
       classification that fit in less than 300 lines of Prolog. 

>In fact, predicate logic isn't more basic at all --- it's just a 
>restricted toy that we mathematicians find easy to play with.  As
>a formalization, it may or may not be vaguely close to the way
>we reason (watch developments in cognitive science) but its usefulness
>is primarily in its simplicity.  Mathematicians should not
>confuse simplicity with comprehensibility.
><end flame bait>

       The problem isn't simplicity or comprehensibility, but being able
       to formally state a theory which, barring pathological cases like those
       treated by Goedel's theorem, gives us fair confidence that what we
       reason is formally correct.  Programming isn't only about cognitive
       science, and isn't just about modelling what exists; it is first and
       foremost about being able to create worlds that don't exist. 

       The basic reason for predicate logic's existence is its usefulness as a
       basis for many mathematical theories.  We have nothing else that
       simple that fits the bill, and the predicate logic model is relatively
       compact and can be applied to a wide range of modelling situations.

>I think it's much easier to explain imperative programming to
>a nine year old, because they already have some idea of what it means
>to follow directions. 

       It may be, and that was the prime idea when people created Fortran
       and Grace Hopper invented Cobol. But times change, and different
       problems appear. We now know that the imperative model can be
       very restrictive for non-algebraic applications; we also know that
       the first programming language a child or youngster learns molds
       his/her programming mind forever; and we have experimental
       results by researchers (for example, Seymour Papert) which seem to
       point elsewhere.

       I do agree with you that, all things considered, imperative languages
       should be taught early. My motives are mostly utilitarian, because
       imperative languages are so important to anyone who wants to make
       money out of programming.  But I'm not sure I agree that predicate
       logic shouldn't be taught at the elementary level, or that Prolog
       is a bad place to start.
       
>You can probably teach them an impure functional
>language like scheme (which still has some idea of step, then next step,
>without the necessity of destructive assignment).  But I think you'd
>have a hard time teaching them prolog, at least until they've had
>11th-grade geometry.

        If I had to choose between Prolog and Scheme, I'd take Prolog without
        blinking.  If I had to take the functional road, I'd use ML rather than 
        Scheme; I find type inference too fundamental a tool not to have it.

        There was a book by Robert Kowalski, one of the creators of Prolog. 
        Trouble is, I read it 15 years ago and I no longer remember either
        its title or its publisher; maybe somebody reading this can supply
        them.  The book explains the principles of logic programming: rather
        than digging into the technicalities of the language, it explains
        what logic programming is all about and gives plenty of examples.


                                                            _alberto_
From: Fergus Henderson
Subject: Re: Wanted: programming language for 9 yr old
Date: 
Message-ID: <9528313.7274@mulga.cs.mu.OZ.AU>
·······@moreira.mv.com (Alberto C Moreira) writes:

>       ... we also know that
>       the first programming language a child or youngster learns molds
>       his/her programming mind forever; ...

We do?  I think that is highly dubious.  The first program I ever wrote
was in COBOL, at the age of about 9; I seem to have recovered.

-- 
Fergus Henderson             |  "Australia is the richest country in the world,
···@cs.mu.oz.au              |   according to a new system of measuring wealth
http://www.cs.mu.oz.au/~fjh  |   announced by the World Bank yesterday."
PGP: finger ···@128.250.37.3 |  - Melbourne newspaper "The Age", 18 Sept 1995.
From: Matthias Blume
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <BLUME.95Oct3135251@atomic.cs.princeton.edu>
In article <··········@lo-fan.jpl.nasa.gov> Erann Gat <gat> writes:

   Kent Dolan writes:

   >By an accident of history, I happen to understand exactly how an adder
   >works, down to the transistor level.
   >
   >Under no circumstances do I want to be bothered with that level of detail
   >when the task at hand is adding A to B to yield C, and then printing the
   >value contained in C.
   >

   Then I hope you never get a job writing a safety-critical piece of software.
   High level languages provide the illusion that you can add two numbers without
   having to worry about how it is done, but the illusion has cracks around the
   edges.  Sometimes 30000 + 30000 results in an error.  Sometimes it results in a
   negative number.  Sometimes 0.1+0.1+0.1+0.1+0.1+0.1+0.1+0.1 is 0.7999999.
   Unless you understand the hardware these cracks in the illusion will eventually
   cause your software to fail.  Certainly you can get a lot done without such
   deep understanding, but to say that under no circumstances do you want to be
   bothered is a dangerous and naive attitude.

This is only true in brain-dead languages.  In a sane language
30000+30000 should either yield 60000 or _raise an exception_.  Both
concepts (60000 and exceptions) do not require the understanding of
how a hardware adder works.  Both concepts can be explained entirely
within the framework of the language.

The accuracy problem for floating point numbers is trickier, but it
also doesn't require you to understand hardware.  Hardware
considerations can serve as an explanation of _why_ inaccuracies
actually occur in practice, but they are not the only possible
explanation.  Furthermore, such explanation is not really necessary.
Or do we also have to understand the reasons why the electromagnetic
force is so much stronger than gravitation in order to understand
electronics in order to understand integrated circuits in order to
understand hardware adders in order to understand contemporary
computers ... in order to understand computing?!
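
(A small illustration, not part of the original post: both of Blume's
points can be demonstrated entirely within a language, here Python, with
no reference to hardware at all.)

```python
# Adding 0.1 eight times does not give exactly 0.8: 0.1 has no finite
# binary representation.  That is a fact of the number system itself,
# not of any particular adder circuit.
total = 0.0
for _ in range(8):
    total += 0.1
assert total != 0.8
assert abs(total - 0.8) < 1e-15   # but the error is tiny and bounded

# And in a language whose integers are unbounded, 30000 + 30000 is
# simply 60000; no hardware story is needed to explain it.
assert 30000 + 30000 == 60000
```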

After all, there are purely logical calculi (e.g. Turing machines,
RAMs, lambda-calculus, denotational semantics, ...), which are
adequate to explain the properties of computing and of programming
languages without any need to mention actual hardware.

Cheers,
--
-Matthias
From: William D Clinger
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44uaf7$krp@camelot.ccs.neu.edu>
In article <····················@moreira.mv.com>
·······@moreira.mv.com (Alberto C Moreira) writes:
>  Numerical inaccuracy is a direct consequence of hardware considerations,
>  embodied in the IEEE Floating Point standard.

No.  Numerical inaccuracy is a direct consequence of finite
representations, which are fundamental to computer science and
apply equally to hardware and software.

As asserted by the original post, it is important that the programmer
understand the abstractions provided by a programming language and its
implementation.  The programmer doesn't need to understand the
implementation itself.

The IEEE Standard for Binary Floating-Point Arithmetic is an excellent
example of this principle.  A few years ago I used this abstraction
to develop and analyze the first efficient algorithm for converting
decimal representations of floating point numbers into their closest
binary floating-point approximations.  I also implemented much of the
IEEE floating-point abstraction as part of a portable implementation
of my algorithm.  So I understand the IEEE floating-point standard far
better than most programmers, and better than most other people who
write compilers.

But I haven't the least idea how floating point is implemented by the
68882, PowerPC, SPARC, MIPS, and Alpha hardware that I use.  Furthermore
I am convinced that the time it would take me to understand those
hardware implementations (if indeed their details were available to me,
which they probably aren't) would be better spent learning how various
programming languages and compilers screw up the IEEE abstraction in
the model they present to the programmer.

In short:  Programmers need to understand the abstractions that are
presented by their high level languages and compilers, not the
abstraction that is presented by the hardware, and certainly not
the implementation of an abstraction in hardware.

William D Clinger
From: Erann Gat
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44uver$cfv@lo-fan.jpl.nasa.gov>
····@ccs.neu.edu (William D Clinger) wrote:

>In short:  Programmers need to understand the abstractions that are
>presented by their high level languages and compilers, not the
>abstraction that is presented by the hardware, and certainly not
>the implementation of an abstraction in hardware.

Except that C, the industry standard, defines many of its abstractions in terms
of the underlying hardware.  If you don't understand the hardware then you
can't write reliable software in C.  (Yet another reason to program in Lisp.)

E.

-- 
---
Erann Gat      ···@jpl.nasa.gov     (818) 306-6176
From: Michael Dillon
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44vts8$vdo@felix.junction.net>
In article <··········@lo-fan.jpl.nasa.gov>, Erann Gat  <gat> wrote:
>····@ccs.neu.edu (William D Clinger) wrote:
>
>>In short:  Programmers need to understand the abstractions that are
>>presented by their high level languages and compilers, not the
>>abstraction that is presented by the hardware, and certainly not
>>the implementation of an abstraction in hardware.
>
>Except that C, the industry standard, defines many of its abstractions in terms
>of the underlying hardware.  If you don't understand the hardware then you
>can't write reliable software in C.  (Yet another reason to program in Lisp.)

How do you know that the 9-year-old is going to have a career as a 
programmer and not as a semiconductor design engineer?

Must the 9-year-old choose their career path and start to specialize in 
grade 3?

Could there be value in teaching a child how the machines of this world 
work? 

My vote is to start the child with LOGO, advance into hardware 
simulations or real machine code, then move onto a commercial BASIC that 
can do the nifty graphics stuff used by realworld commercial programs.
From there some kids may want to branch into electronics/robotics using C 
and machine code and others may wish to branch into multimedia software 
design. That should do it up to Grade 8.


-- 
Michael Dillon                                    Voice: +1-604-546-8022
Memra Software Inc.                                 Fax: +1-604-542-4130
http://www.memra.com                             E-mail: ·······@memra.com
From: Alberto C Moreira
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <alberto.459.0016DB1C@moreira.mv.com>
In article <··········@camelot.ccs.neu.edu> ····@ccs.neu.edu (William D Clinger) writes:

     Gosh, it's hard to disagree with somebody of the stature of Will Clinger. 

     Yet I'll try; I believe that sometimes it is necessary to leave the safe
     grounds of abstraction and dig into specifics.

>·······@moreira.mv.com (Alberto C Moreira) writes:
>>  Numerical inaccuracy is a direct consequence of hardware considerations,
>>  embodied in the IEEE Floating Point standard.

>No.  Numerical inaccuracy is a direct consequence of finite
>representations, which are fundamental to computer science and
>apply equally to hardware and software.

      I believe we're saying more or less the same thing; one major reason
      we use finite representations is our hardware limitations.
      If our hardware could represent numbers to a precision much larger than 
      the highest precision needed by any application, finite representation   
      wouldn't hurt numerical accuracy.

      Also, today the difference between hardware and software is very blurred.
      If I write it in ML I call it software; if I write the same thing in HDL 
      I call it hardware. In the company I work for, one of the sourest 
      decisions is whether some high-level graphics function will be coded
      into the hardware or left for the device driver programmers to handle.

      I also believe that the programming abstraction is a real number, yet 
      when programming applications it is unavoidable that the actual 
      representation be taken into consideration, especially when 
      we hit the boundaries of that representation.
 
      The limits placed on structure (a·2^b, a·16^b) 
      and precision (both a and b are bounded within an interval) 
      must be taken into account in the abstraction itself; in a sense, 
      programming languages don't offer real numbers but an approximation 
      thereof, consisting of two bounded values and a few other bits and
      pieces.  While it may be convenient to forget this and use the 
      abstraction as if we had real numbers, many problems will get close 
      enough to our precision or structural boundaries that those must 
      be taken into account.
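
(An illustrative aside, not from the post: Python's standard library
exposes exactly this bounded two-part decomposition of a float.)

```python
import math
import sys

# A Python float is exactly m * 2**e with a bounded significand m and a
# bounded exponent e; math.frexp exposes that decomposition.
m, e = math.frexp(0.1)
assert m * 2**e == 0.1                 # the decomposition is exact
assert 0.5 <= m < 1.0                  # m is confined to an interval
assert sys.float_info.mant_dig == 53   # and carries only 53 bits
```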

>As asserted by the original post, it is important that the programmer
>understand the abstractions provided by a programming language and its
>implementation.  The programmer doesn't need to understand the
>implementation itself.

      You're very right that the programmer must understand the abstractions
      provided by the programming language. Yet, I'm not sure he/she doesn't
      need to understand representation details, because they play a 
      significant role when accuracy is required at boundary conditions. 

>The IEEE Standard for Binary Floating-Point Arithmetic is an excellent
>example of this principle.  A few years ago I used this abstraction
>to develop and analyze the first efficient algorithm for converting
>decimal representations of floating point numbers into their closest
>binary floating-point approximations.  I also implemented much of the
>IEEE floating-point abstraction as part of a portable implementation
>of my algorithm.  So I understand the IEEE floating-point standard far
>better than most programmers, and better than most other people who
>write compilers.

       I believe you. 

>But I haven't the least idea how floating point is implemented by the
>68882, PowerPC, SPARC, MIPS, and Alpha hardware that I use.  Furthermore
>I am convinced that the time it would take me to understand those
>hardware implementations (if indeed their details were available to me,
>which they probably aren't) would be better spent learning how various
>programming languages and compilers screw up the IEEE abstraction in
>the model they present to the programmer.

       Again, if the kind of computing I do doesn't bang against the 
       walls of the representation, I'll probably never need to know 
       what's going on under the hood.  But if I need to squeeze out that 
       extra bit of precision, or if I must give stability to an unstable 
       computation, then exact knowledge of the representation may give 
       me that extra bonus. 

>In short:  Programmers need to understand the abstractions that are
>presented by their high level languages and compilers, not the
>abstraction that is presented by the hardware, and certainly not
>the implementation of an abstraction in hardware.

       I would think a programmer should know all of the above. 
       But, again, I'm talking as a professional programmer rather
       than as a computer scientist.

                   
                                                           _alberto_
From: Herman Rubin
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <4510nl$1hva@b.stat.purdue.edu>
In article <··········@camelot.ccs.neu.edu>,
William D Clinger <····@ccs.neu.edu> wrote:
>In article <····················@moreira.mv.com>
>·······@moreira.mv.com (Alberto C Moreira) writes:
>>  Numerical inaccuracy is a direct consequence of hardware considerations,
>>  embodied in the IEEE Floating Point standard.

>No.  Numerical inaccuracy is a direct consequence of finite
>representations, which are fundamental to computer science and
>apply equally to hardware and software.

But should this be the case in the software?  The arithmetic taught to
children of this age does not have this limitation.  All of the
programming languages I have seen directly support only fixed-accuracy
computation.  This amounts to telling children that addition and
multiplication only work up to a fixed number of digits; that is what
comes out of the computer languages.

Would you teach a child that one can multiply 9-digit integers, but only
the least significant 9 digits of the result are available?  With many
computers, this is the hardware, and the software does not help.  Also,
for decimals, you can only work with a power of 10 times a 15-digit number
after the decimal point.  Likewise, this is the hardware, and the software
gives even less assistance.
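
(An illustrative sketch, not from the post: the truncation Rubin
describes, simulated in Python, whose own integers are unbounded.)

```python
# Multiplying two 9-digit integers and keeping only the low 9 digits of
# the result, as fixed-width hardware arithmetic would.
a = 999_999_999
b = 999_999_999
full = a * b               # Python keeps every digit of the product
low9 = full % 10**9        # a 9-digit register would retain only this
assert full == 999_999_998_000_000_001
assert low9 == 1           # almost all of the information is lost
```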

>As asserted by the original post, it is important that the programmer
>understand the abstractions provided by a programming language and its
>implementation.  The programmer doesn't need to understand the
>implementation itself.

The programmer does not need to understand how the implementation itself
is carried out, but should understand exactly what it does.  Also, what
else the computer can do should be of interest.  This is even hard for
people who understand computers to figure out.

			.......................

>In short:  Programmers need to understand the abstractions that are
>presented by their high level languages and compilers, not the
>abstraction that is presented by the hardware, and certainly not
>the implementation of an abstraction in hardware.

Exactly what is provided by hardware needs to be understood if one is
to do a good job in the quite numerous situations not anticipated by
the language designers.  This is not trivial, and not adequately 
covered by any of the current high level languages.
-- 
Herman Rubin, Dept. of Statistics, Purdue Univ., West Lafayette IN47907-1399
······@stat.purdue.edu	 Phone: (317)494-6054	FAX: (317)494-0558
From: Ron Nicholson
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <451r87$8q7@murrow.corp.sgi.com>
······@b.stat.purdue.edu (Herman Rubin) wrote:
>Would you teach a child that one can multiply 9 digit integers, but only
>the least 9 digits of this arithmetic operation are available? 

Believe it or not, at least one school district allows students to use
calculators soon after the kids learn how to do arithmetic on 3 digit
numbers by hand.  These kids already know that the number of digits in
a calculation is limited by the size of the calculator display.

---
Ronald H. Nicholson, Jr.                ···@engr.sgi.com, ···@netcom.com
#include <canonical.disclaimer>         // I speak only for myself, etc.
From: Fergus Henderson
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <9528106.27188@mulga.cs.mu.OZ.AU>
······@b.stat.purdue.edu (Herman Rubin) writes:

>William D Clinger <····@ccs.neu.edu> wrote:
>
>>No.  Numerical inaccuracy is a direct consequence of finite
>>representations, which are fundamental to computer science and
>>apply equally to hardware and software.
>
>But should this be the case in the software?  The arithmetic taught to
>children of this age already does not have this limitation.  All of the
>programming languages I have seen only discuss directly fixed accuracy
>computation.  

The programming language Goedel provides arbitrary precision integers
and rational numbers. In fact Goedel does not have any
bounded-precision integer type at all, although it does have a
fixed-accuracy floating point type.

In many languages (Haskell and C++, to name just two) it is easy to 
write a library which provides arbitrary precision integers and
rational numbers.
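
(As a sketch, not from the post: Python's standard library offers the
same facilities, which makes Fergus's point concrete.)

```python
from fractions import Fraction

# Exact rational arithmetic from the standard library: no rounding ever
# occurs, so (2/3) * (3/4) is exactly 1/2.
assert Fraction(2, 3) * Fraction(3, 4) == Fraction(1, 2)

# Arbitrary-precision integers come built in.
assert 12345678901234567890 * 2 == 24691357802469135780
```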

-- 
Fergus Henderson             |  "Australia is the richest country in the world,
···@cs.mu.oz.au              |   according to a new system of measuring wealth
http://www.cs.mu.oz.au/~fjh  |   announced by the World Bank yesterday."
PGP: finger ···@128.250.37.3 |  - Melbourne newspaper "The Age", 18 Sept 1995.
From: Brian Harvey
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <458t1d$jkf@agate.berkeley.edu>
Hey, gang, I think this thread has reached the point where those of you
who want to continue it should narrow the number of newsgroups involved.
I recommend comp.lang.misc; specifically, the relevance to comp.lang.logo
is asymptotically approaching zero.  IMHO of course.
From: Thant Tessman
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44ubkm$ks@louie.disney.com>
In article <····················@moreira.mv.com>,
·······@moreira.mv.com writes:

[...]

>      It is important to understand adders in order to be able to do 
>      programming. Basic gate logic and state machine understanding
>      is considered fundamental in many a CS course, and not without
>      reason. It is a fact today that more and more software is disappearing
>      into hardware, while chip design is more and more software design.
>      Denying that basic hardware knowledge is fundamental to a CS
>      major is, in my opinion, not a good idea in today's world.

[...]

I know nothing about hardware.  I've never taken a CS course in my
life.  But I do know that the implementation of Scheme that I use
(Chez) makes it far easier to get the right mathematical answer than a
language close to the hardware like C or C++.

	(+ 12345678901234567890 12345678901234567890)
	  => 24691357802469135780

	(sqrt -1)
	  => 0+1i

	(* (/ 2 3) (/ 3 4))
	  => 1/2

	(exact? (/ 2 3))
	  => #t

	(exact? (sqrt 2))
	  => #f

At the same time, it makes provisions to give up this accuracy in
exchange for performance.  (If I want the performance, yeah, I should
know something about IEEE floating-point standards in order to know
something about the answers I'm going to get.)

Of course, the language implementors better know hardware, but there
is no doubt in my mind that a program written in a non-brain-dead
language by someone who doesn't know how the hardware adds numbers is
far more likely to be safe than a program written in a low-level
language where the programmer needs to know how the hardware works.

-thant
From: Erann Gat
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <44uv80$cfv@lo-fan.jpl.nasa.gov>
·····@atomic.cs.princeton.edu (Matthias Blume) wrote:
>In article <··········@lo-fan.jpl.nasa.gov> Erann Gat <gat> writes:
>   Sometimes 30000 + 30000 results in an error.  Sometimes it results in a
>   negative number.
>
>This is only true in brain-dead languages.

Then I suppose you consider C a brain-dead language.  I can't say that I
disagree with you, but it is the industry standard, and so I stand by my
position that a knowledge of hardware limitations is essential to writing
reliable software.
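
(A sketch, not from the post: the wraparound Gat refers to, simulated in
Python under the assumption of 16-bit two's-complement arithmetic, as on
the 16-bit C compilers of the day.)

```python
# Simulating 16-bit two's-complement addition: 30000 + 30000 exceeds
# 32767, wraps modulo 2**16, and comes out negative.
def add_int16(a, b):
    s = (a + b) & 0xFFFF                    # keep the low 16 bits
    return s - 0x10000 if s >= 0x8000 else s

assert add_int16(30000, 30000) == -5536     # overflow: negative result
assert add_int16(1, 2) == 3                 # well-behaved in range
```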

E.
-- 
---
Erann Gat      ···@jpl.nasa.gov     (818) 306-6176
From: Matt Austern
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <MATT.95Oct4165951@godzilla.EECS.Berkeley.EDU>
In article <··········@lo-fan.jpl.nasa.gov> Erann Gat <gat> writes:

> Except that C, the industry standard, defines many of its
> abstractions in terms of the underlying hardware.  If you don't
> understand the hardware then you can't write reliable software in C.
> (Yet another reason to program in Lisp.)

Depends on what "understand the hardware" means.  You need to know
that integral values are represented in binary, you need to have some
vague idea about the sizes of various types (and you ought to
understand what the things in limits.h mean), you might want to 
know at least a tiny bit about addressing schemes.  

I wouldn't describe any of those things as understanding the hardware.
You can write lots of things in C without being able to draw the
schematics of an adder circuit, without knowing the difference between
static and dynamic RAM, without knowing what signals your computer's
bus uses, without knowing the buffering scheme of your computer's I/O
ports, without knowing how cache controllers work, without knowing the
voltage levels of your CPU.  What you need to know, whether you're
working in C or in any other language, is a very specific and limited
software abstraction of your hardware.

I also question describing C as "the industry standard": it's too
grandiose a phrase.  C is a popular and sometimes useful language, no
more and no less.
-- 
  Matt Austern                             He showed his lower teeth.  "We 
  ····@physics.berkeley.edu                all have flaws," he said, "and 
  http://dogbert.lbl.gov/~matt             mine is being wicked."
From: Erann Gat
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <4512mk$96a@lo-fan.jpl.nasa.gov>
····@godzilla.EECS.Berkeley.EDU (Matt Austern) wrote:

>You can write lots of things in C without being able to draw the
>schematics of an adder circuit, without knowing the difference between
>static and dynamic RAM, without knowing what signals your computer's
>bus uses, without knowing the buffering scheme of your computer's I/O
>ports, without knowing how cache controllers work, without knowing the
>voltage levels of your CPU.

Yes, but that proves nothing.  I can change my oil without knowing how my car
works; that doesn't make me an effective mechanic.  There are many things you
cannot do without knowing these things.  If you don't know the difference
between static and dynamic RAM and how your cache controller works then one
day you will write control software for an X-ray machine and wonder why it
killed someone with an overdose because your code is suddenly running three
times slower than it did on your development system.  (If you think this is an
improbable scenario then you really need to start reading comp.risks.)  The
insidious thing about this sort of ignorance is that not only do programmers
not know, they don't know that they don't know, and they don't know why what
they don't know is important until it causes a problem.  Most of the time
these problems just result in extra debugging time, but every now and then
someone gets hurt or killed.

E.

-- 
---
Erann Gat      ···@jpl.nasa.gov     (818) 306-6176
From: Mike Kenney
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <451f5d$gd5@nntp5.u.washington.edu>
In article <··········@lo-fan.jpl.nasa.gov>, Erann Gat  <gat> wrote:
>····@godzilla.EECS.Berkeley.EDU (Matt Austern) wrote:
>
>>You can write lots of things in C without being able to draw the
>>schematics of an adder circuit, without knowing the difference between
>>static and dynamic RAM, without knowing what signals your computer's
>>bus uses, without knowing the buffering scheme of your computer's I/O
>>ports, without knowing how cache controllers work, without knowing the
>>voltage levels of your CPU.
>
>Yes, but that proves nothing.  I can change my oil without knowing how my car
>works; that doesn't make me an effective mechanic.  There are many things you

True.  However, programmers tend to be more specialized than auto mechanics.

>cannot do without knowing these things.  If you don't know the difference
>between static and dynamic RAM and how your cache controller works then one
>day you will write control software for an X-ray machine and wonder why it
>killed someone with an overdose because your code is suddenly running three
>times slower than it did on your development system.  (If you think this is an
>improbable scenario then you really need to start reading comp.risks.)  The
>insidious thing about this sort of ignorance is that not only do programmers
>not know, they don't know that they don't know, and they don't know why what
>they don't know is important until it causes a problem.  Most of the time
>these problems just result in extra debugging time, but every now and then
>someone gets hurt or killed.

For a systems programmer, especially when dealing with embedded systems,
knowledge of the hardware and it's limitations is necessary.  However, I
see no need for the typical "applications" programmer to worry about
such issues.


-- 
-----------------------------------------------------------------------------
Mike Kenney			 Windoze 95  -  "You make a grown man cry"
·····@apl.washington.edu
-----------------------------------------------------------------------------
From: Matthias Blume
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <BLUME.95Oct4105052@atomic.cs.princeton.edu>
In article <····················@moreira.mv.com> ·······@moreira.mv.com (Alberto C Moreira) writes:

   In article <··················@atomic.cs.princeton.edu> ·····@atomic.cs.princeton.edu (Matthias Blume) writes:

   >This is only true in brain-dead languages.  In a sane language
   >30000+30000 should either yield 60000 or _raise an exception_.  Both
   >concepts (60000 and exceptions) do not require the understanding of
   >how a hardware adder works.  Both concepts can be explained entirely
   >within the framework of the language.

	  If we were to raise an exception every time a floating point operation 
	  loses precision, scientific programming would be an unfeasible
	  proposition.

It is a good idea to read one another's articles to the end before
spouting off with one's counterarguments.  I was not talking about
floating point here.  Moreover, I said _myself_ that floating point
and loss of precision _are_ a more complicated issue.

There is no excuse on today's hardware for not raising an exception on
integer overflow, because almost all processors implement this in
hardware and don't slow down the common case of no overflow.

   >The accuracy problem for floating point numbers is trickier, but it
   >also doesn't require you to understand hardware.  Hardware
   >considerations can serve as an explanation of _why_ inaccuracies
   >actually occur in practice, but they are not the only possible
   >explanation.

	  Numerical inaccuracy is a direct consequence of hardware considerations,
	  embodied in the IEEE Floating Point standard.

Did I say anything to the contrary?  My point is that such things are
also understandable if you don't know the details of IEEE standards.
If I remember correctly, we are talking about teaching
programming to nine-year-olds, and I hope you are not seriously
suggesting assaulting nine-year-olds with such a document.

It can be explained quite simply why floating point computation can
and will lose precision.  Everybody learns about long division in
school, and this pretty much suffices to talk about the basics.

          Yes, it does help a lot to
	  know the standard and how it's implemented, and why bits are dropped;

The latter is actually trivial and follows directly from the
limitations of the binary (or decimal, or ...) representation of
numbers.  You do _not_ need to mention hardware in order to explain
it.

          Basic gate logic and state machine understanding
	  is considered fundamental in many a CS course,

IMO, gate logic belongs to EE.  The underlying theories of boolean
algebra, automata, formal languages, computability, etc., are
indeed math and as such well worth knowing for any computer scientist.
But they can be taught without any reference to transistors, voltage,
current, wires, etc.

          and not without
	  reason. It is a fact today that more and more software is disappearing
	  into hardware; while chip design is more and more software design.
	  Denying that basic hardware knowledge is fundamental to a CS
	  major is, in my opinion, not a good idea in today's world.

We are still talking about 9-year olds, not CS majors!

Regards,
--
-Matthias
From: Alberto C Moreira
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <alberto.460.00013CCC@moreira.mv.com>
In article <··················@atomic.cs.princeton.edu> ·····@atomic.cs.princeton.edu (Matthias Blume) writes:

>It is a good idea to read one another's articles to the end before
>spouting off with ones counter arguments.  I was not talking about
>floating point here.  Moreover, I said _myself_ that floating point
>and loss of precision _are_ more complicated an issue.

     My point was that knowledge of hardware-level representation is a
     good idea; the integer example touches on it, but the floating
     point example goes more directly to the point I was advocating.

>There is no excuse on today's hardware for not raising an exception on
>integer overflow, because almost all processors implement this in
>hardware and don't slow down the common case of no overflow.

      I have mixed reactions about this. If I depend on the machine's
      interrupt-handling speed, I'm not too sure I want it to waste
      time interrupting. On the average RISC processor a test for
      overflow is pipelined and takes zero cycles; an interrupt takes
      a number of them, plus two context switches.

>Did I say anything to the contrary?  My point is that such things are
>also understandable if you don't know the details of IEEE standards.
>If I remember correctly, we are talking about teaching
>programming to nine-year-olds, and I hope you are not seriously
>suggesting assaulting nine-year-olds with such a document.

     At this point, the subject has evolved far beyond nine-year-olds!

>It can be explained quite simply why floating point computation can
>and will lose precision.  Everybody learns about long division in
>school, and this pretty much suffices to talk about the basics.

    True enough. But that's not quite the point I was trying to make:
    in any sort of real programming it's not enough to know that
    floating point loses precision; you also need to know how to
    minimize that loss of precision.

>The latter is actually trivial and follows directly from the
>limitations of the binary (or decimal, or ...) representation of
>numbers.  You do _not_ need to mention hardware in order to explain
>it.

    Except that the flow is reversed: we represent numbers the way we
    do only because of the limitations imposed by the hardware and the
    desire for a representation that's standard across a fair number
    of hardware platforms. Yes, you can generalize away from the low
    level, but a bit is still a bit, and mantissas and exponents only
    have so many bits; whether a bit is a hardware transistor or a 0-1
    digit in base 2 is immaterial, we're still dealing with limitations
    enforced by today's hardware.

>IMO, gate logic belongs to EE.  The underlying theory of boolean
>algebra, of automata, of formal languages, of computability, etc, are
>indeed math and as such well worth knowing for any computer scientist.
>But they can be taught without any references to transitors, voltage,
>current, wires, etc.

    I don't agree. This is the age of HDL and VHDL. An algorithm can be
    programmed in ML and be called software, or it can be programmed
    in HDL and be called hardware. A state machine works the same way
    whether it's in a lexer or inside a chip. Gates and logic can be
    taught without mention of current, wires, and transistors; that is
    indeed EE.  (By the way, you're talking to one.)  But when the
    Windows 95 BitBlt standard requires the implementation of all 256
    logical functions of 3 binary inputs at the bit level, there isn't
    much difference between that and a piece of hardware; in fact, in
    many cases I know of, the algorithm ends up split so that a portion
    of it goes to the hardware (typically the 2-input side) while the
    rest goes to software. Still, there are chips where it's all done
    in hardware.

>We are still talking about 9-year olds, not CS majors!

    You have probably heard Seymour Papert and his collaborators talk
    about how much programming they managed to teach to nine-year-olds
    and even younger children. But you're right, by now the debate has
    gone far beyond that!


                                                       _alberto_
From: Matt Austern
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <MATT.95Oct5235502@godzilla.EECS.Berkeley.EDU>
In article <··········@lo-fan.jpl.nasa.gov> Erann Gat <gat> writes:

> There are many things you cannot do without knowing these things.
> If you don't know the difference between static and dynamic RAM and
> how your cache controller works then one day you will write control
> software for an X-ray machine and wonder why it killed someone with
> an overdose because your code is suddenly running three times slower
> than it did on your development system.  (If you think this is an
> improbable scenario then you really need to start reading
> comp.risks.)

I do read comp.risks, and I recognize the scenario under discussion.
Although I question whether an insufficient understanding of the
minutiae of cache controllers was really the problem in that
particular case...

That's really nitpicking, though.  Yes, if you write a certain kind of
embedded system code, then you have to know something about how cache
controllers work.  If you write a slightly different kind of embedded
system code, then you have to know something about I/O ports.  If
you want to write video card device drivers, then you have to know
something about yet another kind of hardware.  Of course some
programmers have to know some things about hardware.

I really don't understand the point, though.  Surely Mr. Gat isn't
suggesting that every programmer has to have a full understanding of
every level of hardware and software, from quantum mechanical band
theory, to the physics of field-effect transistors, to gate circuits,
to CPU architecture, to VLSI layout, and all the rest of the way up
through compiler design and beyond?  I really don't think that anybody
has a deep understanding of more than two or three of those levels,
and I don't think that it's possible to acquire a deep understanding
of all of them in a single lifetime.

At some point, we all have to work with abstractions; the hard part is
making sure that the abstractions you're using are the right ones for
the problem you're working on.  As long as I know enough to understand
the limits of my own knowledge, I'll be happy.
-- 
  Matt Austern                             He showed his lower teeth.  "We 
  ····@physics.berkeley.edu                all have flaws," he said, "and 
  http://dogbert.lbl.gov/~matt             mine is being wicked."
From: Matthias Blume
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <BLUME.95Oct6083734@atomic.cs.princeton.edu>
In article <················@hplgr2.hpl.hp.com> ···@hplgr2.hpl.hp.com (Guillermo (Bill) J. Rozas) writes:

   In article <··················@atomic.cs.princeton.edu> ·····@atomic.cs.princeton.edu (Matthias Blume) writes:

   |   After all, there are purely logical calculi (e.g. Turing machines,
   |   RAMs, lambda-calculus, denotational semantics, ...), which are
   |   adequate to explain the properties of computing and of programming
   |   languages without any need to mention actual hardware.

   Hah?  Turing machines, RAM, lambda-calculus and denotational semantics
   ARE specifications for hardware no less and no more than the ISA
   (instruction set architecture) for any commercial processor.  A
   computer architecture manual is a description of a formal language
   for carrying out computations.  These days such a language is often
   defined formally in terms of some other formal language.

True enough.  Moreover, it doesn't contradict what I said.

--
-Matthias
From: ···@laphroig.mch.sni.de
Subject: Re: Compiler abstractions [was: Wanted: programming language for 9 yr old]
Date: 
Message-ID: <ARE.95Oct11182404@laphroig.mch.sni.de>
Simon Brooke wrote:

>No irrational number can ever be stored or
>manipulated with complete accuracy on a digital computer. That's true, of
>course. It's a mathematical truth, not a physical one. I can demonstrate
>it in a few lines of propositional logic. 

This is simply not true.  You can handle algebraic numbers by
manipulating their minimal polynomials (together with an estimate of
the interval that separates the zeroes).  This is an exact
representation, just as it is for the rational numbers.  Beyond that,
you can handle any constructible real, e.g. as a lazy list of digits
or as a function f : N --> N that takes the required precision as its
argument and gives you the mantissa.  This is not just an
approximation but exact arithmetic, because you are handling the
functions as representations of the numbers.  Look, for example, at
the article by H. Boehm, "Exact Real Arithmetic".  I forgot the
journal but can dig up the reference if you're really interested.
Besides, H. Boehm is also on the net and will be more competent to
answer any further detailed questions you might have.

Andreas Eder