From: ······@gmail.com
Subject: The Importance of Terminology's Quality
Date: 
Message-ID: <f4abdb41-be28-4628-a2ad-7fb6cea6ed65@u12g2000prd.googlegroups.com>
I'd like to introduce a blog post by Stephen Wolfram, on the design
process of Mathematica. In particular, he touches on the importance of
naming of functions.

• Ten Thousand Hours of Design Reviews (2008 Jan 10) by Stephen
Wolfram
 http://blog.wolfram.com/2008/01/10/ten-thousand-hours-of-design-reviews/

The issue is fitting here today, given our recent discussion of “closure”
terminology, as well as the jargon “lisp1 vs lisp2” (multi-meaning
space vs single-meaning space), “tail recursion”, “currying”, and
“lambda”, which perennially crop up here and elsewhere in computer
language forums amid wild misunderstanding and brouhaha.

The functions in Mathematica are usually very well named, in contrast
to those of most other computing languages. In particular, the naming
in Mathematica, as Stephen Wolfram implies in the blog above, takes
the perspective of capturing the essence, or mathematical essence, of
the construct in question (as opposed to naming it according to
convention, which often comes from historical happenstance). When a
thing is well named from the perspective of what it actually
“mathematically” is, rather than its historical development, a vast
amount of potential confusion is avoided.

Let me give a few examples.

• “lambda”, widely used as a keyword in functional languages, is named
simply “Function” in Mathematica. “Lambda” happens to be so called in
the field of symbolic logic only because of the happenstance use of
the Greek letter lambda “λ”. The word does not convey what it means,
while the name “Function” stands for the mathematical concept of
“function” as is.
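
For example, a rough side-by-side sketch (illustrative only):

 ;; lisp: the construct is named after the letter λ
 (lambda (x) (* x x))

 ;; Mathematica: the same thing is simply called Function
 ;; Function[x, x*x]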

• Module and Block in Mathematica correspond to lisp's various “let*”.
Lisp's keyword “let” is based on the English word “let”. That word is
one of the English words with a multitude of meanings. If you look up
its definition in a dictionary, you'll see that it means many
disparate things. One of them, as in “let's go”, has the meaning
“permit; cause to; allow”. This meaning is rather vague in a
mathematical sense. Mathematica's choice of Module and Block is based
on the idea that each builds a self-contained segment of code.
(However, the choice of Block as a keyword isn't perfect either, since
the word also has meanings like “obstruct; jam”.)
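
To illustrate, a minimal sketch:

 ;; lisp: “let x be 5” in the body
 (let ((x 5))
   (* x 2))

 ;; Mathematica: the same idea, named for the self-contained block it builds
 ;; Module[{x = 5}, x*2]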

• Functions that take elements out of a list are variously named
First, Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases...
as opposed to “car”, “cdr”, “filter”, “pop”, “shift”, “unshift” in
lisps, Perl, and other languages.
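
For instance, a quick sketch:

 ;; lisp
 (first '(a b c))  ; => A       ; same as (car '(a b c))
 (rest  '(a b c))  ; => (B C)   ; same as (cdr '(a b c))

 ;; Mathematica: First[{a,b,c}], Rest[{a,b,c}], Part[{a,b,c}, 2], Take[{a,b,c}, 2]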

The above are some examples. The thing to note is that Mathematica's
choices are often such that the word stands for the meaning itself, in
as logical and self-contained a way as possible, without depending on
a particular computer-science context or history. One easy way to
confirm this is to take a keyword and ask a wide audience, people who
don't know the language or are even unfamiliar with computer
programming, to guess what it means. The audience can be made up of
mathematicians, scientists, engineers, programmers, laymen. This
general audience is more likely to guess correctly what Mathematica's
keyword means in the language than the name used in other computer
languages, whose naming choices go by convention or context.

(For example, Perl's naming relies heavily on unix culture (grep,
pipe, hash...), while functional languages' namings are typically
heavily based on the field of mathematical logic (e.g. lambda,
currying, closure, monad, ...). Lisp's cons, car, cdr are based on
computer hardware (this particular naming has caused major damage to
the lisp language to this day). Other examples: pop and shift are
based on the computer-science jargon of the “stack”. Grep is from
Global Regular Expression Print, while Regular Expression is from the
theoretical computer science of automata... The name regex has done
major hidden damage to the computing industry, in the sense that if it
had just been called “string patterns”, a lot of explanation,
literature, and confusion would have been avoided.)

(Note: keywords and functions in Mathematica are not necessarily
always best named, nor is there always one absolute best choice, as
there are many other considerations, such as the force of wide
existing convention, the context in which the function is used,
brevity, limitations of the English language, different scientific
contexts (e.g. math, physics, engineering), or even human preference.)

----------------------------

I've written about many of the issues regarding the importance and
effects of terminology's quality since about 2000. Here are the
relevant essays:

• Jargons of Info Tech Industry
 http://xahlee.org/UnixResource_dir/writ/jargons.html

• The Jargon “Lisp1” vs “Lisp2”
 http://xahlee.org/emacs/lisp1_vs_lisp2.html

• The Term Currying In Computer Science
 http://xahlee.org/UnixResource_dir/writ/currying.html

• What Is Closure In A Programing Language
 http://xahlee.org/UnixResource_dir/writ/closure.html

• What are OOP's Jargons and Complexities
 http://xahlee.org/Periodic_dosage_dir/t2/oop.html

• Sun Microsystem's abuse of term “API” and “Interface”
 http://xahlee.org/java-a-day/interface.html

• Math Terminology and Naming of Things
 http://xahlee.org/cmaci/notation/math_namings.html

  Xah
  ···@xahlee.org
∑ http://xahlee.org/

☄

From: Bruce C. Baker
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <5irUj.71335$y05.1766@newsfe22.lga>
<······@gmail.com> wrote in message 
·········································@u12g2000prd.googlegroups.com...

[...]

(for example, Perl's naming heavily relies on unix culture (grep,
pipe, hash...), ...

"hash" + "pipe"? Ahhhhh, /no wonder/ Perl is the syntactic mishmash it is! 
;-)
From: Kyle McGivney
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <e04ba51f-30a0-44fa-aade-c2fb6c99f944@26g2000hsk.googlegroups.com>
> • Module, Block, in Mathematica is in lisp's various “let*”. The
> lisp's keywords “let”, is based on the English word “let”. That word
> is one of the English word with multitudes of meanings. If you look up
> its definition in a dictionary, you'll see that it means many
> disparate things. One of them, as in “let's go”, has the meaning of
> “permit; to cause to; allow”. This meaning is rather vague from a
> mathematical sense. Mathematica's choice of Module, Block, is based on
> the idea that it builds a self-contained segment of code. (however,
> the choice of Block as keyword here isn't perfect, since the word also
> has meanings like “obstruct; jam”)

If the purpose of let is to introduce one or more variable bindings,
then I don't see how changing to block or module would improve
anything. I've always found it fairly intuitive to parse (let ((x
5)) ...) as "let x be five". Additionally, replacing let with the
synonyms you provided would approximately yield "permit x to be five"
or "allow x to be five". In my mind you have constructed an argument
in favor of let here (obviously it's better than block, because
nobody's going to come along and be confused about whether let will
"obstruct" or "jam" them :-).

There are many easy targets to poke at in the CL spec. let isn't one
of those.
From: John Thingstad
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <op.uatsrmq2ut4oq5@pandora.alfanett.no>
On Thu, 08 May 2008 04:14:35 +0200, Kyle McGivney  
<·······@gmail.com> wrote:

>> • Module, Block, in Mathematica is in lisp's various “let*”. The
>> lisp's keywords “let”, is based on the English word “let”. That word
>> is one of the English word with multitudes of meanings. If you look up
>> its definition in a dictionary, you'll see that it means many
>> disparate things. One of them, as in “let's go”, has the meaning of
>> “permit; to cause to; allow”. This meaning is rather vague from a
>> mathematical sense. Mathematica's choice of Module, Block, is based on
>> the idea that it builds a self-contained segment of code. (however,
>> the choice of Block as keyword here isn't perfect, since the word also
>> has meanings like “obstruct; jam”)
>
> If the purpose of let is to introduce one or more variable bindings,
> then I don't see how changing to block or module would improve
> anything. I've always found it fairly intuitive to parse (let ((x
> 5)) ...) to "let x be five". Additionally, replacing let with the
> synonyms you provided would approximately yield "permit x to be five"
> or "allow x to be five". In my mind you have constructed an argument
> in favor of let here (obviously it's better than block, because
> nobody's going to come along and be confused about whether let will
> "obstruct" or "jam" them :)

How about bind?
  (bind ((v f (mod i)) ((a b) list) (t (rem q)))

1. is a multiple-value-bind
2. is a destructuring-bind
3. is a let

http://common-lisp.net/project/metabang-bind/

To me this is an example of where the ANSI group could have spent more time  
on naming.
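
For what it's worth, a small sketch of how metabang-bind reads (syntax
roughly as in its documentation; illustrative only):

  (bind (((:values q r) (floor 17 5))   ; like MULTIPLE-VALUE-BIND
         ((a b) '(1 2))                 ; like DESTRUCTURING-BIND
         (c 3))                         ; like LET
    (list q r a b c))                   ; => (3 2 1 2 3)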

--------------
John Thingstad
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <lmu424djdnfsb6nbqjgo4kebrpmn7vm4p9@4ax.com>
On Wed, 7 May 2008 16:13:36 -0700 (PDT), ·······@gmail.com"
<······@gmail.com> wrote:

>I'd like to introduce a blog post by Stephen Wolfram, on the design
>process of Mathematica. In particular, he touches on the importance of
>naming of functions.
>
>• Ten Thousand Hours of Design Reviews (2008 Jan 10) by Stephen
>Wolfram
> http://blog.wolfram.com/2008/01/10/ten-thousand-hours-of-design-reviews/
>
>The issue is fitting here today, in our discussion of “closure”
>terminology recently, as well the jargons “lisp 1 vs lisp2” (multi-
>meaning space vs single-meaning space), “tail recursion”, “currying”,
>“lambda”, that perennially crop up here and elsewhere in computer
>language forums in wild misunderstanding and brouhaha.
>
>The functions in Mathematica, are usually very well-name, in contrast
>to most other computing languages. In particular, the naming in
>Mathematica, as Stephen Wolfram implied in his blog above, takes the
>perspective of naming by capturing the essense, or mathematical
>essence, of the keyword in question. (as opposed to, naming it
>according to convention, which often came from historical happenings)
>When a thing is well-named from the perspective of what it actually
>“mathematically” is, as opposed to historical developments, it avoids
>vast amount of potential confusion.
>
>Let me give a few example.
>
>• “lambda”, widely used as a keyword in functional languages, is named
>just “Function” in Mathematica. The “lambda” happend to be called so
>in the field of symbolic logic, is due to use of the greek letter
>lambda “λ” by happenstance. The word does not convey what it means.
>While, the name “Function”, stands for the mathematical concept of
>“function” as is.

Lambda is not a function - it is a function constructor.   A better
name for it might be MAKE-FUNCTION.  

I (and probably anyone else you might ask) will agree that the term
"lambda" is not indicative of its meaning, but its meaning is not
synonymous with "function" as you suggest.

I suspect Mathematica of just following historical convention itself.
Mathematica uses the term inappropriately just as it was (ab)used in
Pascal (circa 1970).  I'm not aware of earlier (ab)uses but there
probably were some.


>• Module, Block, in Mathematica is in lisp's various “let*”. The
>lisp's keywords “let”, is based on the English word “let”. That word
>is one of the English word with multitudes of meanings. If you look up
>its definition in a dictionary, you'll see that it means many
>disparate things. One of them, as in “let's go”, has the meaning of
>“permit; to cause to; allow”. This meaning is rather vague from a
>mathematical sense. Mathematica's choice of Module, Block, is based on
>the idea that it builds a self-contained segment of code. (however,
>the choice of Block as keyword here isn't perfect, since the word also
>has meanings like “obstruct; jam”)

"Let" is the preferred mathematical term for introducing a variable.
Lisp uses it in that meaning.  What a word means or doesn't in English
is not particularly relevant to its use in another language.  There
are many instances of two human languages using identical looking
words with very different meanings.  Why should computer languages be
immune?


>• Functions that takes elements out of list are variously named First,
>Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases... as
>opposed to “car”, “cdr”, “filter”, “filter”, “pop”, “shift”,
>“unshift”, in lisps and perl and other langs.

Lisp has "first" and "rest" - which are just synonyms for "car" and
"cdr".  Older programmers typically prefer car and cdr for historical
reasons, but few object to the use of first and rest except for
semantic reasons - Lisp does not have a list data type; lists are
aggregates constructed from a primitive pair data type.  Pairs can be
used to construct trees as well as lists and "rest" has little meaning
for a tree.  When used with lists, first and rest are meaningful terms
and no one will object to them.

Besides which, you can easily create synonyms for car and cdr (and
virtually any other Lisp function) with no more burden on the reader
of your code than using a C macro.  You can call them "first and
rest", or "first and second", or "left and right", or "red and black"
or whatever else makes sense for your data.
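
A trivial sketch of what such synonyms look like (the names here are
made up for illustration):

  (defun left  (node) (car node))   ; alias for CAR
  (defun right (node) (cdr node))   ; alias for CDR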

People coming to Lisp from other languages often complain of macros
that they have to learn "a new language" every time they read a
program.  But in fact, the same is true in all languages - the reader
always has to learn the user-defined functions and how they are used
to make sense of the code.  In that sense Lisp is no different from
any other language.

Common Lisp doesn't have "filter".  Even so, with respect to the
merits of calling a function "extract" or "select" versus "filter", I
think that's just a matter of familiarity.  The term "filter" conveys
a more general idea than the others and can, by parameterization,
perform either function.
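
For instance, a one-line sketch (FILTER here is just a hypothetical
wrapper name, not a standard function):

  (defun filter (predicate list)
    (remove-if-not predicate list))

  ;; (filter #'oddp '(1 2 3 4 5)) => (1 3 5)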


>The above are some examples. The thing to note is that, Mathematica's
>choices are often such that the word stands for the meaning themselves
>in some logical and independent way as much as possible, without
>having dependent on a particular computer science field's context or
>history. One easy way to confirm this, is taking a keyword and ask a
>wide audience, who doesn't know about the language or even unfamiliar
>of computer programing, to guess what it means. The wide audience can
>be made up of mathematicians, scientists, engineers, programers,
>laymen. This general audience, are more likely to guess correctly what
>Mathematica's keyword is meant in the language, than the the name used
>in other computer languages who's naming choices goes by convention or
>context.
>
>(for example, Perl's naming heavily relies on unix culture (grep,
>pipe, hash...), while functional lang's namings are typically heavily
>based on the field of mathematical logic (e.g. lambda, currying,
>closure, monad, ...). Lisp's cons, car, cdr, are based on computer
>hardware (this particular naming, caused a major damage to the lisp
>language to this day). (Other examples: pop, shift are based on
>computer science jargon of “stack”. Grep is from Global Regular
>Expression Print, while Regular Expression is from theoretical
>computer science of Automata... The name regex has done major hidden
>damage to the computing industry, in the sense that if it have just
>called it “string patterns”, then a lot explanations, literatures,
>confusions, would have been avoided.))
>
>(Note: Keywords or functions in Mathematica are not necessarily always
>best named. Nor are there always one absolute choice as best, as there
>are many other considerations, such as the force of wide existing
>convention, the context where the function are used, brevity,
>limitations of English language, different scientific context (e.g.
>math, physics, engineering), or even human preferences.)
>
>----------------------------
>
>Many of the issues regarding the importance and effects of
>terminology's quality, i've wrote about since about 2000. Here are the
>relevant essays:
>
>• Jargons of Info Tech Industry
> http://xahlee.org/UnixResource_dir/writ/jargons.html
>
>• The Jargon “Lisp1” vs “Lisp2”
> http://xahlee.org/emacs/lisp1_vs_lisp2.html
>
>• The Term Curring In Computer Science
> http://xahlee.org/UnixResource_dir/writ/currying.html
>
>• What Is Closure In A Programing Language
> http://xahlee.org/UnixResource_dir/writ/closure.html
>
>• What are OOP's Jargons and Complexities
> http://xahlee.org/Periodic_dosage_dir/t2/oop.html
>
>• Sun Microsystem's abuse of term “API” and “Interface”
> http://xahlee.org/java-a-day/interface.html
>
>• Math Terminology and Naming of Things
> http://xahlee.org/cmaci/notation/math_namings.html
>
>  Xah
>  ···@xahlee.org
>∑ http://xahlee.org/
>
>☄

George
--
for email reply remove "/" from address
From: Jürgen Exner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <fm55241ivmal876v6nrnrh4s56t3jl3dtb@4ax.com>
George Neuner <·········@/comcast.net> wrote:
>On Wed, 7 May 2008 16:13:36 -0700 (PDT), ·······@gmail.com"
><······@gmail.com> wrote:

         +-------------------+             .:\:\:/:/:.
         |   PLEASE DO NOT   |            :.:\:\:/:/:.:
         |  FEED THE TROLLS  |           :=.' -   - '.=:
         |                   |           '=(\ 9   9 /)='
         |   Thank you,      |              (  (_)  )
         |       Management  |              /`-vvv-'\
         +-------------------+             /         \
                 |  |        @@@          / /|,,,,,|\ \
                 |  |        @@@         /_//  /^\  \\_\
   @·@@·@        |  |         |/         WW(  (   )  )WW
   \||||/        |  |        \|           __\,,\ /,,/__
    \||/         |  |         |      jgs (______Y______)
/\/\/\/\/\/\/\/\//\/\\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
==============================================================

jue
From: ·······@eurogaran.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <b8637c52-fad4-4b96-a100-510e0228aa3f@y38g2000hsy.googlegroups.com>
>          |   PLEASE DO NOT   |            :.:\:\:/:/:.:
>          |  FEED THE TROLLS  |           :=.' -   - '.=:

I don't think Xah is trolling here (contrary to his/her habit)
but posing an interesting matter of discussion.

I don't know to what extent it fits, but I would like to make a rather
novel comment on operator naming:
As a non-native English speaker, the first time I ever encountered the
word "if" was when learning to program. The same can be said of the
other words (for, then, else...). This caused my brain to ascribe to
them meanings completely outside the context of everyday language. My
point is that perhaps this is advantageous. So, contrary to tradition
(which considers it a desirable goal to write programs as close as
possible to everyday English), I found it convenient that programming
languages use words different from the words of my native tongue. I
suspect that is why car and cdr have caught on vs. first and rest.
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <oaGdnQFiu-v7Rr_VnZ2dnUVZ_qrinZ2d@comcast.com>
·······@eurogaran.com wrote:
>>          |   PLEASE DO NOT   |            :.:\:\:/:/:.:
>>          |  FEED THE TROLLS  |           :=.' -   - '.=:
> 
> I don't think Xah is trolling here (contrary to his/her habit)
> but posing an interesting matter of discussion.

Interesting is in the eye of the beholder.  After you've read the same recycled 
crud from certain posters again and again, it becomes trollish spam.

-- 
Lew
From: Sherman Pendley
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <m17ie4ltos.fsf@dot-app.org>
·······@eurogaran.com writes:

>>          |   PLEASE DO NOT   |            :.:\:\:/:/:.:
>>          |  FEED THE TROLLS  |           :=.' -   - '.=:
>
> I don't think Xah is trolling here (contrary to his/her habit)
> but posing an interesting matter of discussion.

It might be interesting in the abstract, but any such discussion, when
cross-posted to multiple language groups on usenet, will inevitably
devolve into a flamewar as proponents of the various languages argue
about which language better expresses the ideas being talked about.
It's like a law of usenet or something.

If Xah wanted an interesting discussion, he could have posted this to
one language-neutral group such as comp.programming. He doesn't want
that - he wants the multi-group flamefest.

sherm--

-- 
My blog: http://shermspace.blogspot.com
Cocoa programming in Perl: http://camelbones.sourceforge.net
From: Waylen Gumbal
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <68i6b5F2soauaU1@mid.individual.net>
Sherman Pendley wrote:
> ·······@eurogaran.com writes:
> >
> > > PLEASE DO NOT | :.:\:\:/:/:.:
> > > FEED THE TROLLS | :=.' - - '.=:
> >
> > I don't think Xah is trolling here (contrary to his/her habit)
> > but posing an interesting matter of discussion.
>
> It might be interesting in the abstract, but any such discussion, when
> cross-posted to multiple language groups on usenet, will inevitably
> devolve into a flamewar as proponents of the various languages argue
> about which language better expresses the ideas being talked about.
> It's like a law of usenet or something.
>
> If Xah wanted an interesting discussion, he could have posted this to
> one language-neutral group such as comp.programming. He doesn't want
> that - he wants the multi-group flamefest.

Not everyone follows language-neutral groups (such as comp.programming, 
as you pointed out), so you actually reach more people by cross-posting. 
This is what I don't understand - everyone seems to assume that by cross-
posting one intends to start a "flamefest", when in fact most such 
"flamefests" are started by those who cannot bring themselves to skip 
over a topic that they so dislike.

-- 
wg 
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <fISdnYstu5R9eb7VnZ2dnUVZ_o_inZ2d@comcast.com>
Waylen Gumbal wrote:
> Not everyone follows language-neutral groups (such as comp,programming 
> as you pointed out), so you actually reach more people by cross posting. 
> This is what I don't understand - everyone seems to assume that by cross 
> posting, one intends on start a "flamefest", when in fact most such 
> "flamefests" are started by those who cannot bring themselves to 
> skipping over the topic that they so dislike.

It's not an assumption in Xah Lee's case.  He spams newsgroups irregularly 
with rehashed essays from years ago, and a number of people are just tired of 
him.  Don't blame the victims for the perpetrator's actions, OK?

-- 
Lew
From: Waylen Gumbal
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <68ia3oF2t5308U1@mid.individual.net>
Lew wrote:
> Waylen Gumbal wrote:
>> Not everyone follows language-neutral groups (such as
>> comp,programming as you pointed out), so you actually reach more
>> people by cross posting. This is what I don't understand - everyone
>> seems to assume that by cross posting, one intends on start a
>> "flamefest", when in fact most such "flamefests" are started by
>> those who cannot bring themselves to skipping over the topic that
>> they so dislike.
>
> It's not an assumption in Xah Lee's case.  He spams newsgroups
> irregularly with rehashed essays from years ago, and a number of
> people are just tired of him.

I did not know this. One should obviously not do that.

> Don't blame the victims for the perpetrator's actions, OK?

I'm not blaming any "victims", but I don't see anyone saying "read this 
or else", so why not just skip the thread or toss the OP in your 
killfile so you don't see his postings. If others want to discuss his 
topics, who are you or I to tell them not to?

-- 
wg 
From: Jürgen Exner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <7ft7249u3a7vmgm50pi7sfcp7m8fevoq2m@4ax.com>
"Waylen Gumbal" <·······@gmail.com> wrote:
> so why not just skip the thread or toss the OP in your 
>killfile so you don't see his postings. 

Done years ago.

>If others want to discuss his 
>topics, who are you or I to tell them not to?

They are very welcome to do so in an appropriate NG for those topics.

jue
From: Jürgen Exner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <9vo724prqd0etqlraapsv8e3j6bcigh5ea@4ax.com>
"Waylen Gumbal" <·······@gmail.com> wrote:
>Sherman Pendley wrote:
>> ·······@eurogaran.com writes:
>> >
>> > > PLEASE DO NOT | :.:\:\:/:/:.:
>> > > FEED THE TROLLS | :=.' - - '.=:
>Not everyone follows language-neutral groups (such as comp,programming 
>as you pointed out), so you actually reach more people by cross posting. 

You seem to have failed to grasp the concept of why Usenet is divided
into separate groups in the first place.

>This is what I don't understand - everyone seems to assume that by cross 
>posting, one intends on start a "flamefest", when in fact most such 
>"flamefests" are started by those who cannot bring themselves to 
>skipping over the topic that they so dislike.

By your argument there is no need for individual groups in the first
place. We could just as well use a single "HereGoesEverything" and just
skip over those topics that we so dislike.

jue
From: Waylen Gumbal
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <68iacjF2sq5gcU1@mid.individual.net>
Jürgen Exner wrote:
> "Waylen Gumbal" <·······@gmail.com> wrote:
> > Sherman Pendley wrote:
> > > ·······@eurogaran.com writes:
> > > >
> > > > > PLEASE DO NOT | :.:\:\:/:/:.:
> > > > > FEED THE TROLLS | :=.' - - '.=:
> > Not everyone follows language-neutral groups (such as
> > comp,programming as you pointed out), so you actually reach more
> > people by cross posting.
>
> You seem so have failed to grasp the concept of why Usenet is divided
> into separate groups in the first place.

No, not really. You keep group-specific content in the applicable group 
or groups only. But are there not times when content overlaps the 
topics of multiple groups, and, to get maximum feedback, you post to all 
applicable groups (the keyword being "applicable")?

-- 
wg 
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <eth9249j65rkqhiuok07tbpisl07o569n9@4ax.com>
On Thu, 8 May 2008 22:38:44 -0700, "Waylen Gumbal" <·······@gmail.com>
wrote:

>Sherman Pendley wrote:
>> ·······@eurogaran.com writes:
>> >
>> > > PLEASE DO NOT | :.:\:\:/:/:.:
>> > > FEED THE TROLLS | :=.' - - '.=:
>> >
>> > I don't think Xah is trolling here (contrary to his/her habit)
>> > but posing an interesting matter of discussion.
>>
>> It might be interesting in the abstract, but any such discussion, when
>> cross-posted to multiple language groups on usenet, will inevitably
>> devolve into a flamewar as proponents of the various languages argue
>> about which language better expresses the ideas being talked about.
>> It's like a law of usenet or something.
>>
>> If Xah wanted an interesting discussion, he could have posted this to
>> one language-neutral group such as comp.programming. He doesn't want
>> that - he wants the multi-group flamefest.
>
>Not everyone follows language-neutral groups (such as comp,programming 
>as you pointed out), so you actually reach more people by cross posting. 
>This is what I don't understand - everyone seems to assume that by cross 
>posting, one intends on start a "flamefest", when in fact most such 
>"flamefests" are started by those who cannot bring themselves to 
>skipping over the topic that they so dislike.

The problem is that many initial posts have topics that are misleading
or simplistic.  Often an interesting discussion can start on some
point the initial poster never considered or meant to raise.

George
--
for email reply remove "/" from address
From: Waylen Gumbal
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <68l2piF2sv620U1@mid.individual.net>
George Neuner wrote:
> On Thu, 8 May 2008 22:38:44 -0700, "Waylen Gumbal" <·······@gmail.com>
> wrote:



> > Not everyone follows language-neutral groups (such as
> > comp,programming as you pointed out), so you actually reach more
> > people by cross posting. This is what I don't understand - everyone
> > seems to assume that by cross posting, one intends on start a
> > "flamefest", when in fact most such "flamefests" are started by
> > those who cannot bring themselves to skipping over the topic that
> > they so dislike.
>
> The problem is that many initial posts have topics that are misleading
> or simplistic.  Often an interesting discussion can start on some
> point the initial poster never considered or meant to raise.

Is this not a possibility for any topic, whether it's cross-posted or 
not?


-- 
wg 
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <IfSdnTVIiu0SCbjVnZ2dnUVZ_qrinZ2d@comcast.com>
Waylen Gumbal wrote:
> George Neuner wrote:
>> On Thu, 8 May 2008 22:38:44 -0700, "Waylen Gumbal" <·······@gmail.com>
>> wrote:
> 
> 
> 
>>> Not everyone follows language-neutral groups (such as
>>> comp,programming as you pointed out), so you actually reach more
>>> people by cross posting. This is what I don't understand - everyone
>>> seems to assume that by cross posting, one intends on start a
>>> "flamefest", when in fact most such "flamefests" are started by
>>> those who cannot bring themselves to skipping over the topic that
>>> they so dislike.
>> The problem is that many initial posts have topics that are misleading
>> or simplistic.  Often an interesting discussion can start on some
>> point the initial poster never considered or meant to raise.
> 
> Is this not a possibility for any topic, whether it's cross-posted or 
> not?

You guys are off topic.  None of the million groups to which this message was 
posted are about netiquette.

-- 
Lew
From: Sherman Pendley
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <m1k5i2rupm.fsf@dot-app.org>
Lew <···@lewscanon.com> writes:

> You guys are off topic.  None of the million groups to which this
> message was posted are about netiquette.

Netiquette has come up at one point or another in pretty much every
group I've ever read. It's pretty much a universal meta-topic.

sherm--

-- 
My blog: http://shermspace.blogspot.com
Cocoa programming in Perl: http://camelbones.sourceforge.net
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <UNadnTKhW71LiLvVnZ2dnUVZ_uSdnZ2d@comcast.com>
Sherman Pendley wrote:
> Lew <···@lewscanon.com> writes:
> 
>> You guys are off topic.  None of the million groups to which this
>> message was posted are about netiquette.
> 
> Netiquette has come up at one point or another in pretty much every
> group I've ever read. It's pretty much a universal meta-topic.

Good.  Then please have the courtesy not to include comp.lang.java.programmer 
in this thread's distribution any longer.

-- 
Lew
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <UNadnSyhW72Ci7vVnZ2dnUVZ_uSdnZ2d@comcast.com>
Sherman Pendley wrote:
> Lew <···@lewscanon.com> writes:
> 
>> You guys are off topic.  None of the million groups to which this
>> message was posted are about netiquette.
> 
> Netiquette has come up at one point or another in pretty much every
> group I've ever read. It's pretty much a universal meta-topic.

Good, then please have the courtesy not to include comp.lang.java.programmer 
in the distribution for this thread any longer.

-- 
Lew
From: David Combs
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1nhcc$d5e$1@reader2.panix.com>
In article <···············@mid.individual.net>,
Waylen Gumbal <·······@gmail.com> wrote:
>Sherman Pendley wrote:
>> ·······@eurogaran.com writes:
>> >
>> > > PLEASE DO NOT | :.:\:\:/:/:.:
>> > > FEED THE TROLLS | :=.' - - '.=:
>> >
>> > I don't think Xah is trolling here (contrary to his/her habit)
>> > but posing an interesting matter of discussion.
>>
>> It might be interesting in the abstract, but any such discussion, when
>> cross-posted to multiple language groups on usenet, will inevitably
>> devolve into a flamewar as proponents of the various languages argue
>> about which language better expresses the ideas being talked about.
>> It's like a law of usenet or something.
>>
>> If Xah wanted an interesting discussion, he could have posted this to
>> one language-neutral group such as comp.programming. He doesn't want
>> that - he wants the multi-group flamefest.
>
>Not everyone follows language-neutral groups (such as comp,programming 
>as you pointed out), so you actually reach more people by cross posting. 
>This is what I don't understand - everyone seems to assume that by cross 
>posting, one intends on start a "flamefest", when in fact most such 
>"flamefests" are started by those who cannot bring themselves to 
>skipping over the topic that they so dislike.
>
>-- 
>wg 

Not one person on the planet agrees with me, I believe, but
it has always seemed to me that an *advantage* of posting to
multiple groups (especially ones generally "interested" in similar
subject matter but NOT subject to huge poster/lurker/answerer overlap,
i.e. without too many *people* getting multiple copies of the *same*
post) is that it would provide an opportunity for a widely-dispersed
bunch of people to have a *joint* discussion, with comments hopefully
coming in from a *variety* of viewpoints.

David
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <cMidnY3lU-7Lh7jVnZ2dnUVZ_qqgnZ2d@speakeasy.net>
George Neuner <·········@/comcast.net> wrote:
+---------------
| Common Lisp doesn't have "filter".
+---------------

Of course it does! It just spells it REMOVE-IF-NOT!!  ;-}  ;-}

    > (remove-if-not #'oddp (iota 10))

    (1 3 5 7 9)
    > (remove-if-not (lambda (x) (> x 4)) (iota 10))

    (5 6 7 8 9)
    > 
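
(IOTA above isn't standard Common Lisp, by the way -- it's the usual
list-of-successive-integers utility, e.g. the one in the Alexandria
library.)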


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <3jgc2498f1te1uqh7rrf3odhevkrt3s75e@4ax.com>
On Fri, 09 May 2008 22:45:26 -0500, ····@rpw3.org (Rob Warnock) wrote:

>George Neuner <·········@/comcast.net> wrote:
>
>>On Wed, 7 May 2008 16:13:36 -0700 (PDT), ·······@gmail.com"
>><······@gmail.com> wrote:
>
>>>• Functions [in Mathematica] that takes elements out of list
>>>are variously named First, Rest, Last, Extract, Part, Take, 
>>>Select, Cases, DeleteCases... as opposed to “car”, “cdr”, 
>>>“filter”, “filter”, “pop”, “shift”, “unshift”, in lisps and
>>>perl and other langs.
>
>>| Common Lisp doesn't have "filter".
>
>Of course it does! It just spells it REMOVE-IF-NOT!!  ;-}  ;-}

I know.  You snipped the text I replied to.  

Xah carelessly conflated functions snatched from various languages in
an attempt to make some point about intuitive naming.  If he objects
to naming a function "filter", you can just imagine what he'd have to
say about remove-if[-not].

George
--
for email reply remove "/" from address
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008may08-005@yahoo.com>
> From: ·······@gmail.com" <······@gmail.com>
> the importance of naming of functions.

I agree, that's a useful consideration in the design of any system
based on keywords, such as names of functions or operators.
(I'm not using "keyword" in the sense of a symbol in the KEYWORD package.)

> the naming in Mathematica, as Stephen Wolfram implied in his blog
> above, takes the perspective of naming by capturing the essense, or
> mathematical essence, of the keyword in question.

For concepts adapted from mathematics, that naming meta-convention makes sense.
For concepts adapted from data-processing, it's not so clearly good.

> lambda, widely used as a keyword in functional languages, is
> named just Function in Mathematica.

That makes sense, but then what does Mathematica call the special
operator which Common Lisp calls Function? Or is that not provided,
and the programmer must case-by-case handcode what they really mean?
(function foo) == (symbol-function 'foo) ;so FUNCTION not really needed there
(function (lambda (x) (+ x y))) ;y is a lexical variable to be encapsulated
                               ; into the resultant closure
How would you do something like that in mathematica?

> Module, Block, in Mathematica is in lisp's various let*
> The lisp's keywords let, is based on the English word let.

No, that's not correct. It's based on the mathematical use of "let":
Let R be the complete ordered field of real numbers.
Let S be the subset of R consisting of numbers which satisfy
  equations expressed in transcendental functions with rational
  parameters.
Prove that S is dense in R.
(Yeah, the question is trivial, but I'm just showing how "let" might be used.)

I see nothing wrong with LET used in that way in Lisp.
Bind would be good too.
I don't like Module or Block at all, because those words convey
nothing about binding some temporary local name to represent some
globally-definable concept, and they actually mislead because
module can mean either something like a vector space over a ring,
or a set of functions/methods that serve some small problem domain
within a larger library of functions/methods, or a set of lectures
on a single topic within a curriculum, while Block sounds more like
just some random consecutive interval of statements, having
nothing specific to do with bindings. TAGBODY or PROGN would perhaps
be more properly called Block.

> One easy way to confirm this, is taking a keyword and ask a wide
> audience, who doesn't know about the language or even unfamiliar of
> computer programing, to guess what it means.

That is a biased question because the only options are the words
you choose in advance. Better would be to reverse the question: Ask
random people on the street what they would like to call these
concepts:
1 You set up a temporary relation between a name and some datum,
   such that all references to that name give you that same datum.
   For example you might set up the name 'n' to temporarily mean 5.
   Fill in the missing word: (word (n 5) <say "ouch" n times>)
2 You set up a rule by which one step of data-processing is performed,
   taking input to produce output. For example you set up a rule by
   which the input value is divided by two if it's even but
   multiplied by three and then 1 added if it was odd.
   Fill in the missing word: (word (n) <if n even n/2 else 3*n+1>)
3 You set up a rule as above, but also give it a permanent name.
   Fill in the missing word: (word collatz (n) <if n even n/2 else 3*n+1>)
4 You already have a rule that takes two input data and produces one output.
     (oldword foo (x y) <absolute value of difference between x and y>)
   You now know one of the two inputs, and that info won't change for a while,
   and you hate to keep repeating it. So you want to define a new rule
   that has that one known parameter built-in so that you only have
   to say the *other* parameter each time you use the new rule.
   Fill in the missing newword: (newword foo <x is fixed as 3>)
So I'd choose words: 1=let 2=lambda 3=defun 4=curry.
What words would a typical man on the street choose instead?
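
For item 4, the usual lisp sketch -- assuming the two-argument rule FOO
above -- would be something like:

   (defun curry (fn &rest fixed)
     ;; return a new rule with the FIXED arguments built in
     (lambda (&rest more) (apply fn (append fixed more))))

   ;; (funcall (curry #'foo 3) 10) => 7   ; i.e. |3 - 10|

whatever the man on the street would end up calling it.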

> The name regex has done major hidden damage to the computing industry

I have my own gripe against regular expressions ("regex" for short).
I hate the extremely terse format, with horrible concoctions of
escape and anti-escape magic characters, to fit within Unix's
255-character limit on command lines, compared to the nicely nested
s-expression notation that would be possible if regex hadn't
entrenched itself so solidly.
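
For what it's worth, CL-PPCRE already accepts roughly such a nested
notation as a "parse tree". A sketch, from memory of its documentation,
of the s-expression form of "^[0-9]+":

   (cl-ppcre:scan '(:sequence :start-anchor
                    (:greedy-repetition 1 nil :digit-class))
                  "123abc")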
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <oaGdnQBiu-stRr_VnZ2dnUVZ_qrinZ2d@comcast.com>
Robert Maas, http://tinyurl.com/uh3t wrote:
> I have my own gripe against regular expressions ("regex" for short).
> I hate the extremely terse format, with horrible concoctions of
> escape and anti-escape magic characters, to fit within Unix's
> 255-character limit on command lines, compared to a nicely
> s-expression nested notation that would be possible if regex hadn't
> entrenched itself so solidly.

This is all very interesting, but not Java.

-- 
Lew
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <l097249qdplrlq17jctfthbooqdlgol85e@4ax.com>
On Thu, 08 May 2008 03:25:54 -0700,
···················@SpamGourmet.Com (Robert Maas,
http://tinyurl.com/uh3t) wrote:

>> From: ·······@gmail.com" <······@gmail.com>
>> the importance of naming of functions.
>
>> ... [take] a keyword and ask a wide
>> audience, who doesn't know about the language or even unfamiliar of
>> computer programing, to guess what it means.

This is a dumb idea ...

>Better would be to reverse the question: Ask
>random people on the street what they would like to call these
>concepts:

... and this one is even dumber.

Terms don't exist in a vacuum - they exist to facilitate communication
within a particular knowledge or skill domain.  For example, English
is only meaningful to those who speak English.  The opinions of random
people who have no relevant domain knowledge are worthless.

Such a survey could only be meaningful if the survey population
already possessed some knowledge of programming, but were not already
aware of the particular terminology being surveyed.

George
--
for email reply remove "/" from address
From: David Combs
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1njc5$p1i$1@reader2.panix.com>
In article <·················@yahoo.com>,
Robert Maas, http://tinyurl.com/uh3t <···················@SpamGourmet.Com> wrote:
>> From: ·······@gmail.com" <······@gmail.com>
>> the importance of naming of functions.
>

Lisp is *so* early a language (1960?), preceded mainly only by Fortran (1957?),
and was far and away the first platform for *so many* concepts
of computer science, e.g. lexical vs dynamic ("special") variables, passing
*unnamed* functions as args (could Algol 60 also do something like that,
via something it maybe termed a "thunk"?), and is maybe still the only one
in which program and data have the same representation -- that it'd
seem logical to use its terminology in all languages.

From C comes the very nice distinction between "formal" and "actual" args.

And from Algol 60, own and local -- own sure beats "static"!

And so on.


To me, it's too bad that that hacker-supreme (and certified genius)
Larry W. likes to make up his own terminology for Perl.  Sure makes
for a lot of otherwise-unnecessary pages in the various Perl texts,
as well as posts here.

Of course, a whole lot better his terminology than no language at all!


David
From: John Thingstad
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <op.ubzgobgsut4oq5@pandora.alfanett.no>
On Fri, 30 May 2008 02:56:37 +0200, David Combs <·······@panix.com> wrote:

> In article <·················@yahoo.com>,
> Robert Maas, http://tinyurl.com/uh3t  
> <···················@SpamGourmet.Com> wrote:
>>> From: ·······@gmail.com" <······@gmail.com>
>>> the importance of naming of functions.
>>
>
> Lisp is *so* early a language (1960?), preceeded mainly only by Fortran  
> (1957?)?,
> and for sure the far-and-away the first as a platform for *so many*  
> concepts
> of computer-science, eg lexical vs dynamic ("special") variables, passing
> *unnamed* functions as args (could Algol 60 also do something like that,
> via something it maybe termed a "thunk"), maybe is still the only one
> in which program and data have the same representation -- that it'd
> seem logical to use it's terminology in all languages.
>
> From C is the very nice distinction between "formal" and "actual" args.
>
> And from algol-60, own and local -- own sure beats "static"!
>
> And so on.
>
>
> To me, it's too bad that that hacker-supreme (and certified genius)
> Larry W. likes to make up his own terminology for Perl.  Sure makes
> for a lot of otherwise-unnecessary pages in the various Perl texts,
> as well as posts here.
>
> Of course, a whole lot better his terminology than no language at all!
>
>
> David
>
>

Perl is solidly based in the UNIX world on awk, sed, bash and C.
I don't like the style, but many do.

--------------
John Thingstad
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <v-2dnSYQWMbyFN3VnZ2dnUVZ_rmdnZ2d@comcast.com>
John Thingstad wrote:
> Perl is solidly based in the UNIX world on awk, sed, bash and C.
> I don't like the style, but many do.

Please exclude the Java newsgroups from this discussion.

-- 
Lew
From: Gordon Etly
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <6abgirF358rgoU1@mid.individual.net>
Lew wrote:
> John Thingstad wrote:

> > Perl is solidly based in the UNIX world on awk, sed, bash and C.
> > I don't like the style, but many do.

> Please exclude the Java newsgroups from this discussion.

Why? Do you speak for everyone in that, this, or other groups?


-- 
G.Etly 
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <u92dncc7VuoxE93VnZ2dnUVZ_hydnZ2d@comcast.com>
Gordon Etly wrote:
> Lew wrote:
>> John Thingstad wrote:
> 
>>> Perl is solidly based in the UNIX world on awk, sed, bash and C.
>>> I don't like the style, but many do.
> 
>> Please exclude the Java newsgroups from this discussion.
> 
> Why? Do you speak for everyone in that, this, or other groups?

I don't know why you'd even want to impose your Perl conversation on the Java 
group in the first place, troll.

Plonk.

-- 
Lew
From: Stephan Bour
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <Zf00k.2464$uE5.1601@flpi144.ffdc.sbc.com>
Lew wrote:
} John Thingstad wrote:
} > Perl is solidly based in the UNIX world on awk, sed, bash and C.
} > I don't like the style, but many do.
}
} Please exclude the Java newsgroups from this discussion.

Did it ever occur to you that you don't speak for entire news groups?


Stephan. 
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <4840ab73$0$90274$14726298@news.sunsite.dk>
Stephan Bour wrote:
> Lew wrote:
> } John Thingstad wrote:
> } > Perl is solidly based in the UNIX world on awk, sed, bash and C.
> } > I don't like the style, but many do.
> }
> } Please exclude the Java newsgroups from this discussion.
> 
> Did it ever occur to you that you don't speak for entire news groups?

Did it occur to you that there is nothing about Java in the above?

Arne
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1qobk02jbj@news4.newsguy.com>
Arne Vajhøj wrote:
> Stephan Bour wrote:
>> Lew wrote:
>> } John Thingstad wrote:
>> } > Perl is solidly based in the UNIX world on awk, sed, bash and C.
>> } > I don't like the style, but many do.
>> }
>> } Please exclude the Java newsgroups from this discussion.
>>
>> Did it ever occur to you that you don't speak for entire news groups?
>
> Did it occur to you that there are nothing about Java in the above ?

Looking at the original post, it doesn't appear to be about any specific 
language.

-- 
szr 
From: Peter Duniho
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <op.ub0cq1fy8jd0ej@petes-computer.local>
On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE> wrote:

> Arne Vajhøj wrote:
>> Stephan Bour wrote:
>>> Lew wrote:
>>> } John Thingstad wrote:
>>> } > Perl is solidly based in the UNIX world on awk, sed, bash and C.
>>> } > I don't like the style, but many do.
>>> }
>>> } Please exclude the Java newsgroups from this discussion.
>>>
>>> Did it ever occur to you that you don't speak for entire news groups?
>>
>> Did it occur to you that there are nothing about Java in the above ?
>
> Looking at the original post, it doesn't appear to be about any specific
> language.

Indeed.  That suggests it's probably off-topic in most, if not all, of the  
newsgroups to which it was posted, inasmuch as they exist for topics  
specific to a given programming language.

Regardless, unless you are actually reading this thread from the c.l.j.p  
newsgroup, I'm not sure I see the point in questioning someone who _is_  
about whether the thread belongs there or not.  Someone who is actually  
following the thread from c.l.j.p can speak up if they feel that Lew is  
overstepping his bounds.  Anyone else has even less justification for  
"speaking for the entire newsgroup" than Lew does, and yet that's what  
you're doing when you question his request.

And if it's a vote you want, mark me down as the third person reading  
c.l.j.p that doesn't feel this thread belongs.  I don't know whether Lew  
speaks for the entire newsgroup, but based on comments so far, it's pretty  
clear that there unanimous agreement among those who have expressed an  
opinion.

If you all in the other newsgroups are happy having the thread there,  
that's great.  Please feel free to continue with your discussion.  But  
please, drop comp.lang.java.programmer from the cross-posting.

Pete
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1ru8v0v3l@news4.newsguy.com>
Peter Duniho wrote:
> On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE> wrote:
>
>> Arne Vajhøj wrote:
>>> Stephan Bour wrote:
>>>> Lew wrote:
>>>> } John Thingstad wrote:
>>>> } > Perl is solidly based in the UNIX world on awk, sed, } > bash 
>>>> and C. I don't like the style, but many do.
>>>> }
>>>> } Please exclude the Java newsgroups from this discussion.
>>>>
>>>> Did it ever occur to you that you don't speak for entire news
>>>> groups?
>>>
>>> Did it occur to you that there are nothing about Java in the above ?
>>
>> Looking at the original post, it doesn't appear to be about any
>> specific language.
>
> Indeed.  That suggests it's probably off-topic in most, if not all,
> of the newsgroups to which it was posted, inasmuch as they exist for
> topics specific to a given programming language.

Perhaps - comp.programming might have been a better place, but not all 
people who follow groups for specific languages follow a general group 
like that - but let me ask you something. What is it you really have 
against discussing topics with people of neighboring groups? Keep in 
mind you don't have to read anything you do not want to read. [1]

> Regardless, unless you are actually reading this thread from the
> c.l.j.p newsgroup, I'm not sure I see the point in questioning
> someone who _is_ about whether the thread belongs there or not.

I would rather have the OP comment about that, as he started the thread. 
But what gets me is why you are against that specific group being 
included but not others? What is so special about the Java group and why 
are you so sure people there don't want to read this thread? [1] What 
right do you or I or anyone have to make decisions for everyone in a 
news group? Isn't this why most news readers allow one to block a 
thread?

> And if it's a vote you want, mark me down as the third person reading
> c.l.j.p that doesn't feel this thread belongs.  I don't know whether
> Lew speaks for the entire newsgroup, but based on comments so far,
> it's pretty clear that there unanimous agreement among those who have
> expressed an opinion.

Ok, so, perhaps 3 people out of what might be several hundred, if not 
thousands (there is no way to really know, but there are certainly a lot 
of people who read that group, and as with any group, there are far more 
readers than there are people posting). So, again, just because you or 
two other people don't want to read a topic or dislike it, you feel you 
can decide for EVERYONE that they mustn't read it? Again, this is why 
readers allow you to ignore threads. Please don't force your views 
on others; let them decide for themselves. [1]


[1] I do not mean this topic specifically,  but in general,  if one
    dislikes a thread, they are free to ignore it. I find it rather
    inconsiderate to attempt to force a decision for everyone, when
    one has the ability to simply ignore the thread entirely.


-- 
szr 
From: Jürgen Exner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <kjv2441a9im6bcistqcav6c87j4024s2pj@4ax.com>
"szr" <·····@szromanMO.comVE> wrote:
>I would rather have the OP comment about that, as he started the thread. 

The OP is a very well-known troll who has the habit of spitting out a
borderline OT article to a bunch of loosely related NGs ever so often and
then sits back and enjoys the complaints and counter-complaints of the
regulars. He doesn't provide anything useful in any of the groups he
targets (at least AFAIK) and he doesn't participate in the resulting
mayhem himself, either. 
He will only go away if everyone just ignores him.

With this in mind another reminder:

         +-------------------+             .:\:\:/:/:.
         |   PLEASE DO NOT   |            :.:\:\:/:/:.:
         |  FEED THE TROLLS  |           :=.' -   - '.=:
         |                   |           '=(\ 9   9 /)='
         |   Thank you,      |              (  (_)  )
         |       Management  |              /`-vvv-'\
         +-------------------+             /         \
                 |  |        @@@          / /|,,,,,|\ \
                 |  |        @@@         /_//  /^\  \\_\
   @·@@·@        |  |         |/         WW(  (   )  )WW
   \||||/        |  |        \|           __\,,\ /,,/__
    \||/         |  |         |      jgs (______Y______)
/\/\/\/\/\/\/\/\//\/\\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
==============================================================

Follow-up adjusted.

jue
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1s5s7019mo@news4.newsguy.com>
Jürgen Exner wrote:
> "szr" <·····@szromanMO.comVE> wrote:
>> I would rather have the OP comment about that, as he started the
>> thread.
>
> The OP is a very well-known troll who has the habit of spitting out a
> borderline OT article to a bunch of loosly related NGs ever so often
> and then sits back and enjoys the complaints and counter-complaints
> of the regulars.

While I agree cross-posting should be chosen more carefully, it seemed 
like a harmless article to me. I did not get the impression he was just 
trolling. There are people who like to post articles they come across 
and maybe want to start a discussion on.

Like I said, this may have been better placed in a more general 
programming group, and yes, it is kind of OT for specific language 
groups, but I really saw no harm in it (and I saw no one try to redirect 
the discussion to a more general group), and you and anyone else are 
free to ignore the thread. All I ask is that you allow people to make up 
their own minds.

-- 
szr 
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <4841f0aa$0$90274$14726298@news.sunsite.dk>
szr wrote:
> Peter Duniho wrote:
>> On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE> wrote:
>>> Arne Vajhøj wrote:
>>>> Stephan Bour wrote:
>>>>> Lew wrote:
>>>>> } John Thingstad wrote:
>>>>> } > Perl is solidly based in the UNIX world on awk, sed, } > bash 
>>>>> and C. I don't like the style, but many do.
>>>>> }
>>>>> } Please exclude the Java newsgroups from this discussion.
>>>>>
>>>>> Did it ever occur to you that you don't speak for entire news
>>>>> groups?
>>>> Did it occur to you that there are nothing about Java in the above ?
>>> Looking at the original post, it doesn't appear to be about any
>>> specific language.
>> Indeed.  That suggests it's probably off-topic in most, if not all,
>> of the newsgroups to which it was posted, inasmuch as they exist for
>> topics specific to a given programming language.
> 
> Perhaps - comp.programming might of been a better place, but not all 
> people who follow groups for specific languages follow a general group 
> like that - but let me ask you something. What is it you really have 
> against discussing topics with people of neighboring groups? Keep in 
> mind you don't have to read anything you do not want to read. [1]

I very much doubt that the original thread is relevant for the Java
group.

But the subthread Lew commented on was about Perl and Unix. That is
clearly off topic.

Personally I am rather tolerant about topics. But I cannot blame Lew
for requesting that a Perl-Unix discussion not be cross-posted
to a Java group.

>> Regardless, unless you are actually reading this thread from the
>> c.l.j.p newsgroup, I'm not sure I see the point in questioning
>> someone who _is_ about whether the thread belongs there or not.
> 
> I would rather have the OP comment about that, as he started the thread. 
> But what gets me is why you are against that specific group being 
> included but not others? What is so special about the Java group and why 
> are you so sure people there don't want to read this thread? [1] What 
> right do you or I or anyone have to make decisions for everyone in a 
> news group? Isn't this why most news readers allow one to block a 
> thread?

I doubt Lew read any of the other groups, so it seems quite
natural that he did not comment on the on/off topic characteristics
in those.

>> And if it's a vote you want, mark me down as the third person reading
>> c.l.j.p that doesn't feel this thread belongs.  I don't know whether
>> Lew speaks for the entire newsgroup, but based on comments so far,
>> it's pretty clear that there unanimous agreement among those who have
>> expressed an opinion.
> 
> Ok, so, perhaps 3 people out of what might be several hundred, if not 
> thousand (there is no way to really know, but there are certainly a lot 
> of people who read that group, and as with any group, there are far more 
> readers than there are people posting, so, again, just because you or 
> two other people or so don't want to read a topic or dislike it, you 
> feel you can decide for EVERYONE they mustn't read it? Again, this is 
> why readers allow you to ignore threads. Please don't force your views 
> on others; let them decide for themselves. [1]

And I am sure that Lew did not intend to pretend to speak for
the entire group. He spoke for himself.

I believe there have been several posts that agreed with him and none
that disagreed, so it seems very plausible that the group indeed agrees
with him.

Arguing that a huge silent majority has a different opinion
from those speaking up is a very questionable argument. Anybody
could claim to count them toward their own view. The only reasonable
thing is not to count them at all.

Arne
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1tfgn030l@news4.newsguy.com>
Arne Vajhøj wrote:
> szr wrote:
>> Peter Duniho wrote:
>>> On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE>
>>> wrote:
>>>> Arne Vajhøj wrote:
>>>>> Stephan Bour wrote:
>>>>>> Lew wrote:
>>>>>> } John Thingstad wrote:
>>>>>> } > Perl is solidly based in the UNIX world on awk, sed, } > bash
>>>>>> and C. I don't like the style, but many do.
>>>>>> }
>>>>>> } Please exclude the Java newsgroups from this discussion.
>>>>>>
>>>>>> Did it ever occur to you that you don't speak for entire news
>>>>>> groups?
>>>>> Did it occur to you that there are nothing about Java in the
>>>>> above ?
>>>> Looking at the original post, it doesn't appear to be about any
>>>> specific language.
>>> Indeed.  That suggests it's probably off-topic in most, if not all,
>>> of the newsgroups to which it was posted, inasmuch as they exist for
>>> topics specific to a given programming language.
>>
>> Perhaps - comp.programming might of been a better place, but not all
>> people who follow groups for specific languages follow a general
>> group like that - but let me ask you something. What is it you
>> really have against discussing topics with people of neighboring
>> groups? Keep in mind you don't have to read anything you do not want
>> to read. [1]
>
> I very much doubt that the original thread is relevant for the Java
> group.
>
> But the subthread Lew commente don was about Perl and Unix. That is
> clearly off topic.

I agree with and understand what you are saying in general, but still, 
isn't it possible that there are people in the Java group (and others) 
who might have been following the thread, only to discover (probably not 
right away) that someone decided to remove the group they were reading 
the thread from? I know I would not like that, even if it wasn't on 
topic in that branch.

Personally, I find it very annoying to have to switch news groups in 
order to resume a thread and weed my way down the thread to where it 
left off before it was cut off from the previous group.
-- 
szr 
From: Peter Duniho
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <op.ub14yju48jd0ej@petes-computer.local>
On Sat, 31 May 2008 23:27:35 -0700, szr <·····@szromanMO.comVE> wrote:

> [...]
>> But the subthread Lew commente don was about Perl and Unix. That is
>> clearly off topic.
>
> I agree with and understand what you are saying in general, but still,
> isn't it possible that were are people in the java group (and others)
> who might of been following the thread, only to discover (probably not
> right away) that someone decided to remove the group they were reading
> the thread from? I know I would not like that, even if it wasn't on
> topic at the branch.

All due respect, I don't really care if those people find the thread  
gone.  And no one should.

Each individual person has a wide variety of interests.  A thread that is  
off-topic in a newsgroup may in fact concern a topic of interest to  
someone who just happened to be reading that newsgroup.  The fact that  
that person might have been interested in it isn't justification for  
continuing the thread in that newsgroup.  The most important question  
isn't who might have been reading the thread, but rather whether the  
thread is on-topic.

What if someone cross-posted a thread about motorcycle racing in the Perl  
newsgroup as well as an actual motorcycle racing newsgroup?  No doubt, at  
least some people reading the Perl newsgroup have an interest in  
motorcycle racing.  They may in fact be racers themselves.  Those people  
may have found the thread about motorcycle racing interesting.

Does that justify the thread continuing to be cross-posted to the Perl  
newsgroup?  No, of course not.

So please.  Quit trying to justify a thread being cross-posted to a  
newsgroup that you aren't even reading just on the sole basis of the  
remote possibility that someone in that newsgroup was interested in the  
thread.  It's not a legitimate justification, and even if it were, there's  
been sufficient opportunity for someone here in the Java newsgroup to  
speak up and say "hey, wait!  I was reading that!"

But no one's said anything of the sort.  Those people who don't exist have  
no need for you to provide an irrelevant defense for them.

> Personally, I find it very annoying to have to switch news groups in
> order to resume a thread and weed my way down the thread to where it
> left off before it was cut off from the previous group.

If people use the newsgroups responsibly, that never happens.

A thread should never be cut off midstream like that unless it was  
inappropriately cross-posted in the first place, and if the thread was  
inappropriately cross-posted in the first place, no one has any business  
expecting to be able to continue reading it in any newsgroup where it's  
off-topic.

If you're interested in discussions on Perl and Unix, go read a newsgroup  
about Perl and/or Unix.  Don't look for those discussions in the Java  
newsgroup, and don't get comfy reading the thread in the Java newsgroup  
should you happen across it.  They don't belong, and they should be  
terminated within the Java newsgroup ASAP.  Go follow the thread where  
it's on-topic.

Pete
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1tmbj095o@news4.newsguy.com>
Peter Duniho wrote:
> On Sat, 31 May 2008 23:27:35 -0700, szr <·····@szromanMO.comVE> wrote:
>
>> [...]
>>> But the subthread Lew commente don was about Perl and Unix. That is
>>> clearly off topic.
>>
>> I agree with and understand what you are saying in general, but
>> still, isn't it possible that were are people in the java group (and
>> others) who might of been following the thread, only to discover
>> (probably not right away) that someone decided to remove the group
>> they were reading the thread from? I know I would not like that,
>> even if it wasn't on topic at the branch.
>
> All due respect, I don't really care if those people find the thread
> gone.  And no one should.

I prefer to be considerate of others.

> Each individual person has a wide variety of interests.  A thread
> that is off-topic in a newsgroup may in fact be concerning a topic of
> interest for someone who just happened to be reading that newsgroup.

Well, if a thread has absolutely no relation to a group, then yes, 
cross-posting to said group is inappropriate, and setting follow-ups may 
well be warranted. But when there is some relation, it may sometimes be 
better to mark it as [OT] in the subject line, a practice that is 
sometimes seen and seems to suffice.

> What if someone cross-posted a thread about motorcycle racing in the
> Perl newsgroup as well as an actual motorcycle racing newsgroup?

You are comparing apples and oranges now; sure, if you post about 
motorcycles (to use your example) it would be wildly off topic, but the 
thread in question related to programming (the naming of functions 
and such) in general.

> Does that justify the thread continuing to be cross-posted to the Perl
> newsgroup?  No, of course not.

But who decides this? And why does said individual get to decide for 
everyone?

> So please.  Quit trying to justify a thread being cross-posted to a
> newsgroup that you aren't even reading

You do not know what groups I read. And I am not attempting to justify 
cross posting at all. Rather I am arguing against deciding for a whole 
news group when a thread should be discontinued.

-- 
szr 
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48432b01$0$90272$14726298@news.sunsite.dk>
szr wrote:
> Arne Vajhøj wrote:
>> szr wrote:
>>> Peter Duniho wrote:
>>>> On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE>
>>>> wrote:
>>>>> Arne Vajhøj wrote:
>>>>>> Stephan Bour wrote:
>>>>>>> Lew wrote:
>>>>>>> } John Thingstad wrote:
>>>>>>> } > Perl is solidly based in the UNIX world on awk, sed, } > bash
>>>>>>> and C. I don't like the style, but many do.
>>>>>>> }
>>>>>>> } Please exclude the Java newsgroups from this discussion.
>>>>>>>
>>>>>>> Did it ever occur to you that you don't speak for entire news
>>>>>>> groups?
>>>>>> Did it occur to you that there are nothing about Java in the
>>>>>> above ?
>>>>> Looking at the original post, it doesn't appear to be about any
>>>>> specific language.
>>>> Indeed.  That suggests it's probably off-topic in most, if not all,
>>>> of the newsgroups to which it was posted, inasmuch as they exist for
>>>> topics specific to a given programming language.
>>> Perhaps - comp.programming might of been a better place, but not all
>>> people who follow groups for specific languages follow a general
>>> group like that - but let me ask you something. What is it you
>>> really have against discussing topics with people of neighboring
>>> groups? Keep in mind you don't have to read anything you do not want
>>> to read. [1]
>> I very much doubt that the original thread is relevant for the Java
>> group.
>>
>> But the subthread Lew commente don was about Perl and Unix. That is
>> clearly off topic.
> 
> I agree with and understand what you are saying in general, but still, 
> isn't it possible that were are people in the java group (and others) 
> who might of been following the thread, only to discover (probably not 
> right away) that someone decided to remove the group they were reading 
> the thread from? I know I would not like that, even if it wasn't on 
> topic at the branch.
> 
> Personally, I find it very annoying to have to switch news groups in 
> order to resume a thread and weed my way down the thread to where it 
> left off before it was cut off from the previous group.

I am relatively tolerant towards threads that are a bit off topic, if
the overall S/N ratio is good.

But I accept and respect that other people have a stricter
attitude towards off-topic posts.

And I have very little tolerance for people who think they
can attack those who want only on-topic posts.

It is one thing to ask for a bit of slack regarding the rules;
it is something else to attack those who want the rules
kept.

Arne
From: szr
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g201dm02c7@news4.newsguy.com>
Arne Vajhøj wrote:
> szr wrote:
>> Arne Vajhøj wrote:
>>> szr wrote:
>>>> Peter Duniho wrote:
>>>>> On Fri, 30 May 2008 22:40:03 -0700, szr <·····@szromanMO.comVE>
>>>>> wrote:
>>>>>> Arne Vajhøj wrote:
>>>>>>> Stephan Bour wrote:
>>>>>>>> Lew wrote:
>>>>>>>> } John Thingstad wrote:
>>>>>>>> } > Perl is solidly based in the UNIX world on awk, sed, } >
>>>>>>>> bash and C. I don't like the style, but many do.
>>>>>>>> }
>>>>>>>> } Please exclude the Java newsgroups from this discussion.
>>>>>>>>
>>>>>>>> Did it ever occur to you that you don't speak for entire news
>>>>>>>> groups?
>>>>>>> Did it occur to you that there are nothing about Java in the
>>>>>>> above ?
>>>>>> Looking at the original post, it doesn't appear to be about any
>>>>>> specific language.
>>>>> Indeed.  That suggests it's probably off-topic in most, if not
>>>>> all, of the newsgroups to which it was posted, inasmuch as they
>>>>> exist for topics specific to a given programming language.
>>>> Perhaps - comp.programming might of been a better place, but not
>>>> all people who follow groups for specific languages follow a
>>>> general group like that - but let me ask you something. What is it
>>>> you really have against discussing topics with people of
>>>> neighboring groups? Keep in mind you don't have to read anything
>>>> you do not want to read. [1]
>>> I very much doubt that the original thread is relevant for the Java
>>> group.
>>>
>>> But the subthread Lew commente don was about Perl and Unix. That is
>>> clearly off topic.
>>
>> I agree with and understand what you are saying in general, but
>> still, isn't it possible that were are people in the java group (and
>> others) who might of been following the thread, only to discover
>> (probably not right away) that someone decided to remove the group
>> they were reading the thread from? I know I would not like that,
>> even if it wasn't on topic at the branch.
>>
>> Personally, I find it very annoying to have to switch news groups in
>> order to resume a thread and weed my way down the thread to where it
>> left off before it was cut off from the previous group.
>
> I am relative tolerant towards threads that are a bit off topic, if
> the S/N ratio overall is good.

Agreed.

[...]

If a cross-posted thread branches off on a tangent that has 
nothing whatsoever to do with one or more of the groups, then yes, it makes 
sense to prune the 'Newsgroups:' list / set follow-ups, but in this case, 
someone made one mention or so of 'Perl', which was being used as an 
example, and someone (Lew) moved to have the Java group removed.

There was little reason to cut off the thread, which people may very well 
have been following, over the utterance of one word that was being 
used as an example. The bulk of the thread had to do with general 
programming, and merely writing the name of a language doesn't mean 
it has gone way off on a tangent.

I hope this clears things up a bit.

Regards.

-- 
szr 
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <4841edb4$0$90265$14726298@news.sunsite.dk>
szr wrote:
> Arne Vajhøj wrote:
>> Stephan Bour wrote:
>>> Lew wrote:
>>> } John Thingstad wrote:
>>> } > Perl is solidly based in the UNIX world on awk, sed, bash and C.
>>> } > I don't like the style, but many do.
>>> }
>>> } Please exclude the Java newsgroups from this discussion.
>>>
>>> Did it ever occur to you that you don't speak for entire news groups?
>> Did it occur to you that there are nothing about Java in the above ?
> 
> Looking at the original post, it doesn't appear to be about any specific 
> language.

That does not make it on topic in the Java group.

And the subthread Lew commented on most certainly is not.

Arne
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008jun04-005@yahoo.com>
> From: ·······@panix.com (David Combs)
> Lisp is *so* early a language (1960?), preceeded mainly only by
> Fortran (1957?)?, and for sure the far-and-away the first as a
> platform for *so many* concepts of computer-science, eg lexical vs
> dynamic ("special") variables, passing *unnamed* functions as
> args ... maybe is still the only one in which program and data
> have the same representation -- that it'd seem logical to use it's
> terminology in all languages.

Yeah, but why did you cross-post to so many newsgroups? Are you
trying to run a flame war between advocates of the various
languages? (Same accusation to the OP moreso!)

> From C is the very nice distinction between "formal" and "actual" args.

I think Lisp already had that nearly 50 years ago. Function
definition (lambda expression) has formal args, EVAL recursively
calls EVAL on sub-forms to create actual args and calls APPLY on
them and whatever function is named in the CAR position of the form.
Whether anybody bothered to use that specific jargon, or it was
just so obvious it didn't need jargon, I don't know.
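
To make the formal/actual distinction concrete, here is a small Common
Lisp illustration (my own, not from the original post; the name *ADD3*
is made up):

  ;; X, Y and Z are the FORMAL args named in the lambda expression.
  (defparameter *add3* (lambda (x y z) (+ x y z)))

  ;; EVALing the sub-forms at the call site yields the ACTUAL args 6, 6 and 1,
  ;; which APPLY then hands to the function:
  (apply *add3* (mapcar #'eval '((* 2 3) (- 10 4) 1)))   ;=> 13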

> And from algol-60, own and local -- own sure beats "static"!

Yeah. But now that you mention it and I think about it, what's
really meant is "private persistent". Global variables are public
persistent. Local variables and formal args to functions are
private transient (they go away as soon as the function returns).
But OWN variables are private to the function yet stay around
"forever" just like globals do, so that side effects on the OWN
variables that occurred during one call can persist to affect the
next call. Lexical closures in Common Lisp go one step further,
allowing private persistent variables to be shared between several
functions. All those functions share access to the private variable
which they co-OWN. Another way in which OWN or lexical-closure
variables aren't like what the word "own" means in ordinary
language is that it's possible to transfer ownership by selling or
giving something to somebody else, but not with OWN variables or
lexical-closure variables. So even though I like the word OWN
better than the word STATIC for this meaning, I'm not totally
comfortable with that jargon. But "persistent private" is a
mouthful compared to "OWN", and I doubt anyone can find a word of
appx. 3 characters that conveys the intended meaning so we're
probably stuck with "OWN" as the best short term.
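
As a concrete illustration of that "co-OWNed" private persistent
variable (my own sketch, with invented names, not part of the original
post), Common Lisp lexical closures can give several functions shared
access to a variable nobody else can touch:

  ;; COUNT persists "forever" like an OWN/static variable, but it is
  ;; visible only to the two closures that co-OWN it.
  (let ((count 0))
    (defun bump-counter () (incf count))
    (defun reset-counter () (setq count 0)))

  ;; (bump-counter) => 1, (bump-counter) => 2, (reset-counter) => 0
  ;; No other code can read or assign COUNT directly.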
From: Jon Harrop
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <L7qdnb6VG6dsX9rVnZ2dnUVZ8tfinZ2d@posted.plusnet>
Robert Maas, http://tinyurl.com/uh3t wrote:
>> From: ·······@panix.com (David Combs)
>> Lisp is *so* early a language (1960?), preceeded mainly only by
>> Fortran (1957?)?, and for sure the far-and-away the first as a
>> platform for *so many* concepts of computer-science, eg lexical vs
>> dynamic ("special") variables, passing *unnamed* functions as
>> args ... maybe is still the only one in which program and data
>> have the same representation -- that it'd seem logical to use it's
>> terminology in all languages.
> 
> Yeah, but why did you cross-post to so many newsgroups? Are you
> trying to run a flame war between advocates of the various
> languages?

What would be the point? We all know that Java, Perl, Python and Lisp suck.
They don't even have pattern matching over algebraic sum types if you can
imagine that. How rudimentary...

-- 
Dr Jon D Harrop, Flying Frog Consultancy
http://www.ffconsultancy.com/products/?u
From: ···················@gmail.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <c9120a7e-5e21-4835-9211-bd83bc958a7c@i76g2000hsf.googlegroups.com>
On 5 Jun, 12:37, Jon Harrop <····@ffconsultancy.com> wrote:
> [...]

P.S. Please don't look at my profile (at google groups), thanks!

Jon Harrop
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008jun30-001@yahoo.com>
Why this response is so belated:
  <http://groups.google.com/group/misc.misc/msg/cea714440e591dd2>
= <······················@yahoo.com>
> Date: Thu, 5 Jun 2008 06:17:01 -0700 (PDT)
> From: ···················@gmail.com
> P.S. Please don't look at my profile (at google groups), thanks!

Please don't look at the orange and green checkered elephant
playing a harp off-key while sitting on a toilet and passing wind.
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008jun29-003@yahoo.com>
Why this response is so belated:
  <http://groups.google.com/group/misc.misc/msg/cea714440e591dd2>
= <······················@yahoo.com>
> Date: Thu, 05 Jun 2008 11:37:48 +0100
> From: Jon Harrop <····@ffconsultancy.com>
> We all know that Java, Perl, Python and Lisp suck.

Well at least you're three-quarters correct there.
But Lisp doesn't suck. That's where you're one quarter wrong.

> They don't even have pattern matching over algebraic sum types if
> you can imagine that.

I'm still waiting for you to precisely define what you mean by
"pattern matching" in this context. All I've heard from you so-far
are crickets. If you've set up a Web page with your personal
definition of "pattern matching" as you've been using that term
here, and you've posted its URL in a newsgroup article I didn't
happen to see, please post just the URL again here so that I might
finally see it. Or e-mail me the URL.

-
Nobody in their right mind likes spammers, nor their automated assistants.
To open an account here, you must demonstrate you're not one of them.
Please spend a few seconds to try to read the text-picture in this box:

/----------------------------------------------------------------------------\
| ,-.-.                                               |                      |
| | | |,   .    ,---.,---.,---.    ,---.,---.,---.,---|,---.                 |
| | | ||   |    `---.|   ||   |    |   ||---'|---'|   |`---.                 |
| ` ' '`---|    `---'`---'`   '    `   '`---'`---'`---'`---'                 |
|      `---'                                                                 |
| |         |             o              |                 |                 |
| |---.,---.|    ,---.    .,---.    ,---.|    ,---.,---.   |---.,---.,---.   |
| |   ||---'|    |   |    ||   |    ,---||    |   ||---'---|   ||    ,---|   |
| `   '`---'`---'|---'    ``   '    `---^`---'`---|`---'   `---'`    `---^ | |
|                |                            `---'                       '  |
|           |                           |             |                      |
| ,---.,---.|---     . . .,---.,---.,---|,---.,---.   |---.,---.,---.        |
| |   ||   ||        | | ||   ||   ||   ||---'|    ---|   ||    ,---|        |
| `   '`---'`---'    `-'-'`---'`   '`---'`---'`       `---'`    `---^o       |
\--------(Rendered by means of <http://www.schnoggo.com/figlet.html>)--------/
     (You don't need JavaScript or images to see that ASCII-text image!!
      You just need to view this in a fixed-pitch font such as Monaco.)

Then enter your best guess of the text (40-50 chars) into this TextField:
          +--------------------------------------------------+
          |                                                  |
          +--------------------------------------------------+
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <yP-dnVoNZfgnVfXVnZ2dnUVZ_sidnZ2d@comcast.com>
Robert Maas wrote:
> /----------------------------------------------------------------------------\
> | ,-.-.                                               |                      |
> | | | |,   .    ,---.,---.,---.    ,---.,---.,---.,---|,---.                 |
> | | | ||   |    `---.|   ||   |    |   ||---'|---'|   |`---.                 |
> | ` ' '`---|    `---'`---'`   '    `   '`---'`---'`---'`---'                 |
> |      `---'                                                                 |
> | |         |             o              |                 |                 |
> | |---.,---.|    ,---.    .,---.    ,---.|    ,---.,---.   |---.,---.,---.   |
> | |   ||---'|    |   |    ||   |    ,---||    |   ||---'---|   ||    ,---|   |
> | `   '`---'`---'|---'    ``   '    `---^`---'`---|`---'   `---'`    `---^ | |
> |                |                            `---'                       '  |
> |           |                           |             |                      |
> | ,---.,---.|---     . . .,---.,---.,---|,---.,---.   |---.,---.,---.        |
> | |   ||   ||        | | ||   ||   ||   ||---'|    ---|   ||    ,---|        |
> | `   '`---'`---'    `-'-'`---'`   '`---'`---'`       `---'`    `---^o       |
> \--------(Rendered by means of <http://www.sc*****************.html>)--------/
>      (You don't need JavaScript or images to see that ASCII-text image!!
>       You just need to view this in a fixed-pitch font such as Monaco.)
> 
> Then enter your best guess of the text (40-50 chars) into this TextField:
>           +--------------------------------------------------+
>           | Your son totally needs a Wonder-Bra(r), double-D |
>           +--------------------------------------------------+

-- 
Lew
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48617847$0$5021$607ed4bc@cv.net>
David Combs wrote:
> passing
> *unnamed* functions as args (could Algol 60 also do something like that,
> via something it maybe termed a "thunk")

No, the "thunks" were necessary at the machine-language level to 
/implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.

-- 
John W. Kennedy
  "The first effect of not believing in God is to believe in anything...."
   -- Emile Cammaerts, "The Laughing Prophet"
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008jun30-002@yahoo.com>
Why this response is so belated:
  <http://groups.google.com/group/misc.misc/msg/cea714440e591dd2>
= <······················@yahoo.com>
> Date: Tue, 24 Jun 2008 18:42:15 -0400
> From: John W Kennedy <·······@attglobal.net>
> ... the "thunks" were necessary at the machine-language level to
> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.

Ah, thanks for the clarification. Is that info in the appropriate
WikiPedia page? If not, maybe you would edit it in?
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <486ae06b$0$5011$607ed4bc@cv.net>
Robert Maas, http://tinyurl.com/uh3t wrote:
> Why this response is so belated:
>   <http://groups.google.com/group/misc.misc/msg/cea714440e591dd2>
> = <······················@yahoo.com>
>> Date: Tue, 24 Jun 2008 18:42:15 -0400
>> From: John W Kennedy <·······@attglobal.net>
>> ... the "thunks" were necessary at the machine-language level to
>> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
> 
> Ah, thanks for the clarification. Is that info in the appropriate
> WikiPedia page? If not, maybe you would edit it in?

It is explained s.v. "thunk", which is referenced from "ALGOL 60". The 
ALGOL "pass-by-name" argument/parameter matching was perhaps the most 
extreme example ever of a language feature that was "elegant" but 
insane. What it meant, in effect, was that, unless otherwise marked, 
every argument was passed as two closures, one that returned a fresh 
evaluation of the expression given as the argument, which was called 
every time the parameter was read, and one that set the argument to a 
new value, which was called every time the parameter was set.

See <URL:http://www.cs.sfu.ca/~cameron/Teaching/383/PassByName.html>.

ALGOL 60 could not create generalized user-written closures, but could 
create one no more complex than a single expression with no arguments of 
its own simply by passing the expression as an argument. But it was not 
thought of as a closure; that was just how ALGOL 60 did arguments.
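
A rough Common Lisp emulation of that two-closure arrangement (my own
sketch, with invented names; it is not ALGOL, and assigning to an actual
argument that is not a plain variable is simply left out of the sketch):

  (defmacro by-name (var)
    "Pass the variable VAR by name: one closure to read it, one to set it."
    `(cons (lambda () ,var)
           (lambda (new) (setf ,var new))))

  (defmacro by-name-expr (expr)
    "Read-only call by name, for actual arguments that are not assignable."
    `(cons (lambda () ,expr) nil))

  (defun name-ref (arg) (funcall (car arg)))         ; read the parameter
  (defun name-set (arg val) (funcall (cdr arg) val)) ; assign to the parameter

  ;; Jensen's device: sum TERM for I = LO..HI, with both I and TERM by name.
  (defun jensen-sum (i lo hi term)
    (let ((acc 0))
      (loop for k from lo to hi
            do (name-set i k)                ; assignment goes through the setter
               (incf acc (name-ref term)))   ; the argument expression is re-evaluated
      acc))

  ;; (let ((x 0))
  ;;   (jensen-sum (by-name x) 1 5 (by-name-expr (* x x))))  ;=> 55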
-- 
John W. Kennedy
  "Give up vows and dogmas, and fixed things, and you may grow like 
That. ...you may come to think a blow bad, because it hurts, and not 
because it humiliates.  You may come to think murder wrong, because it 
is violent, and not because it is unjust."
   -- G. K. Chesterton.  "The Ball and the Cross"
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008jul20-005@yahoo.com>
> >> ... the "thunks" were necessary at the machine-language level to
> >> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
> > Ah, thanks for the clarification. Is that info in the appropriate
> > WikiPedia page? If not, maybe you would edit it in?
> From: John W Kennedy <·······@attglobal.net>
> It is explained s.v. "thunk", which is referenced from "ALGOL
> 60". The ALGOL "pass-by-name" argument/parameter matching was
> perhaps the most extreme example ever of a language feature that
> was "elegant" but insane. What it meant, in effect, was that,
> unless otherwise marked, every argument was passed as two closures,
> one that returned a fresh evaluation of the expression given as the
> argument, which was called every time the parameter was read, and
> one that set the argument to a new value, which was called every
> time the parameter was set.

Wow! All these years when I occasionally heard of a "thunk" I never
was told, until now, what it really meant. Thanks for the info!!

Followup question #1: I assume these are lexical closures in the
environment of the point of the call, right?

Followup question #2: For simple arithmetic expressions, I can
possibly understand how the UPDATE closure might be implemented
(expressed in Lisp to make the intent clear):
  Call form:  MyFunction(X+2);
  GET closure:  (+ closedX 2)
  UPDATE closure:  (lambda (newval) (setf closedX (- newval 2)))
Thus from inside MyFunction where formal parameter Arg1 is bound
to actual parameter X+2, after doing Arg1 := 7; X will have the
value 5 so that calling Arg1 will return 7 as expected, right?
But if the actual argument is something complicated, especially if
it makes a nested function call, how can that possibly be
implemented? Given an arbitrary expression that calls some external
function, how can assigning a value to that expression make
sufficient changes in the runtime environment such that
subsequently evaluating that expression will yield the expected
value i.e. the value that had been assigned?
Or is the default of passing two closures (GET and UPDATE) *only*
if the actual-argument expression is simple enough that it's
invertible, and in complicated cases only a GET closure is passed
(or the UPDATE closure is simply a signal of a runtime error
 that you're not allowed to assign a value to a complicated expression)?

IMO the "right" way to pass parameters that can be modified is to
use "locatives" as in the Lisp Machine. That converts the idea of
a "place" (as used by SETF in Common Lisp) into a "first class
citizen" which can be passed around and stored etc., compared to a
SETF place which is merely a compile-time macro trick to convert
place-references in source code into direct calls to the
appropriate accessor just above the place followed by a specialized
setter call to do the act. A hack to emulate a locative in CL would
be to pass a closure that bundles the code to find the object directly
containing the place, any parameters needed to find that place,
and the function needed to perform the act. Then the called
function would need to know it's going to get such a thunk-like
closure, but since it's expecting to modify one of its parameters
anyway, that's reasonable. Sketch of implementation (two special cases):
(defun make-thunk-cadr (topptr)
  (let* ((midptr (cdr topptr))
         (getclo (make-getter-closure :PARENT midptr :GETTERFN #'car
                                      :PARMS nil))
         (setclo (make-setter-closure :PARENT midptr :SETTERFN #'rplaca
                                      :PARMS nil)))
    (make-thunk getclo setclo)))
(defun make-thunk-aref1 (topptr arrindex1)
  (let ((getclo (make-getter-closure :PARENT topptr :GETTERFN #'aref1
                                     :PARMS (list arrindex1)))
        (setclo (make-setter-closure :PARENT topptr :SETTERFN #'setaref1
                                     :PARMS (list arrindex1))))
    (make-thunk getclo setclo)))
(defun swap (thunk1 thunk2)
  (prog (tmp)
    (setq tmp (thunk-get thunk1))
    (thunk-set thunk1 (thunk-get thunk2))
    (thunk-set thunk2 tmp)))
;Definitions of make-getter-closure make-setter-closure make-thunk
; thunk-get thunk-set not shown because they depend on whether
; closures and thunks are implemented via tagged assoc lists or
; DEFSTRUCT structures or CLOS objects or whatever. But I made the
; call to the constructors explicit enough that it should be obvious
; what components are inside each type of object. Note that with
; CLOS objects, this could all be condensed to have a single CLOS
; object which is the thunk which has two methods GET and SET, no
; need to make closures for get and set separately, templates for
; those closures are made automatically when the CLOS class is
; defined, and closures are generated from those templates whenever
; a new CLOS thunk-object is made. Thus:
; ... (make-CLOS-thunk :PARENT topptr :GETTERFN #'aref1 :SETTERFN #'setaref1
;                      :PARMS (list arrindex1)) ...

;Example that should actually work:
(setq arr #(3 5 7 11 13))
(setq ixs (list 2 4))
(format t " arr: ~S~% ixs: ~S~%" arr ixs)
 arr: #(3 5 7 11 13)
 ixs: (2 4)
(setq thunkcadr (make-thunk-cadr ixs))
  ;Locative to the 4 in ixs
(setq thunkaref (make-thunk-aref1 arr (thunk-get thunkcadr)))
  ;Locative to the 11 in the array
(swap thunkcadr thunkaref)
(format t " arr: ~S~% ixs: ~S~%" arr ixs)
 arr: #(3 5 7 4 13)
 ixs: (2 11)
I haven't implemented this. I'm just specifying what the behaviour should be
and giving a sketch how it ought to be easily doable in Common Lisp.

And I'm not going to implement it because I have no use for this way
of coding, at least not currently or in the foreseeable future.
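
For the curious, here is one minimal way the missing pieces could be
filled in (a fill-in of mine, not the poster's code: it skips the
separate getter/setter closure objects and just stores two closures in a
cons, and it uses 0-based SVREF on a simple vector instead of the
1-based AREF1 above):

  (defun make-thunk (getter setter) (cons getter setter))
  (defun thunk-get (thunk) (funcall (car thunk)))
  (defun thunk-set (thunk val) (funcall (cdr thunk) val))

  (defun locative-to-cadr (topptr)
    ;; Locative to the second element of the list TOPPTR.
    (let ((midptr (cdr topptr)))
      (make-thunk (lambda () (car midptr))
                  (lambda (val) (rplaca midptr val)))))

  (defun locative-to-svref (vec index)
    ;; Locative to element INDEX (0-based) of the simple vector VEC.
    (make-thunk (lambda () (svref vec index))
                (lambda (val) (setf (svref vec index) val))))

  ;; With SWAP as defined above:
  ;;   (setq arr (vector 3 5 7 11 13))
  ;;   (setq ixs (list 2 4))
  ;;   (swap (locative-to-cadr ixs) (locative-to-svref arr 3))
  ;; leaves arr as #(3 5 7 4 13) and ixs as (2 11), matching the example.
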
<tmi> Generally my abstract data type is at a higher level where the
      caller doesn't know that a single place is going to need to be
      SETFed, so there's no point in getting a locative to work with.
      Instead there's some *kind* of update to do, and parameters to
      that *kind* of update; what really happens internally (one or more
      SETFs, or alternately re-build anything that changed and share what
      didn't change) doesn't need to be known by the caller. All the
      caller needs to know is generically whether the update is in-place
      or non-destructive. (If it's non-destructive, then the new edition
      of the data structure is one of the return values. If it's
      in-place, then there's no need to bother with setq of the new
      value, because my structures always have a header cell that has a
      tag for the intentional datatype, and that header cell always
      points to either the in-place-modified object or the
      latest-edition-of-object.) </tmi>
<mtmi> I'd rather program in "paranoid" mode than in "risk shoot foot" mode.
       The CAR of each ADT object is a keyword identifying the
       intentional type of that object, and every function that
       operates on that intentional type first checks if the parameter
       really does have the expected CAR, just to make sure I didn't
       copy&paste some inappropriate function name in my code. Yeah, it
       takes extra CPU cycles to do that checking on every call, but it
       sure saves me from shooting myself in the foot and having to spend
       an hour to find out how I did it before I can fix it. </mtmi>
<emtmi> TMI = Too Much Information (actually YMMV, some readers might like it)
        MTMI = More of Too Much Information
        EMTMI = Even More of Too Much Information (only newbies need read)
        Credits to Babylon Five for the "I spy" game in the cargo hold.
        I spy something that starts with the letter B.   Boxes!
        I spy something that starts with the letter M.   More boxes! </emtmi>
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <4883bc90$0$7327$607ed4bc@cv.net>
Robert Maas, http://tinyurl.com/uh3t wrote:
>>>> ... the "thunks" were necessary at the machine-language level to
>>>> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
>>> Ah, thanks for the clarification. Is that info in the appropriate
>>> WikiPedia page? If not, maybe you would edit it in?
>> From: John W Kennedy <·······@attglobal.net>
>> It is explained s.v. "thunk", which is referenced from "ALGOL
>> 60". The ALGOL "pass-by-name" argument/parameter matching was
>> perhaps the most extreme example ever of a language feature that
>> was "elegant" but insane. What it meant, in effect, was that,
>> unless otherwise marked, every argument was passed as two closures,
>> one that returned a fresh evaluation of the expression given as the
>> argument, which was called every time the parameter was read, and
>> one that set the argument to a new value, which was called every
>> time the parameter was set.
> 
> Wow! All these years when I occasionally heard of a "thunk" I never
> was told, until now, what it really meant. Thanks for the info!!
> 
> Followup question #1: I assume these are lexical closures in the
> environment of the point of the call, right?

Yes. (Actually, subprogram calls are first described as working like 
macro expansions, but then the specification adds that there must be 
magic fixups so that variable names are resolved in point-of-call 
context anyway.)

At this point in the history of computing, the conceptual distinction 
between subprograms and macros was not at all clear. It was quite 
possible to have "macros" that generated an out-of-line subprogram once 
and then generated calling code the first time and every time thereafter 
(it was a way of life on systems without linkage editors or linking 
loaders), and it was also quite possible to refer to in-line macro 
expansions as "subprograms". I suspect that the triumph of FORTRAN may 
have had something to do with cleaning up the terminological mess by the 
mid-60s.

Into the 60s, indeed, there were still machines being made that had no 
instruction comparable to the mainframe BASx/BALx family, or to Intel's 
CALL. You had to do a subprogram call by first overwriting the last 
instruction of what you were calling with a branch instruction that 
would return back to you.

> Followup question #2: For simple arithmetic expressions, I can
> possibly understand how the UPDATE closure might be implemeted
> (expressed in Lisp to make the intent clear):
>   Call form:  MyFunction(X+2);
>   GET closure:  (+ closedX 2)
>   UPDATE closure:  (lambda (newval) (setf closedX (- newval 2))
> Thus from inside MyFunction where formal parameter Arg1 is bound
> to actual parameter X+2, after doing Arg1 := 7; X will have the
> value 5 so that calling Arg1 will return 7 as expected, right?
> But if the actual argument is something complicated, especially if
> it makes a nested function call, how can that possibly be
> implemented?

It was forbidden. In the formal definition of ALGOL 60, there was no 
such thing as an external subprogram (except for non-ALGOL code, which 
was treated as magic), so the whole program was supposed to be one 
compile. Therefore, a theoretical compiler could verify that any 
parameter used as an LHS was always matched with an argument that was a 
variable. (However, a "thunk" was still required to evaluate any array 
index and, possibly, to convert between REAL and INTEGER variables.) 
Many actual compilers /did/ support separate compilation as a language 
extension -- I suppose they had to use run-time checking.

-- 
John W. Kennedy
  "Compact is becoming contract,
Man only earns and pays."
   -- Charles Williams.  "Bors to Elayne:  On the King's Coins"
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008aug12-003@yahoo.com>
John W Kennedy <·······@attglobal.net> wrote:
JWK> Into the 60s, indeed, there were still machines being made
JWK> that had no instruction comparable to the mainframe BASx/BALx
JWK> family, or to Intel's CALL. You had to do a subprogram call by
JWK> first overwriting the last instruction of what you were
JWK> calling with a branch instruction that would return back to
JWK> you.

That's not true, that you needed to do that, that there was no
other way available. The subroutine linkage I invented for S.P.S.
(Symbolic Programming System, i.e. IBM 1620 assembly language) was
to reserve a 5-digit space immediately before the subroutine entry
point for storing the return address. So the caller needed to know
only one address, the entry point, and do both store-return-address
and jump relative to that address, rather than needing to know both
the entry point and the last-instruction-JUMP-needs-patch address
as independent items of information. So calling a subroutine was
two instructions (pseudo-code here):
   literal[nextAdrOfSelf] -> memory[SubrEntryPoint-1]
   jump to SubrEntryPoint
and returning from a subroutine was two instructions:
   copy memory[SubrEntryPoint-1] -> memory[here + 11]
   jump to 00000 ;These zeroes replaced by return address just above
Of course if you needed to pass parameters and/or return value,
that was handled separately, perhaps by reserving additional
storage just before the return address. Of course this methodology
didn't support recursion.

So my method required one extra instruction per return point, but
allowed multiple return points from a single subroutine, and
allowed "encapsulation" of the relation between entry point and
return point.

Note: On IBM 1620, instructions and forward-sweeping data records
were addressed by their *first* digit, whereas arithmetic fields
were addressed by their *last* digit, the low-order position, to
support natural add-and-carry operations. Storage was decimal
digits, with two extra bits, flag to indicate negative value (if in
low-order position) or high-order-end (if in any other position),
and parity. Values larger than nine were reserved for special
purposes, such as RECORD MARK used to terminate right-sweep data
records. Because of that, the low-order position of the return
address and the first digit of the machine instruction at the
subroutine entry point differed by only one machine address, hence the
SubrEntryPoint-1 instead of SubrEntryPoint-5 you would otherwise
expect.

Hmm, I suppose if I had thought it out more at the time, I might have
done it slightly differently:

Entry point like this:
         jump 00000 ;Patched by caller to contain return address
  Entry: ...(regular code)...
          ...

Each return point like this:
         jump Entry-12


I wonder if anybody ever implemented a stack on the IBM 1620?
Probably not, because it would take a lot more machine instructions
to push and pop, and if you weren't writing anything recursive it
would be extra work for no extra benefit, except saving a few digits of
memory if your maximum stack depth is less than the total number of
subroutines you have loaded; and even then the extra instructions more
than kill off the storage savings.

Hmm, I suppose you could have an auxiliary function that serves as a
trampoline for stack-based call and return. To call, you move your
own return address and address of subroutine to fixed locations in
low memory then jump to the call trampoline, which pushes the
return address onto the stack and jumps at entry address. To
return, you just jump to the return trampoline, which pops the
return address off the stack and jumps at it. The trampoline,
occupying memory only *once*, could afford to have code to safely
check for stack over/under flow.
From: Roedy Green
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <ees3a4d8bkh0gkho1jfba05nt2m8safmmi@4ax.com>
On Tue, 12 Aug 2008 12:28:33 -0700,
··················@spamgourmet.com.remove (Robert Maas,
http://tinyurl.com/uh3t) wrote, quoted or indirectly quoted someone
who said :

>Note: On IBM 1620, instructions and forward-sweeping data records
>were addressed by their *first* digit, whereas arithmetic fields
>were addressed by their *last* digit, the low-order position, to
>support natural add-and-carry operations. Storage was decimal
>digits, with two extra bits, flag to indicate negative value (if in
>low-order position) or high-order-end (if in any other position),
>and parity.

What a memory you have to recall the precise details of work you did
45 odd years ago.
-- 

Roedy Green Canadian Mind Products
The Java Glossary
http://mindprod.com
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48a22d60$0$90273$14726298@news.sunsite.dk>
Robert Maas, http://tinyurl.com/uh3t wrote:
> John W Kennedy <·······@attglobal.net> wrote:
> JWK> Into the 60s, indeed, there were still machines being made
> JWK> that had no instruction comparable to the mainframe BASx/BALx
> JWK> family, or to Intel's CALL. You had to do a subprogram call by
> JWK> first overwriting the last instruction of what you were
> JWK> calling with a branch instruction that would return back to
> JWK> you.
> 
> That's not true, that you needed to do that, that there was no
> other way available. The subroutine linkage I invented for S.P.S.
> (Symbolic Programming System, i.e. IBM 1620 assembly language) was
> to reserve a 5-digit space immediately before the subroutine entry
> point for storing the return address. So the caller needed to know
> only one address, the entry point, and do both store-return-address
> and jump relative to that address, rather than needing to know both
> the entry point and the last-instruction-JUMP-needs-patch address
> as independent items of information.

CDC Cyber did something very similar.

Not very recursion friendly.

Arne
From: Piet van Oostrum
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <m27iaa4ugd.fsf@cs.uu.nl>
>>>>> Arne Vajhøj <····@vajhoej.dk> (AV) wrote:

>AV> Robert Maas, http://tinyurl.com/uh3t wrote:
>>> John W Kennedy <·······@attglobal.net> wrote:
>JWK> Into the 60s, indeed, there were still machines being made
>JWK> that had no instruction comparable to the mainframe BASx/BALx
>JWK> family, or to Intel's CALL. You had to do a subprogram call by
>JWK> first overwriting the last instruction of what you were
>JWK> calling with a branch instruction that would return back to
>JWK> you.
>>> 
>>> That's not true, that you needed to do that, that there was no
>>> other way available. The subroutine linkage I invented for S.P.S.
>>> (Symbolic Programming System, i.e. IBM 1620 assembly language) was
>>> to reserve a 5-digit space immediately before the subroutine entry
>>> point for storing the return address. So the caller needed to know
>>> only one address, the entry point, and do both store-return-address
>>> and jump relative to that address, rather than needing to know both
>>> the entry point and the last-instruction-JUMP-needs-patch address
>>> as independent items of information.

>AV> CDC Cyber did something very similar.

>AV> Not very recursion friendly.

Actually, the CYBER way wasn't too bad. IIRC the CYBER had a subroutine
instruction that stored the return address in the location that the
instruction referenced and then jumped to the address following that
location. To implement a recursive procedure you started the code of the
procedure with saving the return address to a stack.
-- 
Piet van Oostrum <····@cs.uu.nl>
URL: http://pietvanoostrum.com [PGP 8DAE142BE17999C4]
Private email: ····@vanoostrum.org
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48adf9d2$0$90269$14726298@news.sunsite.dk>
Piet van Oostrum wrote:
>>>>>> Arne Vajhøj <····@vajhoej.dk> (AV) wrote:
>> AV> Robert Maas, http://tinyurl.com/uh3t wrote:
>>>> John W Kennedy <·······@attglobal.net> wrote:
>> JWK> Into the 60s, indeed, there were still machines being made
>> JWK> that had no instruction comparable to the mainframe BASx/BALx
>> JWK> family, or to Intel's CALL. You had to do a subprogram call by
>> JWK> first overwriting the last instruction of what you were
>> JWK> calling with a branch instruction that would return back to
>> JWK> you.
>>>> That's not true, that you needed to do that, that there was no
>>>> other way available. The subroutine linkage I invented for S.P.S.
>>>> (Symbolic Programming System, i.e. IBM 1620 assembly language) was
>>>> to reserve a 5-digit space immediately before the subroutine entry
>>>> point for storing the return address. So the caller needed to know
>>>> only one address, the entry point, and do both store-return-address
>>>> and jump relative to that address, rather than needing to know both
>>>> the entry point and the last-instruction-JUMP-needs-patch address
>>>> as independent items of information.
> 
>> AV> CDC Cyber did something very similar.
>> AV> Not very recursion friendly.
> 
> Actually, the CYBER way wasn't too bad. IIRC the CYBER had a subroutine
> instruction that stored the return address in the location that the
> instruction referenced and then jumped to the address following that
> location. To implement a recursive procedure you started the code of the
> procedure with saving the return address to a stack.

It was of course doable.

Else Pascal would have been hard to implement.

Arne
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48a4b2ba$0$20923$607ed4bc@cv.net>
Robert Maas, http://tinyurl.com/uh3t wrote:
> John W Kennedy <·······@attglobal.net> wrote:
> JWK> Into the 60s, indeed, there were still machines being made
> JWK> that had no instruction comparable to the mainframe BASx/BALx
> JWK> family, or to Intel's CALL. You had to do a subprogram call by
> JWK> first overwriting the last instruction of what you were
> JWK> calling with a branch instruction that would return back to
> JWK> you.
> 
> That's not true, that you needed to do that, that there was no
> other way available. The subroutine linkage I invented for S.P.S.
> (Symbolic Programming System, i.e. IBM 1620 assembly language) was
> to reserve a 5-digit space immediately before the subroutine entry
> point for storing the return address. So the caller needed to know
> only one address, the entry point, and do both store-return-address
> and jump relative to that address, rather than needing to know both
> the entry point and the last-instruction-JUMP-needs-patch address
> as independent items of information. So calling a subroutine was
> two instructions (pseudo-code here):
>    literal[nextAdrOfSelf} -> memory[SubrEntryPoint-1]
>    jump to SubrEntryPoint
> and returning from a subroutine was two instructios:
>    copy memory[SubrEntryPoint-1] -> memory[here + 11]
>    jump to 00000 ;These zeroes replaced by return address just above
> Of course if you needed to pass parameters and/or return value,
> that was handled separately, perhaps by reserving additional
> storage just before the return address. Of course this methodology
> didn't support recursion.
> 
> So my method required one extra instruction per return point, but
> allowed multiple return points from a single subroutine, and
> allowed "encapsulation" of the relation between entry point and
> return point.
> 
> Note: On IBM 1620, instructions and forward-sweeping data records
> were addressed by their *first* digit, whereas arithmetic fields
> were addressed by their *last* digit, the low-order position, to
> support natural add-and-carry operations. Storage was decimal
> digits, with two extra bits, flag to indicate negative value (if in
> low-order position) or high-order-end (if in any other position),
> and parity. Values larger than nine were reserved for special
> purposes, such as RECORD MARK used to terminate right-sweep data
> records. Because of that, the low-order position of the return
> address and the first digit of the machine instruction at the
> subroutine entry point differed by only machine address, hence the
> SubrEntryPoint-1 instead of SubrEntryPoint-5 you would otherwise
> expect.
> 
> Hmm, I suppose if I had thought it out more at the time, I might have
> done it slightly differently:
> 
> Entry point like this:
>          jump 00000 ;Patched by caller to contain return address
>   Entry: ...(regular code)...
>           ...
> 
> Each return point like this:
>          jump Entry-12
> 
> 
> I wonder if anybody ever implemented a stack on the IBM 1620?
> Probably not, because it would take a lot more machine instructions
> to push and pop, and if you weren't writing anything recursive then
> extra work for no extra benefit except saving a few digits of
> memory if your maximum stack depth is less than the total number of
> subroutines you have loaded, except the extra instructions more
> than kill off the storage savings.
> 
> Hmm, I suppose you could have a auxilary function that serves as
> trampoline for stack-based call and return. To call, you move your
> own return address and address of subroutine to fixed locations in
> low memory then jump to the call trampoline, which pushes the
> return address onto the stack and jumps at entry address. To
> return, you just jump to the return trampoline, which pops the
> return address off the stack and jumps at it. The trampoline,
> occuping memory only *once*, could afford to have code to safely
> check for stack over/under flow.

Actually, I was thinking of the 1401. But both the 1620 and the 1401 
(without the optional Advanced Programming Feature) share the basic 
omission of any instruction that could do call-and-return without 
hard-coding an adcon with the address of the point to be returned to. 
(The Advanced Programming Feature added a 1401 instruction, Store 
B-address Register, that, executed as the first instruction of a 
subroutine, could store the return-to address.)

The 1620, oddly enough, /did/ have call instructions (Branch and 
Transfer, and Branch and Transfer Immediate) and a return instruction 
(Branch Back), but with a hard-wired stack depth of 1.

-- 
John W. Kennedy
  "When a man contemplates forcing his own convictions down another 
man's throat, he is contemplating both an unchristian act and an act of 
treason to the United States."
   -- Joy Davidman, "Smoke on the Mountain"
From: Martijn Lievaart
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <pan.2008.08.16.22.10.56@rtij.nl.invlalid>
On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:

> Actually, I was thinking of the 1401. But both the 1620 and the 1401
> (without the optional Advanced Programming Feature) share the basic
> omission of any instruction that could do call-and-return without
> hard-coding an adcon with the address of the point to be returned to.
> (The Advanced Programming Feature added a 1401 instruction, Store
> B-address Register, that, executed as the first instruction of a
> subroutine, could store the return-to address.)

Raaaagh!!!!

Don't. Bring. Back. Those. Nightmares. Please.

The 1401 was a decent enough processor for many industrial tasks -- at 
that time -- but for general programming it was sheer horror.

M4
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48a782e9$0$20913$607ed4bc@cv.net>
Martijn Lievaart wrote:
> On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:
> 
>> Actually, I was thinking of the 1401. But both the 1620 and the 1401
>> (without the optional Advanced Programming Feature) share the basic
>> omission of any instruction that could do call-and-return without
>> hard-coding an adcon with the address of the point to be returned to.
>> (The Advanced Programming Feature added a 1401 instruction, Store
>> B-address Register, that, executed as the first instruction of a
>> subroutine, could store the return-to address.)
> 
> Raaaagh!!!!
> 
> Don't. Bring. Back. Those. Nightmares. Please.
> 
> The 1401 was a decent enough processor for many industrial tasks -- at 
> that time -- but for general programming it was sheer horror.

But the easiest machine language /ever/.

-- 
John W. Kennedy
  "The grand art mastered the thudding hammer of Thor
And the heart of our lord Taliessin determined the war."
   -- Charles Williams.  "Mount Badon"
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g88v3j$jc8$3@localhost.localdomain>
On Sat, 16 Aug 2008 21:46:18 -0400, John W Kennedy wrote:

> Martijn Lievaart wrote:
>> On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:
>> 
>>> Actually, I was thinking of the 1401. But both the 1620 and the 1401
>>> (without the optional Advanced Programming Feature) share the basic
>>> omission of any instruction that could do call-and-return without
>>> hard-coding an adcon with the address of the point to be returned to.
>>> (The Advanced Programming Feature added a 1401 instruction, Store
>>> B-address Register, that, executed as the first instruction of a
>>> subroutine, could store the return-to address.)
>> 
>> Raaaagh!!!!
>> 
>> Don't. Bring. Back. Those. Nightmares. Please.
>> 
>> The 1401 was a decent enough processor for many industrial tasks -- at
>> that time -- but for general programming it was sheer horror.
> 
> But the easiest machine language /ever/.

What? Even easier than ICL 1900 PLAN or MC68000 assembler? That would be 
difficult to achieve.


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48a8df27$0$7362$607ed4bc@cv.net>
Martin Gregorie wrote:
> On Sat, 16 Aug 2008 21:46:18 -0400, John W Kennedy wrote:
> 
>> Martijn Lievaart wrote:
>>> On Thu, 14 Aug 2008 18:33:30 -0400, John W Kennedy wrote:
>>>
>>>> Actually, I was thinking of the 1401. But both the 1620 and the 1401
>>>> (without the optional Advanced Programming Feature) share the basic
>>>> omission of any instruction that could do call-and-return without
>>>> hard-coding an adcon with the address of the point to be returned to.
>>>> (The Advanced Programming Feature added a 1401 instruction, Store
>>>> B-address Register, that, executed as the first instruction of a
>>>> subroutine, could store the return-to address.)
>>> Raaaagh!!!!
>>>
>>> Don't. Bring. Back. Those. Nightmares. Please.
>>>
>>> The 1401 was a decent enough processor for many industrial tasks -- at
>>> that time -- but for general programming it was sheer horror.
>> But the easiest machine language /ever/.
> 
> What? Even easier than ICL 1900 PLAN or MC68000 assembler? That would be 
> difficult to achieve.

I said "machine language" and I meant it. I haven't touched a 1401 since 
1966, and haven't dealt with a 1401 emulator since 1968, but I can 
/still/ write a self-booting program. In 1960, some people still looked 
on assemblers (to say nothing of compilers) as a useless waste of 
resources that could be better applied to end-user applications, and the 
1401 was designed to be programmable in raw machine language. Even shops 
that used assembler nevertheless frequently did bug fixes as 
machine-language patches, rather than take the time to run the assembler 
again. (SPS, the non-macro basic assembler, ran at about 70 lines a 
minute, tops.)

-- 
John W. Kennedy
  "The bright critics assembled in this volume will doubtless show, in 
their sophisticated and ingenious new ways, that, just as /Pooh/ is 
suffused with humanism, our humanism itself, at this late date, has 
become full of /Pooh./"
   -- Frederick Crews.  "Postmodern Pooh", Preface
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8bflo$984$1@localhost.localdomain>
On Sun, 17 Aug 2008 22:30:35 -0400, John W Kennedy wrote:

> I said "machine language" and I meant it.
>
OK - I haven't touched that since typing ALTER commands into the console 
of a 1903 running the UDAS executive or, even better, patching the 
executive on the hand switches.

I was fascinated, though, by the designs of early assemblers: I first 
learnt Elliott assembler, which required the op codes to be typed in 
octal but used symbolic labels and variable names. Meanwhile a colleague 
had started on a KDF6, which was the opposite - op codes were mnemonics 
but all addresses were absolute and entered in octal. I always wondered 
about the rationale of the KDF6 assembler writers in tackling only the 
easy part of the job.

> Even shops that used assembler nevertheless frequently did bug fixes as
> machine-language patches, rather than take the time to run the assembler
> again. (SPS, the non-macro basic assembler, ran at about 70 lines a
> minute, tops.)
>
Even a steam powered 1901 (3.6 uS for a half-word add IIRC) running a 
tape based assembler was faster than that. It could just about keep up 
with a 300 cpm card reader.


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <2OedndEZrPbzTTHVnZ2dnUVZ_qfinZ2d@speakeasy.net>
Martin Gregorie  <······@see.sig.for.address.invalid> wrote:
+---------------
| I was fascinated, though by the designs of early assemblers: I first 
| learnt Elliott assembler, which required the op codes to be typed on 
| octal but used symbolic labels and variable names. Meanwhile a colleague 
| had started on a KDF6 which was the opposite - op codes were mnemonics 
| but all addresses were absolute and entered in octal. I always wondered 
| about the rationale of the KDF6 assembler writers in tackling only the 
| easy part of the job.
+---------------

In the LGP-30, they used hex addresses, sort of[1], but the opcodes
(all 16 of them) had single-letter mnemonics chosen so that the
low 4 bits of the character codes *were* the correct nibble for
the opcode!  ;-}

[Or you could type in the actual hex digits, since the low 4 bits
of *their* character codes were also their corresponding binary
nibble values... "but that would have been wrong".]


-Rob

[1] The LGP-30 character code was defined before the industry had
    yet standardized on a common "hex" character set, so instead of
    "0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
    were some random characters on the Flexowriter keyboard whose low
    4 bits just happened to be what we now call 0xa-0xf]. Even worse,
    the sector addresses of instructions were *not* right-justified
    in the machine word (off by one bit), plus because of the shift-
    register nature of the accumulator you lost the low bit of each
    machine word when you typed in instructions (or read them from
    tape), so the address values you used in coding went up by *4*!
    That is, machine locations were counted [*and* coded, in both
    absolute machine code & assembler] as "0", "4", "8", "j", "10",
    "14", "18", "1j" (pronounced "J-teen"!!), etc.

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: ···@netherlands.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <4fkpa4prudu3gkdt9095coho3srb52o82d@4ax.com>
On Wed, 20 Aug 2008 21:18:22 -0500, ····@rpw3.org (Rob Warnock) wrote:

>Martin Gregorie  <······@see.sig.for.address.invalid> wrote:
>+---------------
>| I was fascinated, though by the designs of early assemblers: I first 
>| learnt Elliott assembler, which required the op codes to be typed on 
>| octal but used symbolic labels and variable names. Meanwhile a colleague 
>| had started on a KDF6 which was the opposite - op codes were mnemonics 
>| but all addresses were absolute and entered in octal. I always wondered 
>| about the rationale of the KDF6 assembler writers in tackling only the 
>| easy part of the job.
>+---------------
>
>In the LGP-30, they used hex addresses, sort of[1], but the opcodes
>(all 16 of them) had single-letter mnemonics chosen so that the
>low 4 bits of the character codes *were* the correct nibble for
>the opcode!  ;-}
>
>[Or you could type in the actual hex digits, since the low 4 bits
>of *their* character codes were also their corresponding binary
>nibble values... "but that would have been wrong".]
>
>
>-Rob
>
>[1] The LGP-30 character code was defined before the industry had
>    yet standardized on a common "hex" character set, so instead of
>    "0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
>    were some random characters on the Flexowriter keyboard whose low
>    4 bits just happened to be what we now call 0xa-0xf]. Even worse,
>    the sector addresses of instructions were *not* right-justified
>    in the machine word (off by one bit), plus because of the shift-
>    register nature of the accumulator you lost the low bit of each
>    machine word when you typed in instructions (or read them from
>    tape), so the address values you used in coding went up by *4*!
>    That is, machine locations were counted [*and* coded, in both
>    absolute machine code & assembler] as "0", "4", "8", "j", "10",
>    "14", "18", "1j" (pronounced "J-teen"!!), etc.
>
>-----
>Rob Warnock			<····@rpw3.org>
>627 26th Avenue			<URL:http://rpw3.org/>
>San Mateo, CA 94403		(650)572-2607


Whats os interresting about all this hullabaloo is that nobody has
coded machine code here, and know's squat about it.

I'm not talking assembly language. Don't you know that there are routines
that program machine code? Yes, burned in, bitwise encodings that enable
machine instructions? Nothing below that.

There is nobody here, who ever visited/replied with any thought relavence that can
be brought foward to any degree, meaning anything, nobody....

sln
From: ···@netherlands.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <v2lpa49o65j228vpcec9d03d560b18rkl7@4ax.com>
On Thu, 21 Aug 2008 02:30:27 GMT, ···@netherlands.com wrote:

>On Wed, 20 Aug 2008 21:18:22 -0500, ····@rpw3.org (Rob Warnock) wrote:
>
>>[snip]
>
>
>Whats os interresting about all this hullabaloo is that nobody has
>coded machine code here, and know's squat about it.
>
>I'm not talking assembly language. Don't you know that there are routines
>that program machine code? Yes, burned in, bitwise encodings that enable
>machine instructions? Nothing below that.
>
>There is nobody here, who ever visited/replied with any thought relavence that can
>be brought foward to any degree, meaning anything, nobody....
>
>sln

At most, you're trying to validate your understanding. But you don't pose questions,
you pose terse inflammatory declarations.

You make me sick!

sln
From: Andrew Reilly
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <6h4entFioa48U1@mid.individual.net>
On Thu, 21 Aug 2008 02:36:39 +0000, sln wrote:

>>Whats os interresting about all this hullabaloo is that nobody has coded
>>machine code here, and know's squat about it.
>>
>>I'm not talking assembly language. Don't you know that there are
>>routines that program machine code? Yes, burned in, bitwise encodings
>>that enable machine instructions? Nothing below that.
>>
>>There is nobody here, who ever visited/replied with any thought
>>relavence that can be brought foward to any degree, meaning anything,
>>nobody....
>>
>>sln
> 
> At most, you're trying to validate your understanding. But you don't pose
> questions, you pose terse inflammatory declarations.
> 
> You make me sick!

Could you elaborate a little on what it is that you're upset about?  I 
suspect that there are probably quite a few readers of these posts that 
have designed and built their own processors, and coded them in their own 
machine language.  I have, and that was before FPGAs started to make that 
exercise quite commonplace.  But I don't see how that's at all relevant 
to the debate about the power or other characteristics of programming 
languages.  Certainly anyone who's programmed a machine in assembly 
language has a pretty fair understanding of what the machine and the 
machine language is doing, even though they don't choose to bang the bits 
together manually.

Hope you get better.

-- 
Andrew
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <RpadnQoEdeU56jDVnZ2dnUVZ_oninZ2d@speakeasy.net>
···@netherlands.com> wrote:
+---------------
| ····@rpw3.org (Rob Warnock) wrote:
| >In the LGP-30, they used hex addresses, sort of[1], but the opcodes
| >(all 16 of them) had single-letter mnemonics chosen so that the
| >low 4 bits of the character codes *were* the correct nibble for
| >the opcode!  ;-}
...
| >[1] The LGP-30 character code was defined before the industry had
| >    yet standardized on a common "hex" character set, so instead of
| >    "0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
| >    were some random characters on the Flexowriter keyboard whose low
| >    4 bits just happened to be what we now call 0xa-0xf]. Even worse,
| >    the sector addresses of instructions were *not* right-justified
| >    in the machine word (off by one bit), plus because of the shift-
| >    register nature of the accumulator you lost the low bit of each
| >    machine word when you typed in instructions (or read them from
| >    tape), so the address values you used in coding went up by *4*!
| >    That is, machine locations were counted [*and* coded, in both
| >    absolute machine code & assembler] as "0", "4", "8", "j", "10",
| >    "14", "18", "1j" (pronounced "J-teen"!!), etc.
| 
| Whats os interresting about all this hullabaloo is that nobody has
| coded machine code here, and know's squat about it.
+---------------

Think again! *BOTH* of the two examples I gave -- for the LGP-30 & the
IBM 1410 -- *WERE* raw machine code, *NOT* assembler!!! Please read again
what I wrote about the character codes for the instruction mnemonics
*BEING* the machine instruction codes. For the IBM 1410, the bootstrap
code that one types in:

    v         v
    L%B000012$N

*IS* raw machine code, *NOT* assembler!! [See my previous post for
the meaning of the fields of that instruction.] The IBM 1410 is a
*character* machine, not a binary machine, so each memory location is
a (6+1 bit) character. As a result, one can [and *must*, during bootup!]
type absolute machine code directly into memory using the console
typewriter [an only-slightly-modified IBM Selectric, as it happens].
The fact that -- for the sake of user convenience -- the hardware
supports manual entry of machine code into memory by the operator
in a form that closely *resembles* assembler (but *ISN'T*!) was a
deliberate design feature.

Similarly, the characters you type for the LGP-30's opcodes *ARE* the
opcodes, at least by the time they get into the machine, since when
typing in code (or reading it from paper tape) one sets the I/O system
to "4-bit input mode", in which the upper bits of the characters are
stripped off by the I/O hardware during input. Thus when you type a
"Bring" (load) instruction such as this ["Bring" (load) location 0xd7c
(track 0xd, sector 0x31)]:

    b0q7j

you're typing *RAW* machine code, *NOT* assembler!!  You see, the
lower 4 bits of character "b" -- the "mnemonic" for "Bring" -- were
binary 0001, the *same* as the lower 4 bits of the digit "1" (and two
other characters as well). So when you typed a "b" in that position
in "4-bit input mode" you were typing the same thing as the character "1"
which was the same thing as *binary* 0001 which was the absolute machine
opcode for the "Bring" instruction!!  "No assembler required" (pardon
the pun).

+---------------
| I'm not talking assembly language.
+---------------

Neither was I. The fact that for the two machines I mentioned
absolute machine code was somewhat readable to humans seems to
have confused you, but that's the way some people liked to design
their hardware back in those days -- with clever punning of character
codes with absolute machine opcodes (for the convenience of the user).

+---------------
| Don't you know that there are routines that program machine code?
+---------------

What do you mean "routines"?!? For the above two machines, you can
enter machine code with *no* programs ("routines?") running; that is,
no boot ROM is required (or even available, as it happened).

+---------------
| Yes, burned in, bitwise encodings that enable machine instructions?
+---------------

The two machines mentioned above did not *HAVE* a "boot ROM"!!
You had to *manually* type raw, absolute machine code into
them by hand every time you wanted to boot them up [at least
a little bit, just enough to load a larger bootstrap program].

+---------------
| Nothing below that.
+---------------

You're assuming that all machines *have* some sort of "boot ROM".
Before the microprocessor days, that was certainly not always
the case. The "boot ROM", or other methods of booting a machine
without manually entering at least a small amount of "shoelace"
code [enough to *load* the real bootstrap], was a fairly late
invention.


-Rob

p.s. Similarly, the DEC PDP-8 & PDP-11 were also originally booted
by manually toggling the console switches in order to deposit a few
instructions into memory, and then the starting address was toggled
in and "Start" was pushed. It was only later that a boot ROM became
available for the PDP-11 (as an expensive option!) -and only much
later still for the PDP-8 series (e.g., the MI8E for the PDP-8/E).

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8kma0$e5q$2@localhost.localdomain>
On Thu, 21 Aug 2008 09:11:48 -0500, Rob Warnock wrote:

> You're assuming that all machines *have* some sort of "boot ROM". Before
> the microprocessor days, that was certainly not always the case. The
> "boot ROM", or other methods of booting a machine without manually
> entering at least a small amount of "shoelace" code [enough the *load*
> the real bootstrap], was a fairly late invention.
>
Quite. I never knew how to boot the Elliott 503 (never got closer to the 
console than the other side of a plate glass window). However, I dealt 
with that aspect of ICL 1900s. They had ferrite core memory and NO ROM. 
When you hit Start this cleared the memory and then pulsed a wire that 
wrote the bootstrap into memory and executed it. The wire wove through 
the cores and wrote 1 bits to the right places to:
- set word 8 (the PC) to 20
- set 25 words from 20 as bootstrap instructions to boot from disk.

Then it started the CPU running. 

On the 1902 this sequence often didn't work, so a good operator knew the 
25 words by heart and would toggle them in on hand switches, set PC to 20 
and hit the GO switch.
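
Roughly, in toy Common Lisp form (placeholder words only -- the real 25
bootstrap instructions aren't reproduced here):

(defun icl1900-start (memory bootstrap)
  ;; START clears the store, the "woven" wire then deposits the
  ;; bootstrap: word 8 (the PC) gets 20, and 25 words from 20 up get
  ;; the boot-from-disk instructions.  Then the CPU is set running.
  (fill memory 0)
  (setf (aref memory 8) 20)
  (replace memory bootstrap :start1 20)
  (aref memory 8))          ; hand the PC over to the CPU

;; e.g. (icl1900-start (make-array 4096 :initial-element 0)
;;                     (make-array 25 :initial-element 0))
;; => 20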
 
 

 
> 
> -Rob
> 
> p.s. Similarly, the DEC PDP-8 & PDP-11 were also originally booted by
> manually toggling the console switches in order to deposit a few
> instructions into memory, and then the starting address was toggled in
> and "Start" was pushed. It was only later that a boot ROM became
> available for the PDP-11 (as an expensive option!) -and only much later
> still for the PDP-8 series (e.g., the MI8E for the PDP-8/E).
> 
> -----
> Rob Warnock			<····@rpw3.org>
> 627 26th Avenue			<URL:http://rpw3.org/>
> San Mateo, CA 94403		(650)572-2607



-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: ···@netherlands.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <l9gua4hv68o25l4ks0jsird8ou7gesflvl@4ax.com>
On Thu, 21 Aug 2008 09:11:48 -0500, ····@rpw3.org (Rob Warnock) wrote:

>···@netherlands.com> wrote:
>+---------------
>| ····@rpw3.org (Rob Warnock) wrote:
>| >In the LGP-30, they used hex addresses, sort of[1], but the opcodes
>| >(all 16 of them) had single-letter mnemonics chosen so that the
>| >low 4 bits of the character codes *were* the correct nibble for
>| >the opcode!  ;-}
>...
>| >[1] The LGP-30 character code was defined before the industry had
>| >    yet standardized on a common "hex" character set, so instead of
>| >    "0123456789abcdef" they used "0123456789fgjkqw". [The "fgjkqw"
>| >    were some random characters on the Flexowriter keyboard whose low
>| >    4 bits just happened to be what we now call 0xa-0xf]. Even worse,
>| >    the sector addresses of instructions were *not* right-justified
>| >    in the machine word (off by one bit), plus because of the shift-
>| >    register nature of the accumulator you lost the low bit of each
>| >    machine word when you typed in instructions (or read them from
>| >    tape), so the address values you used in coding went up by *4*!
>| >    That is, machine locations were counted [*and* coded, in both
>| >    absolute machine code & assembler] as "0", "4", "8", "j", "10",
>| >    "14", "18", "1j" (pronounced "J-teen"!!), etc.
>| 
>| Whats os interresting about all this hullabaloo is that nobody has
>| coded machine code here, and know's squat about it.
>+---------------
>
>Think again! *BOTH* of the two examples I gave -- for the LGP-30 & the
>IBM 1410 -- *WERE* raw machine code, *NOT* assembler!!! Please read again
>what I wrote about the character codes for the instruction mnemonics
>*BEING* the machine instruction codes. For the IBM 1410, the bootstrap
>code that ones types in:
>
>    v         v
>    L%B000012$N
>
>*IS* raw machine code, *NOT* assembler!!
[snip]

I don't see the distinction.
Just dissasemble it and find out.

>
>you're typing *RAW* machine code, *NOT* assembler!!  You see, the
>lower 4 bits of character "b" -- the "mnemonic" for "Bring" -- were
>binary 0001, the *same* as the lower 4 bits of the digit "1" (and two
>other characters as well). So when you typed a "b" in that position
>in "4-bit input mode" you were typing the same thing as the character "1"
>which was the same thing as *binary* 0001 which was the absolute machine
>opcode for the ""Bring" instruction!!  "No assembler required" (pardon
>the pun).
>
>+---------------
>| I'm not talking assembly language.
>+---------------
>
>Neither was I. The fact that for the two machines I mentioned
>absolute machine code was somewhat readable to humans seems to
>have confused you, but that's the way some people liked to design
>their hardware back in those days -- with clever punning of character
>codes with absolute machine opcodes (for the convenience of the user).
>
>+---------------
>| Don't you know that there are routines that program machine code?
>+---------------
>
>What do you mean "routines"?!? For the above two machines, you can
>enter machine code with *no* programs ("routines?") running; that is,
>no boot ROM is required (or even available, as it happened).
>
[snip all the bullshit]

Each op is a routine in microcode. 
That is machine code. Those op routines use machine cycles.

You hit the wall of understanding a long time ago, a wall you
never looked past.

sln
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8nhqd$6ri$2@localhost.localdomain>
On Fri, 22 Aug 2008 22:56:09 +0000, sln wrote:

> On Thu, 21 Aug 2008 09:11:48 -0500, ····@rpw3.org (Rob Warnock) wrote:
> 
>>···@netherlands.com> wrote:
>>*IS* raw machine code, *NOT* assembler!!
> [snip]
> 
> I don't see the distinction.
> Just dissasemble it and find out.
>
There's a 1:1 relationship between machine code and assembler. 
Unless it's a macro-assembler, of course!
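
(A toy illustration of that 1:1 mapping, with made-up opcodes --
DISASSEMBLE* named with a star only to keep clear of the standard
CL:DISASSEMBLE:)

(defparameter *opcodes* '((:load . 1) (:add . 2) (:store . 3) (:jump . 4)))

(defun assemble (mnemonic)
  (cdr (assoc mnemonic *opcodes*)))

(defun disassemble* (opcode)
  (car (rassoc opcode *opcodes*)))

;; (assemble :add)   => 2
;; (disassemble* 2)  => :ADD
;; Each mnemonic names exactly one opcode and vice versa, which is all
;; a non-macro assembler adds on top of the raw numbers.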
 
> 
> Each op is a routine in microcode.
> That is machine code. Those op routines use machine cycles.
>
Not necessarily. An awful lot of CPU cycles were used before microcode 
was introduced. Mainframes and minis designed before about 1970 didn't 
use or need it and I'm pretty sure that there was no microcode in the 
original 8/16 bit microprocessors either (6800, 6809, 6502, 8080, 8086, 
Z80 and friends).

The number of clock cycles per instruction isn't a guide either. The only 
processors I know that got close to 1 cycle/instruction were all RISC, 
all used large lumps of microcode and were heavily pipelined.

By contrast the ICL 1900 series (3rd generation mainframe, no microcode, 
no pipeline, 24 bit word) averaged 3 clock cycles per instruction. 
Motorola 6800 and 6809 (no microcode or pipelines either, 1 byte fetch) 
average 4 - 5 cycles/instruction.


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: Paul Wallich
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8nqgr$1t8$1@reader1.panix.com>
Martin Gregorie wrote:
> On Fri, 22 Aug 2008 22:56:09 +0000, sln wrote:
> 
>> On Thu, 21 Aug 2008 09:11:48 -0500, ····@rpw3.org (Rob Warnock) wrote:
>>
>>> ···@netherlands.com> wrote:
>>> *IS* raw machine code, *NOT* assembler!!
>> [snip]
>>
>> I don't see the distinction.
>> Just dissasemble it and find out.
>>
> There's a 1:1 relationship between machine code and assembler. 
> Unless its a macro-assembler, of course!
>  
>> Each op is a routine in microcode.
>> That is machine code. Those op routines use machine cycles.
>>
> Not necessarily. An awful lot of CPU cycles were used before microcode 
> was introduced. Mainframes and minis designed before about 1970 didn't 
> use or need it and I'm pretty sure that there was no microcode in the 
> original 8/16 bit microprocessors either (6800, 6809, 6502, 8080, 8086, 
> Z80 and friends).
> 
> The number of clock cycles per instruction isn't a guide either. The only 
> processors I know that got close to 1 cycle/instruction were all RISC, 
> all used large lumps of microcode and were heavily pipelined.
> 
> By contrast the ICL 1900 series (3rd generation mainframe, no microcode, 
> no pipeline, 24 bit word) averaged 3 clock cycles per instruction. 
> Motorola 6800 and 6809 (no microcode or pipelines either, 1 byte fetch) 
> average 4 - 5 cycles/instruction.

One problem with this discussion is that the term "microcode" isn't 
really well-defined. There's the vertical kind, the horizontal kind, 
with and without internal control-flow constructs, and then there are 
various levels of visibility to the user -- see e.g. the pdp-8 manual, 
where "microcoding" is used to mean piling the bits for a bunch of 
instructions together in the same memory location, which works fine as 
long as the instructions in question don't use conflicting sets of bits.
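
(A toy version of that bit-combining sense of the word, with invented
bit assignments rather than the real PDP-8 ones:)

(defconstant +clear-acc+  #o0200)   ; invented bit assignments,
(defconstant +clear-link+ #o0100)   ; not the real PDP-8 ones
(defconstant +increment+  #o0001)

(defun combine-ops (&rest ops)
  ;; Fine as long as no two ops claim the same bits.
  (apply #'logior ops))

;; (format nil "~4,'0O" (combine-ops +clear-acc+ +clear-link+)) => "0300"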

paul
From: Arne Vajhøj
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48af72c1$0$90264$14726298@news.sunsite.dk>
Paul Wallich wrote:
> Martin Gregorie wrote:
>> On Fri, 22 Aug 2008 22:56:09 +0000, sln wrote:
>>> On Thu, 21 Aug 2008 09:11:48 -0500, ····@rpw3.org (Rob Warnock) wrote:
>>>> ···@netherlands.com> wrote:
>>>> *IS* raw machine code, *NOT* assembler!!
>>> [snip]
>>>
>>> I don't see the distinction.
>>> Just dissasemble it and find out.
>>>
>> There's a 1:1 relationship between machine code and assembler. Unless 
>> its a macro-assembler, of course!
>>  
>>> Each op is a routine in microcode.
>>> That is machine code. Those op routines use machine cycles.
>>>
>> Not necessarily. An awful lot of CPU cycles were used before microcode 
>> was introduced. Mainframes and minis designed before about 1970 didn't 
>> use or need it and I'm pretty sure that there was no microcode in the 
>> original 8/16 bit microprocessors either (6800, 6809, 6502, 8080, 
>> 8086, Z80 and friends).
>>
>> The number of clock cycles per instruction isn't a guide either. The 
>> only processors I know that got close to 1 cycle/instruction were all 
>> RISC, all used large lumps of microcode and were heavily pipelined.
>>
>> By contrast the ICL 1900 series (3rd generation mainframe, no 
>> microcode, no pipeline, 24 bit word) averaged 3 clock cycles per 
>> instruction. Motorola 6800 and 6809 (no microcode or pipelines either, 
>> 1 byte fetch) average 4 - 5 cycles/instruction.
> 
> One problem with this discussion is that the term "microcode" isn't 
> really well-defined. There's the vertical kind, the horizontal kind, 
> with and without internal control-flow constructs, and then there are 
> various levels of visibility to the user -- see e.g. the pdp-8 manual, 
> where "microcoding" is used to mean piling the bits for a bunch of 
> instructions together in the same memory location, which works fine as 
> long as the instructions in question don't use conflicting sets of bits.

I thought microcode was relatively well defined as being the software
used to implement instructions that were not fully implemented in
hardware.

http://en.wikipedia.org/wiki/Microcode does not make me think otherwise.

Arne
From: ···@netherlands.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <q4jua4hchbccbn75scph8tkrjp7mu3423t@4ax.com>
On Fri, 22 Aug 2008 23:23:57 +0000 (UTC), Martin Gregorie <······@see.sig.for.address.invalid> wrote:

>On Fri, 22 Aug 2008 22:56:09 +0000, sln wrote:
>
>> On Thu, 21 Aug 2008 09:11:48 -0500, ····@rpw3.org (Rob Warnock) wrote:
>> 
>>>···@netherlands.com> wrote:
>>>*IS* raw machine code, *NOT* assembler!!
>> [snip]
>> 
>> I don't see the distinction.
>> Just dissasemble it and find out.
>>
>There's a 1:1 relationship between machine code and assembler. 
>Unless its a macro-assembler, of course!
> 
>> 
>> Each op is a routine in microcode.
>> That is machine code. Those op routines use machine cycles.
>>
>Not necessarily. An awful lot of CPU cycles were used before microcode 
>was introduced. Mainframes and minis designed before about 1970 didn't 
>use or need it and I'm pretty sure that there was no microcode in the 
>original 8/16 bit microprocessors either (6800, 6809, 6502, 8080, 8086, 
>Z80 and friends).
>
>The number of clock cycles per instruction isn't a guide either. The only 
>processors I know that got close to 1 cycle/instruction were all RISC, 
>all used large lumps of microcode and were heavily pipelined.
>
>By contrast the ICL 1900 series (3rd generation mainframe, no microcode, 
>no pipeline, 24 bit word) averaged 3 clock cycles per instruction. 
>Motorola 6800 and 6809 (no microcode or pipelines either, 1 byte fetch) 
>average 4 - 5 cycles/instruction.

Surely you have caved to intelligence. And there is nothing beyond op.
What has the friggin world come to!!!


sln
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48af8cc4$0$20919$607ed4bc@cv.net>
Martin Gregorie wrote:
> Not necessarily. An awful lot of CPU cycles were used before microcode 
> was introduced. Mainframes and minis designed before about 1970 didn't 
> use or need it

No, most S/360s used microcode.

-- 
John W. Kennedy
  "There are those who argue that everything breaks even in this old 
dump of a world of ours. I suppose these ginks who argue that way hold 
that because the rich man gets ice in the summer and the poor man gets 
it in the winter things are breaking even for both. Maybe so, but I'll 
swear I can't see it that way."
   -- The last words of Bat Masterson
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8oq91$gq7$2@localhost.localdomain>
On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:

> Martin Gregorie wrote:
>> Not necessarily. An awful lot of CPU cycles were used before microcode
>> was introduced. Mainframes and minis designed before about 1970 didn't
>> use or need it
> 
> No, most S/360s used microcode.

I never used an S/360.

I thought microcode came into the IBM world with S/370 and Future Series 
(which later reappeared as the AS/400, which I did use). Didn't the S/370 
load its microcode off an 8 inch floppy?


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48b0b81f$0$7353$607ed4bc@cv.net>
Martin Gregorie wrote:
> On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:
> 
>> Martin Gregorie wrote:
>>> Not necessarily. An awful lot of CPU cycles were used before microcode
>>> was introduced. Mainframes and minis designed before about 1970 didn't
>>> use or need it
>> No, most S/360s used microcode.
> 
> I never used an S/360.
> 
> I thought microcode came into the IBM world with S/370 and Future Series 
> (which later reappeared as the AS/400, which I did use). Didn't the S/370 
> load its microcode off an 8 inch floppy?

Some did, but not all. The 370/145 was the first, and made a big splash 
thereby.

As to the 360s:

  20  (Incompatible subset)      I don't know
  22  (Recycled end-of-life 30)  CROS
  25                             Loaded from punched cards
  30                             CROS
  40                             TROS
  44  (Subset)                   None
  50                             CROS
  60, 62, 65                     ROS
  64, 66, 67                     ROS
  70, 75                         None
  85                             I don't know
  91, 95                         I don't know -- probably none
  195                            I don't know

CROS used plastic-coated foil punched cards as the dielectrics of 960 
capacitors each.

TROS used little transformer coils that might or might not be severed.

ROS means it was there, but I don't know the technology.
-- 
John W. Kennedy
  "Those in the seat of power oft forget their failings and seek only 
the obeisance of others!  Thus is bad government born!  Hold in your 
heart that you and the people are one, human beings all, and good 
government shall arise of its own accord!  Such is the path of virtue!"
   -- Kazuo Koike.  "Lone Wolf and Cub:  Thirteen Strings" (tr. Dana Lewis)
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g8rk4h$amj$1@localhost.localdomain>
On Sat, 23 Aug 2008 21:22:05 -0400, John W Kennedy wrote:

> Martin Gregorie wrote:
>> On Sat, 23 Aug 2008 00:06:28 -0400, John W Kennedy wrote:
>> 
>>> Martin Gregorie wrote:
>>>> Not necessarily. An awful lot of CPU cycles were used before
>>>> microcode was introduced. Mainframes and minis designed before about
>>>> 1970 didn't use or need it
>>> No, most S/360s used microcode.
>> 
>> I never used an S/360.
>> 
>> I thought microcode came into the IBM world with S/370 and Future
>> Series (which later reappeared as the AS/400, which I did use). Didn't
>> the S/370 load its microcode off an 8 inch floppy?
> 
> Some did, but not all. The 370/145 was the first, and made a big splash
> thereby.
>
..snip...

Thanks for that. As I said, during most of that era I was using ICL kit. 
Microcode was never mentioned in the 1900 context. However, they had a 
very rough approximation called extracodes, though they were closer to 
software traps than microcode: if hardware didn't implement an op code 
the OS intercepted it and ran equivalent code. This was used for i/o 
operations and for FP instructions on boxes that didn't have FP hardware.
As a result all boxes executed the same instruction set. Some opcodes 
might be very slow on some hardware but it would execute.

The 2900 series had huge amounts of microcode - it even defined both 
memory mapping and opcodes. You could run 1900 code (24 bit words, fixed 
length instructions, ISO character codes) simultaneously with 'native' 
code (8 bit bytes, variable-length instructions, EBCDIC), with each 
program running under its usual OS (George 3 for 1900, VME/B for 2900). 

The only other systems I'm aware of that could do this were the big 
Burroughs boxes (6700?), which used a byte-based VM for COBOL and a 
word-based VM for FORTRAN and Algol 60, and the IBM AS/400 (OS/400 could 
run S/34 code alongside S/38 and AS/400 code). AFAICT Intel 
virtualisation doesn't do this - all code running under VMware or any of 
the other VMs is still running in a standard Intel environment.


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <s2lta4lorjk49dn3joo3gj3juc8rfqqinf@4ax.com>
On Thu, 21 Aug 2008 02:30:27 GMT, ···@netherlands.com wrote:

>On Wed, 20 Aug 2008 21:18:22 -0500, ····@rpw3.org (Rob Warnock) wrote:
>
>>[snip]
>
>
>Whats os interresting about all this hullabaloo is that nobody has
>coded machine code here, and know's squat about it.

A friend of mine had an early 8080 micro that was programmed through
the front panel using knife switches ... toggle in the binary address,
latch it, toggle in the binary data, latch it, repeat ad nauseam.  It
had no storage device initially ... to use it you had to input your
program by hand every time you turned it on.

I did a little bit of programming on it, but I tired of it quickly.
As did my friend - once he got the tape storage working (a new prom)
and got hold of a shareware assembler, we hardly ever touched the
switch panel again.  Then came CP/M and all the initial pain was
forgotten (replaced by CP/M pain 8-).


>I'm not talking assembly language. Don't you know that there are routines
>that program machine code? Yes, burned in, bitwise encodings that enable
>machine instructions? Nothing below that.
>
>There is nobody here, who ever visited/replied with any thought relavence that can
>be brought foward to any degree, meaning anything, nobody....

What are you looking for?  An emulator you can play with?  

Machine coding is not relevant anymore - it's completely infeasible to
input all but the smallest program.  My friend had a BASIC interpreter
for his 8080 - about 2KB which took hours to input by hand and heaven
help you if you screwed up or the computer crashed.

>sln

George
From: ···@netherlands.com
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <o7hua49gt4l4jfi2mv6lgg3njguc0476tj@4ax.com>
On Fri, 22 Aug 2008 11:11:09 -0400, George Neuner <········@comcast.net> wrote:

>On Thu, 21 Aug 2008 02:30:27 GMT, ···@netherlands.com wrote:
>
>>On Wed, 20 Aug 2008 21:18:22 -0500, ····@rpw3.org (Rob Warnock) wrote:
>>
>>>[snip]
>>
>>
>>Whats os interresting about all this hullabaloo is that nobody has
>>coded machine code here, and know's squat about it.
>
>A friend of mine had an early 8080 micros that was programmed through
>the front panel using knife switches ... toggle in the binary address,
>latch it, toggle in the binary data, latch it, repeat ad nauseam.  It
>had no storage device initially ... to use it you had to input your
>program by hand every time you turned it on.
>
>I did a little bit of programming on it, but I tired of it quickly.
>As did my friend - once he got the tape storage working (a new prom)
>and got hold of a shareware assembler, we hardly ever touched the
>switch panel again.  Then came CP/M and all the initial pain was
>forgotten (replaced by CP/M pain 8-).
>
>
>>I'm not talking assembly language. Don't you know that there are routines
>>that program machine code? Yes, burned in, bitwise encodings that enable
>>machine instructions? Nothing below that.
>>
>>There is nobody here, who ever visited/replied with any thought relavence that can
>>be brought foward to any degree, meaning anything, nobody....
>
>What are you looking for?  An emulator you can play with?  
>
>Machine coding is not relevant anymore - it's completely infeasible to
>input all but the smallest program.  My friend had a BASIC interpreter
>for his 8080 - about 2KB which took hours to input by hand and heaven
>help you if you screwed up or the computer crashed.
>
>>sln
>
>George

I'm not looking for anything didn't you know?

I guess I wasn't talking assembly language, I was talking machine language.
The language behind the op codes, the one that takes cycles per 'op'eration.

It is entirely disgusting to listen to a litany of old-time has-beens who
did or did not work on old-time CPUs that did or did not have an accumulator.

I worked with many CPUs at the assembly level; hell, my first endeavour was
writing disassemblers, like Zilog's, and pass-throughs to gain a whole different
instruction set.

I moved on, but before I did, I fully understood everything I touched.
I've programmed over 14 million lines of code in my lifetime, but who gives
a rat's ass.

CPUs aren't complicated, far from it. It's having to eat the dubious constructs
of higher-level languages built on those platforms.

Get your head out of the ground!


sln
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008sep01-004@yahoo.com>
> From: George Neuner <········@comcast.net>
> A friend of mine had an early 8080 micros that was programmed
> through the front panel using knife switches

When you say "knife switches", do you mean the kind that are shaped
like flat paddles? I think that would be the IMSAI, which came
after the ALTAIR. Those flat paddle-shaped switched would
presumably be much more comfortable on the fingers than the
standard metal toggle switches on the ALTAIR. I had an ALTAIR,
which made front-panel programming rather painful on the fingers.

> ... toggle in the binary address, latch it, toggle in the binary
> data, latch it, repeat ad nauseam.

For manually loading software in sequential locations, you have to
enter the binary address only once. After that, you press the
EXAMINE-NEXT toggle to increment the address by one. That reduced
the effort by nearly a factor of 3. Instead of toggling a 16-bit
address and EXAMINE and 8-bit datum and STORE each time, you toggle
just EXAMINE-NEXT and 8-bit datum and STORE for each consecutive
memory location after the first. Note the address and data toggles
are bistable, stay in whatever position you left them in, whereas
the three control toggles (EXAMINE EXAMINE-NEXT STORE) are spring
loaded, making momentary contact when you force them then spring
back to inactive when you release them.
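
A toy Common Lisp model of that panel shows why EXAMINE-NEXT saves so
much toggling (all made up, obviously -- the real panel is switches,
not function calls):

(defparameter *memory*  (make-array 256 :initial-element 0))
(defparameter *address* 0)

(defun examine (addr)  (setf *address* addr) (aref *memory* *address*))
(defun examine-next () (incf *address*)      (aref *memory* *address*))
(defun store (datum)   (setf (aref *memory* *address*) datum))

;; First location: toggle the full address, EXAMINE, toggle the datum, STORE.
;; Every following location: just EXAMINE-NEXT, toggle the datum, STORE.
;; (examine #x0010) (store #x3e)
;; (examine-next)   (store #x01)
;; (examine-next)   (store #x76)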

> It had no storage device initially ... to use it you had to input
> your program by hand every time you turned it on.

Almost but not quite true. With static RAM, most of the data can
survive power-downs of hours or even days. I had 28k bytes of
static RAM on my ALTAIR, so if I needed to start it up after it had
been shut down I'd toggle in the starting address by hand, EXAMINE
that, compare what shows with what's on my printed listing, and if
it matches just EXAMINE-NEXT to compare the next. In the few cases
I saw a bit or two flipped, I'd re-enter that one byte of data.

I had to do that only for my BOOT1 loader, which was hand-loaded
from front panel and took text input in 3n+1 form from Beehive 3A
terminal connected to serial port, maybe also for BOOT2 loader
which had been loaded in 3n+1 form and took input in hexadecimal
form, and maybe also for BOOT3 loader which had been loaded in
hexadecimal form from Beehive and automatically downloaded the
next bootstrap loader from modem. If only a few bytes (of BOOT2 or
BOOT3) had been damaged by days of power down, comparing binary
against listing to verify it's 99% correct, and then manually
patching just one or two bytes, would be faster and safer than
manually entering 3n+1 or hexadecimal from keyboard. But once BOOT3
was loaded, I always downloaded all the rest of the software from
the PDP-10 over the modem.

> I did a little bit of programming on it, but I tired of it quickly.

Did your friend's machine have two serial ports, one for local
terminal and one for modem, and did you have access to a remote
PDP-10 or other mainframe for running a cross-assembler? Or did you
have some other computer locally available, where you could use
that other computer both to store your library of code and to
perform automated file transfer from archive on other computer to
where it's needed on IMSAI?

> As did my friend - once he got the tape storage working (a new
> prom)

Yeah, I never had the money to buy that, and with the PDP-10
available for both cross-assembling and archiving/downloading, I
didn't need it.

> Machine coding is not relevant anymore - it's completely
> infeasible to input all but the smallest program.

That's not totally true. For some educational purposes, like
*really* understanding pointers (not the kind in C so much as the
kind that are inherent in all pointer-linked data structures such
as linked lists and binary search trees etc.), it helps to have
some "hands-on" experience writing and executing machine-language
code, in a sense actually "seeing" a register first point to the
first byte of a CONS cell then by indexing with offset pointing
*through* the *second* pointer of that CONS cell to whatever the
CDR points to. Then "mission impossible" when your instructor tells
you to see if there's a way to reverse that process, whereby you
are given the register pointing to whatever the CDR points to, and
you are supposed to find the address of the original CONS cell.
Instant enlightenment, no lecture/sermon needed!
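
A toy model along those lines -- memory as a plain vector of words, a
CONS cell as two adjacent words, addresses made up -- is enough to make
the point:

(defparameter *mem* (make-array 64 :initial-element 0))

(defun raw-cons (car-word cdr-word address)
  ;; Build a cell at ADDRESS and hand back a "register" pointing at it.
  (setf (aref *mem* address)       car-word
        (aref *mem* (+ address 1)) cdr-word)
  address)

(defun raw-car (cell) (aref *mem* cell))         ; word at the pointer
(defun raw-cdr (cell) (aref *mem* (+ cell 1)))   ; word at pointer + offset

;; (let ((cell (raw-cons 42 99 10)))
;;   (list (raw-car cell) (raw-cdr cell)))   => (42 99)
;;
;; The "mission impossible" part: given only the thing the CDR points
;; at, nothing stored in memory points back at the cell itself, so
;; there is no general way to recover the cell's address.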
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g9hlbf$liu$1@localhost.localdomain>
On Mon, 01 Sep 2008 12:04:05 -0700, Robert Maas, http://tinyurl.com/uh3t
wrote:

>> From: George Neuner <········@comcast.net> A friend of mine had an
>> early 8080 micros that was programmed through the front panel using
>> knife switches
> 
> When you say "knife switches", do you mean the kind that are shaped like
> flat paddles? 
>
Pedantic correction:

"Knife switch" is the wrong term. These are high current switches, 
typically used in the sort of heavy duty circuit where the wiring hums 
when power is on or in school electrical circuits so even the back of the 
class can see whether the switch is open or closed. In these a copper 
'blade' closes the contact by being pushed down into a 
narrow, sprung U terminal that makes a close contact with both sides of 
the blade. Like this: http://www.science-city.com/knifeswitch.html

What you're talking about is a flat handle on an SPST or DPST toggle switch. It 
is often called a paddle switch and mounted with the flats on the handle 
horizontal. Like this, but often with a longer handle: 
http://www.pixmania.co.uk/uk/uk/1382717/art/radioshack/spdt-panel-mount-
paddle-s.html


-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: George Neuner
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <m43pb4ld0h7i83ko7l61sa3rl0f9iqs4ba@4ax.com>
On Mon, 1 Sep 2008 21:03:44 +0000 (UTC), Martin Gregorie
<······@see.sig.for.address.invalid> wrote:

>On Mon, 01 Sep 2008 12:04:05 -0700, Robert Maas, http://tinyurl.com/uh3t
>wrote:
>
>>> From: George Neuner <········@comcast.net> A friend of mine had an
>>> early 8080 micros that was programmed through the front panel using
>>> knife switches
>> 
>> When you say "knife switches", do you mean the kind that are shaped like
>> flat paddles? 
>>
>Pedantic correction:
>
>"Knife switch" is the wrong term. These are high current switches, 
>typically used in the sort of heavy duty circuit where the wiring hums 
>when power is on or in school electrical circuits so even the back of the 
>class can see whether the switch is open or closed. In these a copper 
>'blade' closes the contact by being pushed down into a 
>narrow, sprung U terminal that makes a close contact with both sides of 
>the blade. Like this: http://www.science-city.com/knifeswitch.html
>
>What you're talking is a flat handle on a SPST or DPST toggle switch. It 
>is often called a paddle switch and mounted with the flats on the handle 
>horizontal. Like this, but often with a longer handle: 
>http://www.pixmania.co.uk/uk/uk/1382717/art/radioshack/spdt-panel-mount-
>paddle-s.html

I don't know the correct term, but what I was talking about was a tiny
switch with a 1/2 inch metal handle that looks like a longish grain of
rice.  We used to call them "knife" switches because after hours
flipping them they would feel like they were cutting into your
fingers.

George
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g9j48h$5nj$2@localhost.localdomain>
On Mon, 01 Sep 2008 20:48:23 -0400, George Neuner wrote:

> I don't know the correct term, but what I was talking about was a tiny
> switch with a 1/2 inch metal handle that looks like a longish grain of
> rice.  We used to call them "knife" switches because after hours
> flipping them they would feel like they were cutting into your fingers.
>
That sounds like a sub-miniature SPDT toggle switch with a normal handle. 
Cheap as chips, which is probably why they were used on that front panel. 
Like this by any chance?

http://www.maplin.co.uk/images/300/fh00a_ff70m.jpg




-- 
······@   | Martin Gregorie
gregorie. | Essex, UK
org       |
From: RedGrittyBrick
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48bd1141$0$26092$db0fefd9@news.zen.co.uk>
George Neuner wrote:
> On Mon, 1 Sep 2008 21:03:44 +0000 (UTC), Martin Gregorie
> <······@see.sig.for.address.invalid> wrote:
> 
>> On Mon, 01 Sep 2008 12:04:05 -0700, Robert Maas, http://tinyurl.com/uh3t
>> wrote:
>>
>>>> From: George Neuner <········@comcast.net> A friend of mine had an
>>>> early 8080 micros that was programmed through the front panel using
>>>> knife switches
>>> When you say "knife switches", do you mean the kind that are shaped like
>>> flat paddles? 
>>>
>> Pedantic correction:
>>
>> "Knife switch" is the wrong term. These are high current switches, 
>> typically used in the sort of heavy duty circuit where the wiring hums 
>> when power is on or in school electrical circuits so even the back of the 
>> class can see whether the switch is open or closed. In these a copper 
>> 'blade' closes the contact by being pushed down into a 
>> narrow, sprung U terminal that makes a close contact with both sides of 
>> the blade. Like this: http://www.science-city.com/knifeswitch.html
>>
>> What you're talking is a flat handle on a SPST or DPST toggle switch. It 
>> is often called a paddle switch and mounted with the flats on the handle 
>> horizontal. Like this, but often with a longer handle: 
>> http://www.pixmania.co.uk/uk/uk/1382717/art/radioshack/spdt-panel-mount-
>> paddle-s.html
> 
> I don't know the correct term, but what I was talking about was a tiny
> switch with a 1/2 inch metal handle that looks like a longish grain of
> rice.  We used to call them "knife" switches because after hours
> flipping them they would feel like they were cutting into your
> fingers.
> 

That must be a toggle switch (as MG suggested) just not the paddle type.

e.g.
<http://cpc.farnell.com/SW02861/components-spares/product.us0?sku=multicomp-1m31t1b1m1qe>
<http://tinyurl.com/64a8ld>


-- 
RGB
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008sep01-003@yahoo.com>
> From: ····@rpw3.org (Rob Warnock)
> In the LGP-30, they used hex addresses, sort of[1], but the
> opcodes (all 16 of them) had single-letter mnemonics chosen so that
> the low 4 bits of the character codes *were* the correct nibble for
> the opcode!  ;-}

That's a fascinating design constraint! It would be an interesting
puzzle to find the most efficient design whereby:
- The single-character mnemonics are as easy to memorize as possible;
- The instructions produce as efficient code as possible;
- The mnemonics really do accurately express what the instruction does;
- Minimize total number of instructions needed, maybe fewer than 16;
- With the low-order-four-bits rule of course.
- See also the avoid-ambiguous-sound criterion later below.

By the way, do you remember exactly all the 16 opcodes, or have a
Web reference available?

> [Or you could type in the actual hex digits, since the low 4 bits
>  of *their* character codes were also their corresponding binary
>  nibble values... "but that would have been wrong".]

More so because some of the sounds would be ambiguous, which I
recognized when I first started to use the IBM hexadecimal
standard. See below.

> The LGP-30 character code was defined before the industry had
> yet standardized on a common "hex" [sic, "hexadecimal", base 16
> not base 6!!!] character set,

That is, before IBM had decided to use hexadecimal in the abend coredumps
from their System/360 and, by their 800-pound-gorilla status,
got everyone else to use that ABCDEF system.

> they used "0123456789fgjkqw".

That doesn't make sense. The low-order four bits of those letters
aren't consecutive ascending values from 9+1 to 9+6. Did you make a
typo, or did you explain something wrong?

(map 'list #'char-code "0123456789fgjkqw")
=> (48 49 50 51 52 53 54 55 56 57 102 103 106 107 113 119)
(loop for n in * collect (rem n 16))
=> (0 1 2 3 4 5 6 7 8 9 6 7 10 11 1 7)
Now if you used this sequence of letters instead:
(map 'list #'char-code "0123456789jklmno")
=> (48 49 50 51 52 53 54 55 56 57 106 107 108 109 110 111)
(loop for n in * collect (rem n 16))
=> (0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15)
Unfortunately o looks like 0, and l looks like 1.

Anyway, what I hated about IBM's hexadecimal notation was:
- A0 would be pronounced "Ay-tee", which sounds too much like "Eighty".
- 1A would be pronounced "Ay-teen", which sounds too much like "Eighteen".
- On the lineprinters where we get our abend listings, the capital
   D and digit 0 (which didn't have any diagonal slash) looked almost
   identical when the ribbon was going bad, as it always was.
- Likewise B and 8 looked nearly identical.
- Likewise E and F often looked nearly identical if the lower part of E
   was hitting a bad part of the ribbon.

Now for single-character mnemonics for four bits of instruction
opcode, to avoid any two characters that look too similar:
   0 1 2 3 4 5 6 7 8 9
     A B C D E F G H I J K L M N O
   P Q R S T U V W X Y Z
We obviously have no choice for KLMNO, and because of look-alikes we
can't use 0, D, or Q (they're too easily confused with O), so we have
to use P instead of 0. Our choices now look like:
     1 2 3 4 5 6 7 8 9
     A B C   E F G H I J K L M N O
   P   R S T U V W X Y Z
If we have an opcode that sets a register to 1, or clears
register#1, we might use mnemonic "1" for that instruction, but
otherwise we must avoid using digits, use letters only.
     1
     A B C   E F G H I J K L M N O
   P   R S T U V W X Y Z
We can't use both U and V, and we can't use both E and F, and we
can't use both 1 and I, but we're stuck using both M and N, sigh.
At this point I can't decide which branches of the search to
discard and which to fix, so I'll stop analysing this puzzle.
(With lower-case characters different combinations were mutually
 exclusive, such as l and 1, but I was doing upper case here.)
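
For instance, a quick check (ASCII assumed, as in the CHAR-CODE
experiment earlier in this message -- the LGP-30's own Flexowriter code
is a different story) shows what the low-order-four-bits rule leaves
for each nibble value:

  (loop for nib from 0 below 16
        collect (cons nib
                      (loop for ch across "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                            when (= nib (rem (char-code ch) 16))
                            collect ch)))
  ;; nibble 10 offers only J or Z, and nibbles 11..15 come out as
  ;; (11 #\K) (12 #\L) (13 #\M) (14 #\N) (15 #\O), so K L M N O really
  ;; are forced.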

If punctuation is allowed, then + - * / = < > would make dandy
mnemonic opcodes for the obvious instructions.
If characters that look like arrows can be used for push and pop,
then we have V for push and ^ for pop.
If characters that look like arrows can be used for moving
left/right in RAM, or shifting bits in a register, then we have <
for left and > for right.

I wonder if solving this puzzle will yield yet another esoteric
programming language?
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <VeOdnUJG6cODVybVnZ2dnUVZ_qjinZ2d@speakeasy.net>
Robert Maas, <··················@spamgourmet.com.remove> wrote:
+---------------
| > From: ····@rpw3.org (Rob Warnock)
| > In the LGP-30, they used hex addresses, sort of[1], but the
| > opcodes (all 16 of them) had single-letter mnemonics chosen so that
| > the low 4 bits of the character codes *were* the correct nibble for
| > the opcode!  ;-}
...
| By the way, do you remember exactly all the 16 opcodes, or have a
| Web reference available?
+---------------

There are quite a few out there; here's one:

    http://www.users.nwark.com/~rcmahq/jclark/lgp30.htm

Some notes on the opcodes:

- "Bring" (B) would be called "Load" in current parlance.

- "Extract" (E) would be called "And" in current parlance.

- "Unconditional Transfer" (U) would be called "Jump" or "Branch".

- "Hold" (H) would be called "Store"; "Clear" (C) would be called
  "Store-and-Clear" (the "DCA" instruction of the DEC PDP-8).

- There are two "Multiply" ops, "M" & "N", depending on whether the
  upper or lower bits (respectively) of the product are returned.

- "Store Address" (Y) stores an address in the AC into the address
  field of the destination [yes, modifying the code in-place!! --
  it was mostly used for indexing through arrays], whereas the
  "Return Address" (R) stores the current instruction address + 2
  into the address field of the destination. The subroutine calling
  sequence is thus an "R/U" pair:

	      R foo_last
	      U foo

  where the FOO subroutine might be written as:

    foo:      ...
              ...
    foo_last: U foo_last     ; overwritten

+---------------
| > The LGP-30 character code was defined before the industry had
| > yet standardized on a common "hex" [sic, "hexadecimal", base 16
| > not base 6!!!] character set,
...
| > they used "0123456789fgjkqw".
| 
| That doesn't make sense. The low-order four bits of those letters
| aren't consecutive ascending values from 9+1 to 9+6. Did you make a
| typo, or did you explain something wrong?
+---------------

No, what I wrote is correct. The low 4 bits of "0123456789fgjkqw" 
*are* 0 through 15 (or "0123456789abcdef" in current "hex")... IN
THE *LGP-30 FLEXOWRITER* CHARACTER CODE, *not* in ASCII or EBCDIC
or Baudot or any other standard code.

Well, actually, it's a little more complex than that. The LGP-30
used a 6-bit "keyboard" code on the Flexowriter (and the paper tape
reader & punch), but the machine could be put into either 6-bit or
4-bit input mode. In the latter, only the first four bits of each code
got shifted into the accumulator. So I suppose you could say those
were the *upper* 4 bits of each character, though when read into
the accumulator in "4-bit input mode" they truly *were* the lower
4 bits. (Sorry for the confusion. It was an "interesting" machine.)

+---------------
| (map 'list #'char-code "0123456789fgjkqw")
| => (48 49 50 51 52 53 54 55 56 57 102 103 106 107 113 119)
+---------------

That's ASCII. The LGP-30 did not use ASCII. The LGP-30 used
Flexowriter keyboard code, see:

    http://ed-thelen.org/comp-hist/lgp-30-man-f002.gif


-Rob

p.s. The full programming manual for the LGP-30 can be found here:

    http://ed-thelen.org/comp-hist/lgp-30-man.html

but good luck reading its archaic style.  ;-}

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <0-SdnROBHqDbVDHVnZ2dnUVZ_jWdnZ2d@speakeasy.net>
John W Kennedy  <·······@attglobal.net> wrote:
+---------------
| I said "machine language" and I meant it. I haven't touched a 1401 since 
| 1966, and haven't dealt with a 1401 emulator since 1968, but I can 
| /still/ write a self-booting program.
+---------------

Heh! I never dealt with a 1401 per se [except when running a 1410
in 1401 emulation mode to run the Autoplotter program, which wasn't
available for the 1410], but I still remember the IBM 1410 bootstrap
instructions you had to type in on the console to boot from magtape.

    v         v
    L%B000012$N

where the "v" accent is the "wordmark" indicator.

That says to read in a whole tape record in "load" mode (meaning
that wordmarks & groupmarks in memory are overwritten), synchronously
(stop & wait), from tape drive 0, starting at memory location
decimal 12, which, since the 1410 used *1*-based addressing,
was the location just after the no-op at location 11 above.

[Note that these are actual *machine* instructions, *not* "assembler"!!
Like the 1401, the 1410 was a *character* machine, not an 8-bit-byte
binary machine. The bits in a character were named 1, 2, 4, 8, A, B,
and W (wordmark). Oh, and C, but that was character parity -- the
programmer couldn't set that separately.]

What was the corresponding 1401 boot sequence?

Oh, for the record, IMHO the DEC PDP-8 had a *much* simpler machine
language and assembler than the IBM 1401/1410.  ;-}


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48af8f95$0$29518$607ed4bc@cv.net>
Rob Warnock wrote:
> What was the corresponding 1401 boot sequence?

The 1401 had a boot-from-tape-1 button on the console, and a 
boot-from-card button on the card reader. You couldn't truly boot from a 
disk; you loaded a little starter deck of about 20 cards on the card reader.

On the 1401, the typewriter was an optional luxury, mainly used in long 
batch jobs to do ad-hoc on-line queries. On the compatible 1460, the 
typewriter was somewhat more common, because the console the typewriter 
mounted on was a standard part of the system, so only the typewriter had 
to be added.

-- 
John W. Kennedy
  "You can, if you wish, class all science-fiction together; but it is 
about as perceptive as classing the works of Ballantyne, Conrad and W. 
W. Jacobs together as the 'sea-story' and then criticizing _that_."
   -- C. S. Lewis.  "An Experiment in Criticism"
From: Martijn Lievaart
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <pan.2008.08.17.09.31.10@rtij.nl.invlalid>
On Sat, 16 Aug 2008 21:46:18 -0400, John W Kennedy wrote:

>> The 1401 was a decent enough processor for many industrial tasks -- at
>> that time -- but for general programming it was sheer horror.
> 
> But the easiest machine language /ever/.

True, very true.

M4
From: Martin Gregorie
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <pan.2008.07.22.09.21.32.805990@see_sig_for_address.invalid>
On Tue, 24 Jun 2008 18:42:15 -0400, John W Kennedy wrote:

> David Combs wrote:
>> passing
>> *unnamed* functions as args (could Algol 60 also do something like that,
>> via something it maybe termed a "thunk")
> 
> No, the "thunks" were necessary at the machine-language level to 
> /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
>
Are you sure about that? 

The first time I ran across the term "thunking" was when Windows 3
introduced the Win32S shim and hence the need to switch addressing between
16 bit and 32 bit modes across call interfaces. That was called "thunking"
by Microsoft and even they would surely admit it was a kludge.

I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was
a current language. The term "thunking" did not appear in either compiler
manual nor in any Algol 60 language definition I've seen. A60 could pass
values by name or value and procedures by name. That was it. Call by name
is what is now referred to as reference passing.

I should also point out that Algol 60 was initially written as a means for
communicating algorithms between people. Compiler implementations came
later. In consequence the language did not define links to libraries or
i/o methods. Both features were compiler specific - for instance the
Elliott introduced 'input' and 'print' reserved words and syntax while the
1900 compilers used function calls. The Elliott approach was more readable.

Algol 60 did not have 'functions'. It had procedures which could be
declared to return values or not. A procedure that returned a value was
equivalent to a function but the term 'function' was not used. Similarly
it did not have a mechanism for declaring anonymous procedures. That, like
the incorporation of machine code inserts, would have been a
compiler-specific extension, so it is a terminological mistake to refer to
it without specifying the implementing compiler.


-- 
······@   | Martin Gregorie
gregorie. | 
org       | Zappa fan & glider pilot
From: Rob Warnock
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <Dq-dnZYE9JanTBjVnZ2dnUVZ_jadnZ2d@speakeasy.net>
Martin Gregorie  <······@see_sig_for_address.invalid> wrote:
+---------------
| John W Kennedy wrote:
| > No, the "thunks" were necessary at the machine-language level to 
| > /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
|
| Are you sure about that? 
+---------------

I don't know if John is, but *I* am!  ;-}

+---------------
| I used Algol 60 on an Elliott 503 and the ICL 1900 series back when
| it was a current language. The term "thunking" did not appear in either
| compiler manual nor in any Algol 60 language definition I've seen.
+---------------

It wouldn't have been. Thunks were something used by Algol 60
*compiler writers* in the code generated by their compilers to
implement the semantics of Algol 60 call-by-name, but were not
visible to users at all [except that they allowed call-by-name
to "work right"].

+---------------
| A60 could pass values by name or value and procedures by name. That
| was it. Call by name is what is now referred to as reference passing.
+---------------

(*sigh*) NO, IT IS NOT!!!  Please go read the following:

    http://en.wikipedia.org/wiki/Thunk
    http://en.wikipedia.org/wiki/Evaluation_strategy#Call_by_name
    http://en.wikipedia.org/wiki/Jensen%27s_Device

+---------------
| [Algol 60] did not have a mechanism for declaring anonymous procedures.
+---------------

Quite correct, but completely off the mark. While an Algol 60 *user*
could not declare an anonymous procedure, the *implementation* of an
Algol 60 compiler required the ability for the compiler itself to
generate/emit internal anonymous procedures, to wit, the above-mentioned
thunks, sometimes creating them dynamically during the procedure call.
[Actually, a pair of them for each actual argument in a procedure call.]

+---------------
| That, like the incorporation of machine code inserts, would have been
| a compiler-specific extension, so it is a terminological mistake to
| refer to it without specifying the implementing compiler.
+---------------

Again, "incompetent, irrelevant, and immaterial" [as Perry Mason used
to so frequently object during trials]. Thunks were not "extensions" to
Algol 60 compilers; they were part of the basic implementation strategy
*within* Algol 60 compilers, necessary because of the semantics required
by call-by-name.

Basically, in Algol 60, each parameter must be passed [in general,
that is, one can optimize away many special cases] as *two* closures --
conventionally called "thunks" by Algol 60 compiler writers -- one
for "getting" (evaluating) and the other for "setting" the parameter
[if the parameter was a "place" in Common Lisp terms, else an error
was signalled]... IN THE CALLER'S LEXICAL ENVIRONMENT!!

The big deal was two-fold: (1) each time a formal parameter was
*referenced* in a callee, the expression for the actual parameter
in the caller had to be *(re)evaluated* in the *caller's* lexical
environment, and the value of that (re)evaluation used as the
value of the referenced formal parameter in the callee; and
(2) if a variable appeared twice (or more) in a parameter list,
say, once as a naked variable [which is a "place", note!] and again
as a sub-expression of a more complicated parameter, then setting
the formal parameter in the *callee* would *change* the value of
the actual parameter in the caller(!!), which in turn would change
the value of the *other* actual parameter in the caller the next time
it was referenced in the callee. The above-referenced "Jensen's Device"
shows how this can be used to do "very tricky stuff". A simpler and
shorter example is here:

    http://www.cs.rit.edu/~afb/20013/plc/slides/procedures-07.html
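
To see the getter/setter idea in more familiar terms, here is a rough
Common Lisp sketch of the same semantics -- *not* what any Algol 60
compiler actually emitted, just the two per-argument thunks written out
by hand as ordinary closures [JENSEN-SUM and the argument names are
made up purely for the illustration]:

    ;; The callee gets a getter/setter thunk pair for "i" and a getter
    ;; thunk for "x"; all three close over the CALLER's bindings.
    (defun jensen-sum (get-i set-i get-x lo hi)
      (let ((total 0))
        (loop for k from lo to hi
              do (funcall set-i k)                ; set the caller's actual "i"
                 (incf total (funcall get-x)))    ; re-evaluate the caller's "x"
        total))

    ;; The Algol-style call  sum(i, a[i], 1, 10)  then becomes, by hand:
    (let ((i 0) (a (make-array 11 :initial-element 2)))
      (jensen-sum (lambda () i)                   ; getter thunk for i
                  (lambda (v) (setf i v))         ; setter thunk for i
                  (lambda () (aref a i))          ; getter thunk for a[i]
                  1 10))
    => 20                                         ; a[1] + ... + a[10]

Setting the formal parameter in the callee really does change the
caller's "i", and every reference to the second formal really does
re-evaluate a[i] in the caller's environment -- exactly the behavior
described above.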

Because the actual parameters in the callee had to be evaluated
in the *caller's* lexical environment -- and because Algol 60 was
fully recursive, allowed nested procedure definitions, and could
pass "local" procedures as arguments -- efficient implementation of
Algol 60 procedure calls almost necessitated placing the bodies
of the compiler-generated actual parameter thunks on the caller's
dynamic stack frame [or at least call instructions *to* the thunks
which could pass the current lexical contours as sub-arguments].
Knuth's nasty "man or boy test" stressed this to the limit:

    http://en.wikipedia.org/wiki/Man_or_boy_test


-Rob

p.s. IIRC [but it's been a *long* time!], the ALGOL-10 compiler
for the DEC PDP-10 passed each actual parameter as the address of
a triple of words, of which the first two were executable and the
third could be used to store a variable's value (simple case) or
to pass the lexical contour (more complicated case). When the
callee needed to reference (evaluate) an argument, it used the 
PDP-10 XCT ("execute") instruction to execute the first word of
the block, which was required to deliver the value to a standard
register [let's say "T0", just for concreteness], and if the callee
wanted to *set* an argument, it executed the *second* word pointed
to by the passed address, with the new value also in a defined place
[again, let's use T0]. So to implement "X := X + 1;" in the callee,
the compiler would emit code like this:

	    MOVE  T1,{the arg (address) corresponding to "X"}
	    XCT   0(T1)       ; fetch the current value of X into T0.
	    ADDI  T0, 1       ; increment it
	    XCT   1(T1)       ; execute the "setter" for X.

Now in the case where the actual parameter in the caller was a
simple global variable, call it "Y", then the address passed as
the arg could be the following "YTHNK" in static data space:

    YTHNK:  MOVE   T0,.+2     ; one-instruction "getter"
            MOVEM  T0,.+1     ; one-instruction "setter"
    Y:      BLOCK  1           ; the actual place where the value "Y" lives

Whereas if the argument being passed were some more complicated
expression, such as an array reference or a reference to a local
procedure in the caller, then the 3-word arg block would be passed
on the stack and the passed address would point to this [possibly
dynamically-constructed] triple, where PUSHJ is the PDP-10 stack-
oriented subroutine call instruction:

            PUSHJ  P,{lambda-lifted getter code}
            PUSHJ  P,{lambda-lifted setter code}
	    EXP    {lexical contour info needed for getter/setter to work}

Efficient for the simple case; slow-but-correct for the messy case.

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Lew
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <PqadnenupInjexjVnZ2dnUVZ_rGdnZ2d@comcast.com>
Rob Warnock wrote:
> Martin Gregorie  <······@see_sig_for_address.invalid> wrote:
> +---------------
> | John W Kennedy wrote:
> | > No, the "thunks" were necessary at the machine-language level to 
> | > /implement/ ALGOL 60, but they could not be expressed /in/ ALGOL.
> |
> | Are you sure about that? 
> +---------------
> 
> I don't know if John is, but *I* am!  ;-}

At this point we are so far off topic for clj.programmer, but still impinging 
on functional programming issues with the discussion of closures, et al., that 
I respectfully request that you all exclude clj.programmer from followups to 
this thread.

(f-u set to comp.lang.functional)

-- 
Lew
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48863ba1$0$5000$607ed4bc@cv.net>
Rob Warnock wrote:
> Thunks were something used by Algol 60
> *compiler writers* in the code generated by their compilers to
> implement the semantics of Algol 60 call-by-name, but were not
> visible to users at all [except that they allowed call-by-name
> to "work right"].

...unless you were a system programmer and had to write Algol-friendly 
assembler.



-- 
John W. Kennedy
  "Give up vows and dogmas, and fixed things, and you may grow like 
That. ...you may come to think a blow bad, because it hurts, and not 
because it humiliates.  You may come to think murder wrong, because it 
is violent, and not because it is unjust."
   -- G. K. Chesterton.  "The Ball and the Cross"
From: Josef Moellers
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g64arv$aab$1@nntp.fujitsu-siemens.com>
Martin Gregorie wrote:

> Are you sure about that? 

> I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was
> a current language. The term "thunking" did not appear in either compiler
> manual nor in any Algol 60 language definition I've seen. A60 could pass
> values by name or value and procedures by name. That was it. Call by name
> is what is now referred to as reference passing.

Are you sure about that? ;-)

AFAIK "Call by name" is *not* the same as passing an argument by 
reference. With "call by name" you can implement this wonderful thing 
called "Jensen's Device", which you cannot do when you pass parameters 
by reference!

Josef
-- 
These are my personal views and not those of Fujitsu Siemens Computers!
Josef Möllers (Pinguinpfleger bei FSC)
	If failure had no penalty success would not be a prize (T.  Pratchett)
Company Details: http://www.fujitsu-siemens.com/imprint.html
From: Steve Schafer
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <kg2c84d8i5rfq6kd20nhh6reab7m59b9cm@4ax.com>
On Tue, 22 Jul 2008 10:21:50 +0100, Martin Gregorie
<······@see_sig_for_address.invalid> wrote:

>The first time I ran across the term "thunking" was when Windows 3
>introduced the Win32S shim and hence the need to switch addressing between
>16 bit and 32 bit modes across call interfaces. That was called "thunking"
>by Microsoft and even they would surely admit it was a kludge.

Win32s thunks are a completely different beast from the original Algol
60 thunks. As far as I know, the first published description of thunks
was:

 Ingerman PZ (1961) Thunks: A way of compiling procedure statements with
 some comments on procedure declarations, CACM 4:55-58.

Steve Schafer
Fenestra Technologies Corp.
http://www.fenestra.com/
From: Grant Edwards
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <vuWdnVXuSN27uRvVnZ2dnUVZ_jydnZ2d@posted.visi>
On 2008-07-22, Steve Schafer <·····@fenestra.com> wrote:
> On Tue, 22 Jul 2008 10:21:50 +0100, Martin Gregorie
><······@see_sig_for_address.invalid> wrote:
>
>>The first time I ran across the term "thunking" was when Windows 3
>>introduced the Win32S shim and hence the need to switch addressing between
>>16 bit and 32 bit modes across call interfaces. That was called "thunking"
>>by Microsoft and even they would surely admit it was a kludge.

What?!  Microsoft took a technical term and used it to mean
something completely different than the widely used meaning?
Never.

> Win32s thunks are a completely different beast from the
> original Algol 60 thunks. As far as I know, the first
> published description of thunks was:
>
>  Ingerman PZ (1961) Thunks: A way of compiling procedure statements with
>  some comments on procedure declarations, CACM 4:55-58.

The Algol usage is certainly what we were taught back in the
late 70's.  I wasn't even aware that Microsoft had hijacked it
to mean something else.

-- 
Grant Edwards                   grante             Yow! My polyvinyl cowboy
                                  at               wallet was made in Hong
                               visi.com            Kong by Montgomery Clift!
From: John W Kennedy
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <48863add$0$5013$607ed4bc@cv.net>
Martin Gregorie wrote:
> I used Algol 60 on an Elliott 503 and the ICL 1900 series back when it was
> a current language. The term "thunking" did not appear in either compiler
> manual nor in any Algol 60 language definition I've seen.

It doesn't have to; Algol 60 thunks are not part of the language. 
However, practical implementation of Algol 60 call by name means that 
thunks are created by every Algol 60 compiler, and the word "thunk" was 
coined in 1961 to designate them.

> A60 could pass
> values by name or value and procedures by name. That was it. Call by name
> is what is now referred to as reference passing.

Either you misunderstood (because in many simple cases the semantics of 
call-by-reference and call-by-name cannot be distinguished) or the 
compiler you used implemented non-standard Algol (which was fairly 
common in compilers meant for day-to-day practical work). Algol 
call-by-name was a unique form that subsequent language designers have 
recoiled from in horror.

(Historically, "call-by-name" has sometimes been used in non-Algol 
contexts to mean "call-by-reference".)

> Algol 60 did not have 'functions'. It had procedures which could be
> declared to return values or not. A procedure that returned a value was
> equivalent to a function but the term 'function' was not used.

This is simply wrong. You are accurately describing the language syntax,
which used (as PL/I does) the keyword "procedure" for both functions
and subroutines, but Algol documentation nevertheless referred to
"functions".

> Similarly
> it did not have a mechanism for declaring anonymous procedures. That, like
> the incorporation of machine code inserts, would have been a
> compiler-specific extension, so it is a terminological mistake to refer to
> it without specifying the implementing compiler.

Standards-conforming Algol compilers had a limited ability to create 
de-facto anonymous functions in the call-by-name implementation.

-- 
John W. Kennedy
  "Information is light. Information, in itself, about anything, is light."
   -- Tom Stoppard. "Night and Day"
From: David Combs
Subject: VERY SORRY FOR THAT CROSSPOST; Re: The Importance of Terminology's Quality
Date: 
Message-ID: <g1nk2v$6f8$1@reader2.panix.com>
(This one is also cross-posted, to apologize to one and all
about my just-prior followup.)

I stupidly didn't remember that whatever followup I made
would also get crossposted; I had knee-jerk hit "s" (send)
before I noticed the warning (Pnews?) about just how many
groups it would be posted to.

A suggestion for Pnews: as soon as you give the F (followup for trn),
i.e. as soon as Pnews starts up on this followup, before you've typed
in anything or given it a filename to include, AT THAT TIME it should
remind you that it'll be crossposted to the following 25 newsgroups:
  1: foo
  2: comp.lang.perl.misc
  3: other-group
  4: ...

so that, way before you've said anything, you can
abort it if you want to.


SORRY!


David
From: Jon Harrop
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <5NCdnRlmwIIt5L7VnZ2dnUVZ8sSrnZ2d@plusnet>
······@gmail.com wrote:
> I'd like to introduce a blog post by Stephen Wolfram, on the design
> process of Mathematica. In particular, he touches on the importance of
> naming of functions.
> 
> ? Ten Thousand Hours of Design Reviews (2008 Jan 10) by Stephen
> Wolfram
>  http://blog.wolfram.com/2008/01/10/ten-thousand-hours-of-design-reviews/
> 
> The issue is fitting here today, in our discussion of ?closure?
> terminology recently, as well the jargons ?lisp 1 vs lisp2? (multi-
> meaning space vs single-meaning space), ?tail recursion?, ?currying?,
> ?lambda?, that perennially crop up here and elsewhere in computer
> language forums in wild misunderstanding and brouhaha.
> 
> The functions in Mathematica, are usually very well-name, in contrast
> to most other computing languages.

The mathematical functions are well named but many of the general
programming constructs are not even well defined, let alone well named.

For example, overloaded "Gamma" in Mathematica vs "gammainc" in MATLAB for
the incomplete gamma functions.

> In particular, the naming in 
> Mathematica, as Stephen Wolfram implied in his blog above, takes the
> perspective of naming by capturing the essense, or mathematical
> essence, of the keyword in question. (as opposed to, naming it
> according to convention, which often came from historical happenings)
> When a thing is well-named from the perspective of what it actually
> ?mathematically? is, as opposed to historical developments, it avoids
> vast amount of potential confusion.
> 
> Let me give a few example.
> 
> ? ?lambda?, widely used as a keyword in functional languages, is named
> just ?Function? in Mathematica. The ?lambda? happend to be called so
> in the field of symbolic logic, is due to use of the greek letter
> lambda ??? by happenstance. The word does not convey what it means.
> While, the name ?Function?, stands for the mathematical concept of
> ?function? as is.

Look at the "function" keyword in OCaml and F#. They also pattern match over
their input whereas Mathematica does not allow this in "Function".

> ? Module, Block, in Mathematica is in lisp's various ?let*?. The
> lisp's keywords ?let?, is based on the English word ?let?. That word
> is one of the English word with multitudes of meanings. If you look up
> its definition in a dictionary, you'll see that it means many
> disparate things. One of them, as in ?let's go?, has the meaning of
> ?permit; to cause to; allow?. This meaning is rather vague from a
> mathematical sense. Mathematica's choice of Module, Block, is based on
> the idea that it builds a self-contained segment of code. (however,
> the choice of Block as keyword here isn't perfect, since the word also
> has meanings like ?obstruct; jam?)

Bad example. Mathematica's "Block" implements what we all know as "Scope".
Mathematica's "Module" implements something most people have never needed
to learn (rewriting a subexpression with uniquely tagged identifiers to
disambiguate them from externals):

  In[1] := Module[{a}, a]
  Out[1] = a$617

> ? Functions that takes elements out of list are variously named First,
> Rest, Last, Extract, Part, Take, Select, Cases, DeleteCases... as
> opposed to ?car?, ?cdr?, ?filter?, ?filter?, ?pop?, ?shift?,
> ?unshift?, in lisps and perl and other langs.

You are comparing arrays in Mathematica to lists in other functional
languages. Mathematica is often asymptotically slower as a consequence,
with "Rest" being O(n):

  In[1] := AbsoluteTiming[Rest[Range[1, 1000000]];]
  Out[1] = {0.219, Null}

The equivalents over lists are all O(1) in SML, OCaml, F#, Haskell, Scala,
Lisp and Scheme, and are called the "tail" of a list. For example, in F#:

  > time List.tl [1 .. 1000000];;
  Took 0ms
  val it : int list = ...

Perhaps the best example I can think of to substantiate your original point
is simple comparison because Mathematica allows:

  a < b < c

I wish other languages did.

-- 
Dr Jon D Harrop, Flying Frog Consultancy
http://www.ffconsultancy.com/products/?u
From: Robert Maas, http://tinyurl.com/uh3t
Subject: Re: The Importance of Terminology's Quality
Date: 
Message-ID: <rem-2008may08-007@yahoo.com>
> From: Jon Harrop <····@ffconsultancy.com>
> Perhaps the best example I can think of to substantiate your
> original point is simple comparison because Mathematica allows:
>   a < b < c
> I wish other languages did.

When the comparison operators are all the same, Lisp has that:
(< a b c)
For my personal comparison between C C++ Java and Lisp, see:
 <http://www.rawbw.com/~rem/HelloPlus/CookBook/Matrix.html#NumBool>

Now for mixed comparisons, such as a < b <= c, lisp as delivered is
no better than other languages. But in lisp it's possible/easy to
write a macro that would deal with such expressions. Exercise for
the reader (especially newbies wishing to make a good impression):
  (defmacro altcompare ...)
  (altcompare a < b <= c ...) ;a b c are evaluated
              ;< <= are *not* evaluated, instead wrapped with (function ...)
  ;Any function that can take two arguments is allowed there, not just
  ; the six comparison operators. Such a function can be given in any form
  ; convertible by FUNCTION at compile time, i.e. symbol name or lambda expr.
Note for newbie: Be sure to evaluate each alternating sub-form just once,
in correct sequence, binding each result to a GENSYMmed LET variable
which can then be referenced twice each (except first and last once each).
Let the compiler optimize out any LET bindings that weren't really needed.
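
For what it's worth, here is one minimal sketch of such a macro -- only
an illustration of the idea, not a vetted solution, and it assumes the
argument list strictly alternates value forms and operator names:

  (defmacro altcompare (&rest forms)
    ;; FORMS = value, op, value, op, ..., value
    (let* ((vals (loop for f in forms by #'cddr collect f))
           (ops  (loop for f in (rest forms) by #'cddr collect f))
           (vars (loop repeat (length vals) collect (gensym "V"))))
      `(let ,(mapcar #'list vars vals)   ; each value form evaluated once, in order
         (and ,@(loop for op in ops
                      for (a b) on vars
                      collect `(funcall (function ,op) ,a ,b))))))

  ;; (altcompare a < b <= c)  expands, modulo gensym names, to:
  ;;   (let ((v1 a) (v2 b) (v3 c))
  ;;     (and (funcall #'< v1 v2) (funcall #'<= v2 v3)))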