From: Peter Seibel
Subject: Where does the drive to syntax come from?
Date: 
Message-ID: <m2hdcx7rd7.fsf@gigamonkeys.com>
In my travels among the LispNots I constantly run into people--even
new programmers--who *want* elaborate syntax. (I.e. more elaborate
than parens and prefix notation.) One part of me thinks that maybe
this is due to the pervasive influence of non-Lisp languages; even
non-programmers have an idea what languages should look like and it's
not Lisp. Another part of me suspects that Larry Wall may be on to
something when he says folks like different things to look different,
thus all the different bits of syntax in Perl. (Certainly within Lisp
there's evidence that special purpose syntax is occasionally
considered useful, at least by some folks; c.f. the standard reader
macros, different syntax for different kinds of numbers, LOOP,
FORMAT.) If Wall's view is correct, then that means that Lisp's
minimal syntax is a trade-off--we accept a uniform, and thus less
intuitive (per Wall) syntax because of the many advantages it gives us
in other respects (macros, ease of manipulation in editors, etc.) But
that would also mean that if one were designing a domain-specific
language that wasn't necessarily going to support, say, macros, then
maybe the tradeoff would turn against s-exp syntax.

So, does anyone know of any actual research into these issues--maybe
by psychologists or human factors folks?

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/

From: Peter Scott
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126048574.604152.244990@z14g2000cwz.googlegroups.com>
When you're used to Lisp (and you're using a good text editor),
different things *do* look different. It's just that new programmers
have trouble seeing it. At first, the parentheses hurt my head, but
later I figured out how to look past them. I'm not sure if this was the
influence of other programming languages or something intrinsic in my
mind.

The "syntax" a lisper sees is the shape of the code. A non-lisper will
look at special symbols like '(' and ')' and get confused by the
overwhelming number of them. This is all from my personal experience,
so YMMV.

-Peter
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3o7nlnF4jr48U1@individual.net>
Peter Scott wrote:
> When you're used to Lisp (and you're using a good text editor),
> different things *do* look different. It's just that new programmers
> have trouble seeing it. At first, the parentheses hurt my head, but
> later I figured out how to look past them. I'm not sure if this was the
> influence of other programming languages or something intrinsic in my
> mind.

Two things can greatly improve this situation when demoing Lisp: one is 
using an Emacs style where the parentheses are so faint that they're 
almost invisible.  IIRC Marco Baringer's SLIME demo did this.

The other way is to start with something like a program in 
Python/Haskell-like syntax, and when the reader has understood it, 
introduce parentheses.  Heh!  The presentation "The Swine Before 
Perl" (yes, it's in Scheme) does this on pages 37-39.

> The "syntax" a lisper sees is the shape of the code. A non-lisper will
> look at special symbols like '(' and ')' and get confused by the
> overwhelming number of them. This is all from my personal experience,
> so YMMV.

They don't see the forest/program for the trees/parentheses, but that's 
just confusion.  In my experience Java beginners have a hard time too, 
maybe for similar reasons (mostly not understanding variable scope I 
think).  Teaching Lisp the Swine-before-Perl way would be simply great 
(and giving students the invisible-parens editor style :D).

-- 
My ideal for the future is to develop a filesystem remote interface
(a la Plan 9) and then have it implemented across the Internet as
the standard rather than HTML.  That would be ultimate cool.
	Ken Thompson
From: Dave Roberts
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m3psrkqxfk.fsf@linux.droberts.com>
Ulrich Hobelmann <···········@web.de> writes:
> Peter Scott wrote:
> > When you're used to Lisp (and you're using a good text editor),
> > different things *do* look different. It's just that new programmers
> > have trouble seeing it. At first, the parentheses hurt my head, but
> > later I figured out how to look past them. I'm not sure if this was the
> > influence of other programming languages or something intrinsic in my
> > mind.
> 
> Two things can greatly improve this situation when demoing Lisp: one
> is using an Emacs style where the parentheses are so faint that
> they're almost invisible.  IIRC Marco Baringer's SLIME demo did this.

This actually raises a subtle point: color is syntax of a sort. I
think John McCarthy talked about that concept a bit at ILC 2005 this
year.

The issue that I think Wall is describing and that many Lisp newbies
have trouble with is that it's hard to get information out of data
that all looks homogenous. Using different characters breaks up the
pattern of text and makes it look different, thereby signaling to the
reader when different concepts are being employed. With Lisp, the
head-fake is that the "shape" of the code (indentation) is far more
important than the syntax itself (which is homogenous). Most Lisp
newbies find all the parens intimidating because they're used to
looking for syntax to signal something, not the indentation (maybe
Python will change this for some people, I don't know). I think
everybody else said bits and pieces of that, so here's the point:

The interesting thing for me is that I find it far easier to read
colorized Lisp programs than I do black-and-white Lisp. Additionally,
when emacs fontlock is lagging, it's very annoying to me. This implies
that color has become part of my Lisp syntax.

-- 
Dave Roberts
dave -remove- AT findinglisp DoT com
http://www.findinglisp.com/
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <REM-2005sep07-003@Yahoo.Com>
> From: Dave Roberts <···········@remove-findinglisp.com>
> The issue that I think Wall is describing and that many Lisp newbies
> have trouble with is that it's hard to get information out of data
> that all looks homogenous. Using different characters breaks up the
> pattern of text and makes it look different, thereby signaling to the
> reader when different concepts are being employed.

Perhaps this is because beginners are trying to run before they can
walk or even crawl. They're looking at examples of deeply nested code
from books or online source files and trying to make heads or tails of
it, instead of writing their own code in a very simple style that
doesn't take any effort to parse. A sequence of simple statements is
very easy to parse: each statement has two open parens (one starting
the SETQ form, and the other starting the function-call form) and two
matching close parens at the end. With all the SETQs nicely lined up in
the same four columns, they tend to blur out like whitespace, and the
number of parens is so small as to be almost invisible, leaving the
names of the variables and the name of the function on each line as
distinct and meaningful. With each variable having a meaningful name,
defined by the (beginner) programmer, all that needs to be learned anew
for each line of code is the built-in function being called there.
After a block of code has been written, performing some useful
transformation from the data going into the first line of code to the
data coming out of the last line of code, a DEFUN is finally wrapped
around the whole block. That gives one more open paren at the start and
one more close paren at the end, still hardly enough to cause eyestrain
for even a rank beginner, and the old lines of code are now all
indented uniformly to show that they are no longer at the top level but
are inside a DEFUN form. (And even with a PROG inside the DEFUN, so
that the variables can be made local, that's still not enough parens to
cause eyestrain.)

With that style, the only time there'd be any other parens would be in
the case of some sort of conditional branch, either an IF or a COND. So
with that style of programming, in fact the extra parens do indeed show
where something special is happening.
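
Something like this toy fragment is what that style looks like in
practice (the function and variable names here are made up just for
illustration):

  ;; Toy example of the style described above; names are invented.
  (defun report-total-cost (item-prices tax-rate)
    (prog (subtotal tax total)
      (setq subtotal (reduce #'+ item-prices))
      (setq tax      (* subtotal tax-rate))
      (setq total    (+ subtotal tax))
      (cond ((zerop subtotal) (format t "Nothing to report.~%"))
            (t (format t "Total: ~,2F~%" total)))
      (return total)))

  ;; e.g. (report-total-cost '(19.99 5.25) 0.08) prints "Total: 27.26"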

*After* the beginner has learned how to process data (strings and
nested lists mostly) in all the wonderful ways that CL provides, and
how to use the three combination methods (sequence, decision, looping)
in useful ways, and define functions to make tools which are then
usable just as if they were in the original CL API, and use all that to
make at least one interesting application, *then* would be the
appropriate time to try reducing the number of local variables by
nesting function calls with lots of parens. By that time, the beginner
already knows what's going on otherwise, and wouldn't be confused by
that one new complication of nesting syntax. (In fact, nested parens
probably already occurred in *data* somewhere during debugging the
first interesting application, so seeing such deep nesting in source
code too wouldn't be much new at all now.)

By the way, regarding the original question of this thread: If somebody
wants to use some special syntax particular to some problem domain,
that's easiest to do in Lisp, so it's an argument for Lisp rather than
against Lisp. All the person needs to do is write a parser for that
special syntax, something that takes a string with that syntax and
produces the corresponding nested list structure. (And for output, a
printer, which by comparison is trivial, unless prettyprinting is
desired, in which case it's a fun exercise in dynamic programming or
branch-and-bound to find the optimal same-line/newline decision for each
level of structure.)
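
To make the "write a parser" step concrete, here is a minimal sketch
for a made-up KEY=VALUE;KEY=VALUE notation (both the notation and the
function name are hypothetical, just to show a string going in and
nested list structure coming out):

  (defun parse-pairs (string)
    "Parse a KEY=VALUE;KEY=VALUE string into an alist of (SYMBOL . STRING)."
    (let ((result '())
          (start 0))
      (loop
        (let* ((end (or (position #\; string :start start) (length string)))
               (field (subseq string start end))
               (eq-pos (position #\= field)))
          (when eq-pos
            (push (cons (intern (string-upcase (subseq field 0 eq-pos)))
                        (subseq field (1+ eq-pos)))
                  result))
          (when (>= end (length string)) (return))
          (setq start (1+ end))))
      (nreverse result)))

  ;; (parse-pairs "host=localhost;port=8080")
  ;; => ((HOST . "localhost") (PORT . "8080"))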
From: David Steuber
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87k6hofr3b.fsf@david-steuber.com>
Dave Roberts <···········@remove-findinglisp.com> writes:

> The interesting thing for me is that I find it far easier to read
> colorized Lisp programs than I do black-and-white Lisp. Additionally,
> when emacs fontlock is lagging, it's very annoying to me. This implies
> that color has become part of my Lisp syntax.

I've found indentation more useful than color, but color is growing on
me now that I have color-theme.el installed.  Now for a counterpoint:
the quoted paragraph has very little syntax in it.  It is mainly a
stream of words, with three dots to indicate the ends of sentences,
two commas, and one apostrophe.  The primary syntax is the white space
separating the words.

There are languages in use whose written representations don't even
have such markers.  Mandarin, anyone?

The human brain is very adaptive.  I think it takes what it is given
and learns to cope with that.

-- 
You should maybe check the chemical content of your breakfast
cereal. --- Bill Watterson
From: David Golden
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <5EKUe.15200$R5.776@news.indigo.ie>
Hm. Have you seen colorforth where colour really is part of the syntax?
http://www.colorforth.com/ 
- orig by Chuck Moore.

http://personalwebs.oakland.edu/~maslicke/colorforth/ 
- linux port

http://personalwebs.oakland.edu/~maslicke/colorforth/mandelbrot/index.html
- mandelbrot set in colorforth.
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3olirmF6i0miU1@individual.net>
David Golden wrote:
> Hm. Have you seen colorforth where colour really is part of the syntax?
> http://www.colorforth.com/ 
> - orig by Chuck Moore.

Well, that's the representation.  Basically AFAIK color only means that 
some word is to be taken as a literal, or that it should be executed in 
place (a bit like a macro).  Other flags could be used for this.


-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Christopher C. Stacy
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <upsrlh8lm.fsf@news.dtpq.com>
Peter Seibel <·····@gigamonkeys.com> writes:

> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp. Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl. (Certainly within Lisp
> there's evidence that special purpose syntax is occasionally
> considered useful, at least by some folks; c.f. the standard reader
> macros, different syntax for different kinds of numbers, LOOP,
> FORMAT.)

#'
From: Ray Dillinger
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <PctTe.12422$p%3.50326@typhoon.sonic.net>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp. Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl. (Certainly within Lisp
> there's evidence that special purpose syntax is occasionally
> considered useful, at least by some folks; c.f. the standard reader
> macros, different syntax for different kinds of numbers, LOOP,
> FORMAT.) If Wall's view is correct, then that means that Lisp's
> minimal syntax is a trade-off--we accept a uniform, and thus less
> intuitive (per Wall) syntax because of the many advantages it gives us
> in other respects (macros, ease of manipulation in editors, etc.) But
> that would also mean that if one were designing a domain-specific
> language that wasn't necessarily going to support, say, macros, then
> maybe the tradeoff would turn against s-exp syntax.
> 
> So, does anyone know of any actual research into these issues--maybe
> by psychologists or human factors folks?
> 

Well, I'd say Larry Wall counts; he's originally a
(human) linguist who got into programming sort of
"sideways," if I recall correctly.

In fact, I believe he's noted for once saying that
he considered perl to be a linguistic experiment;
that the idea was to make a programming language
that more closely approximated the flexibility and
context sensitivity of human languages, and see if
people who used it became "fluent" at a higher
level than was the case for other computer languages.

That said, it takes years of intensive effort for an
adult to learn a new natural language; it's an
investment in time and effort that engages us on a
very fundamental level and occupies a *LOT* of our
mental resources. So if computer languages were all
as complex and context-sensitive as perl, we probably
wouldn't learn nearly as many....

				Bear
From: Sashank Varma
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126118254.668885.138690@f14g2000cwb.googlegroups.com>
Ray Dillinger wrote:
> Peter Seibel wrote:
> > Another part of me suspects that Larry Wall may be on to
> > something when he says folks like different things to look different,
> > thus all the different bits of syntax in Perl. (Certainly within Lisp
> > there's evidence that special purpose syntax is occasionally
> > considered useful, at least by some folks; c.f. the standard reader
> > macros, different syntax for different kinds of numbers, LOOP,
> > FORMAT.) If Wall's view is correct, then that means that Lisp's
> > minimal syntax is a trade-off--we accept a uniform, and thus less
> > intuitive (per Wall) syntax because of the many advantages it gives us
> > in other respects (macros, ease of manipulation in editors, etc.) But
> > that would also mean that if one were designing a domain-specific
> > language that wasn't necessarily going to support, say, macros, then
> > maybe the tradeoff would turn against s-exp syntax.
> >
> > So, does anyone know of any actual research into these issues--maybe
> > by psychologists or human factors folks?
>
> Well, I'd say Larry Wall counts; he's originally a
> (human) linguist who got into programming sort of
> "sideways," if I recall correctly.

As a psychologist who studies language comprehension, I've always
viewed Wall's claim as more of a linguistic (or rather, a Wall-ian)
slogan than a scientific claim.

I don't know of any evidence that more syntactic variation leads to
better comprehension. However, there is plenty of evidence that the
more complex a syntactic construction, the more difficult it is to
process. That is, it takes longer to comprehend, is understood less
accurately, elicits more activation in the language areas of the brain,
and is impaired in normal adults with small working memory capacity
and in patients with damage to language areas. In fact, there is
evidence that syntactic ambiguity is difficult to process. If there are
several structural interpretations of a word sequence, having to choose
between them has negative behavioral and brain consequences.

Wall is right in saying that context helps. However, all the
experiments of which I'm aware have studied semantic context, not
syntactic context. So I don't see how Perl's baroque syntax provides
any contextual support.

> In fact, I believe he's noted for once saying that
> he considered perl to be a linguistic experiment;
> that the idea was to make a programming language
> that more closely approximated the flexibility and
> context sensitivity of human languages, and see if
> people who used it became "fluent" at a higher
> level than was the case for other computer languages.

Well, it looks like his experiment failed, and (if one has the
statistical power and logical will to reason from a null result) his
hypothesis (and thus Perl) has lost credibility.
From: Jeff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126124427.146089.309730@o13g2000cwo.googlegroups.com>
I agree. And 'baroque' is certainly the word for Perl's syntax. I've
written a lot of Perl and its sheer arbitrariness has always annoyed me
no end. Now I wish I had run into Lisp much sooner and saved a few
million neurons. 

-jeff
From: Rob Thorpe
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126127328.363883.314300@f14g2000cwb.googlegroups.com>
Jeff wrote:
> I agree. And 'baroque' is certainly the word for Perl's syntax. I've
> written a lot of Perl and its sheer arbitrariness has always annoyed me
> no end. Now I wish I had run into Lisp much sooner and saved a few
> million neurons.

I first learnt lisp, perl and C++ at around the same time - which was
quite interesting.  The main problems I have with perl are:
* Finding help on the syntax - it's easy with lisp to find help on a
form since everything is a form; there's so much syntax in perl that
it's hard to find the relevant info.
* The syntax of references and addressing.  Why do array elements begin
with $ and not @?  "->" is also confusing.
From: Peter Seibel
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m2d5nk7bun.fsf@gigamonkeys.com>
"Sashank Varma" <············@yahoo.com> writes:

> As a psychologist who studies language comprehension, I've always
> viewed Wall's claim as more of a linguistic (or rather, a Wall-ian)
> slogan than a scientific claim.

[...]

> Wall is right in saying that context helps. However, all the
> experiments of which I'm aware have studied semantic context, not
> syntactic context. So I don't see how Perl's baroque syntax provides
> any contextual support.

So can we use those studies to justify WITH-* macros, i.e.

  (with-some-particular-context (...)
    ...)

as an aid to ease of understanding?

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Sashank Varma
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126144852.333290.265920@g44g2000cwa.googlegroups.com>
Peter Seibel wrote:
> "Sashank Varma" <············@yahoo.com> writes:
>
> > Wall is right in saying that context helps. However, all the
> > experiments of which I'm aware have studied semantic context, not
> > syntactic context. So I don't see how Perl's baroque syntax provides
> > any contextual support.
>
> So can we use those studies to justify WITH-* macros, i.e.
>
>   (with-some-particular-context (...)
>     ...)
>
> as an aid to ease of understanding?

Yeah, I guess. I'm hedging because the analogy between programming and
natural languages both holds and breaks down for your example of WITH-*
macros.

It holds in the sense that a strong context enables comprehenders to
draw inferences about entities within the context's scope. For example,
I can infer that the names in (...) will be bound to complex objects
requiring initialization and clean-up, that they will have lexical
scope, that they will have dynamic extent, etc.
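
To make that inference concrete, here is a minimal sketch of the kind
of macro that licenses it (WITH-OPEN-LOG, OPEN-LOG, and CLOSE-LOG are
made-up names, used only for illustration):

  ;; OPEN-LOG and CLOSE-LOG are assumed, hypothetical functions.
  (defmacro with-open-log ((log pathname) &body body)
    `(let ((,log (open-log ,pathname)))   ; initialization
       (unwind-protect
            (progn ,@body)                ; LOG is in scope only here
         (close-log ,log))))              ; clean-up runs on any exit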

However, the analogy breaks down in that programmers use macros to
extend the syntax of Lisp, whereas natural language speakers don't
usually have (or take) this freedom.

The analogy holds in that WITH-* macros are one of Common Lisp's
idioms. Natural language idioms are canned sequences whose meaning is
holistic (i.e., not derivable from the meanings of their constituent
words). This is why they're hard for non-native speakers to acquire.
WITH-* idioms are similarly opaque -- one does not have to understand
their pieces (i.e., their definitions) to use and understand them.

There's more: On one hand, the WITH-* idiom is generative (i.e., it is
an idiom template), whereas most natural language idioms are canned. On
the other hand, some natural language dialects also have generative
idioms. For example, members of the Slashdot community have an "In
Soviet Russia..." idiom that applies to many situations.

And around it goes.

And I haven't even mentioned the fact that both WITH-* macros and
natural language constructions have pragmatic force...

Let me leave you with a demonstration. Understanding is not
generally achievable in a bottom-up fashion, by just projecting
meanings of words through the syntactic structures that organize them.
Semantic context plays a critical role in organizing the comprehension
of connected text.

Read the following passage:

"The procedure is actually quite simple. First you arrange things into
groups. Of course, one pile may be sufficient depending on how much
there is to do. If you have to go somewhere else due to lack of
facilities, that is the next step; otherwise you are pretty well set.
It is important not to overdo things. That is, it is better to do too
few things at once than too many. In the short run this may not seem
important, but complications can arise. A mistake can prove expensive
as well. At first the whole procedure will seem complicated. Soon,
however, it will become just another facet of life. It is difficult to
foresee any end to the necessity for this task in the immediate future,
but one can never tell. After the procedure is completed, one arranges
the materials into different groups again. Then they can be put into
their appropriate places. Eventually they will all be used once more,
and the whole cycle will have to be repeated. However, that is part of
life."

Reflect on your understanding of it. How do you think you'd do on a
recall test? Would you be able to explain it well to another person?
Probably not, right?

Now, read it again. Before you do, however, scroll down to the bottom
of this message for its title.

[...]

Big difference, eh? I have some other examples of this sort if people
are interested. Interestingly, I think it's Gabriel in "Patterns of
Software Explanations" who gives a parallel example for programming
languages. He presents a short piece of code that's doing some weird
arithmetic. It's hard to see what its point is. Then he tells you it's
a pseudorandom number generator, and suddenly the seemingly arbitrary
multiplications and mods and strange numbers make complete sense.



















































"Washing Clothes"
From: Ray Dillinger
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3jKTe.12573$p%3.50713@typhoon.sonic.net>
Sashank Varma wrote:

> 
> As a psychologist who studies language comprehension, I've always
> viewed Wall's claim as more of a linguistic (or rather, a Wall-ian)
> slogan than a scientific claim.
> 
> I don't know of any evidence that more syntactic variation leads to
> better comprehension. However, there is plenty of evidence that the
> more complex a syntactic construction, the more difficult it is to
> process. That is, it takes longer to comprehend, is understood less
> accurately, elicits more activation in the language areas of the brain,
> and is impaired in normal adults with small working memory capacity
> and in patients with damage to language areas. In fact, there is
> evidence that syntactic ambiguity is difficult to process. If there are
> several structural interpretations of a word sequence, having to choose
> between them has negative behavioral and brain consequences.

And yet everywhere we look, we see that natural languages are
deeply syntactically ambiguous and complex; one would think
that if such fundamental difficulties characterized ambiguity
and complexity, then people would use much simpler languages
amongst themselves.

There are a multitude of very interesting hypotheses that
could be drawn here about some kind of tradeoff or value
realized from all the extra work and difficulty we let
ourselves in for; but most of them are conflicting, few
apply to programming languages, and the design of
experiments to test any or all of them would be extremely
difficult.

I have a favorite hypothesis, but won't go much into it
here as it's pure speculation.  I think that a lot of
good ideas emerge from poorly-understood or ambiguously-
communicated bad ideas.  So our loose, imprecise,
context-sensitive, ambiguous, complex languages may be
helpful because they give us room to misunderstand
each other's bad ideas.  People are forced to think
"what did he mean?" (the extra work you're talking about)
and come up with an answer that, hopefully, makes sense -
in some cases even if what was actually meant makes less
sense.  So maybe our difficulties communicating contribute
to the iterative refinement of ideas.

Programming languages cannot yet benefit from that kind
of ambiguity, so whether or not human languages do isn't
really germane to the discussion at hand.

> Wall is right in saying that context helps. However, all the
> experiments of which I'm aware have studied semantic context, not
> syntactic context. So I don't see how Perl's baroque syntax provides
> any contextual support.

I remember that learning Perl (As of Perl 4, the last
version I seriously knew) *felt* a lot more like learning
a natural language than a programming language.  I
recognized lots of constructions that were clearly just
copied from the complexities of natural languages.  There
was a forest of syntax and special cases, but it did seem
to have a sort of grammatical pattern with contexts for
different kinds of programming ideas and the "same" word
doing related things in different contexts.

I also had this feeling that there was a sort of "gestalt"
understanding lurking in the background, as with human
languages; Having had to study French for a whole year or
two before actually starting to think in it, I felt like
I'd reached a new sort of understanding...  and learning
Perl felt like that kind of understanding was back there
somewhere, even though I never really got it.

I sort of lost interest in Perl when I saw that Perl 5
was a fairly radical redesign; if I'm going to invest that
kind of effort learning a language, I want the language to
be rock-solid stable in its structure. I'll wait until
it's finished, *then* learn it again.

			Bear
From: Sashank Varma
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126145446.286958.318100@g44g2000cwa.googlegroups.com>
Ray Dillinger wrote:
> And yet everywhere we look, we see that natural languages are
> deeply syntactically ambiguous and complex;  One would think
> that if such fundamental difficulties characterized ambiguity
> and complexity, then people would use much simpler languages
> amongst themselves.

Well, syntactically complex and ambiguous sentences *are* harder to
comprehend on all the dimensions I listed before: time, accuracy of
understanding, difficulty for children, adults with small working
memory capacities, and patients with damage to language areas. There's
no doubt about it.

When the goal is to simplify communication, speakers/writers avoid
difficult constructions. Think of how mothers talk to their children.

But of course adult speakers have many other goals, including pragmatic
and poetic ones.

> There are a multitude of very interesting hypotheses that
> could be drawn here about some kind of tradeoff or value
> realized from all the extra work and difficulty we let
> ourselves in for; but most of them are conflicting, few
> apply to programming languages, and the design of
> experiments to test any or all of them would be extremely
> difficult.

I agree with everything but the last part. I think the experimental
study of how programmers comprehend and produce programs is tractable
right now.

> I have a favorite hypothesis, but won't go much into it
> here as it's pure speculation.  I think that a lot of
> good ideas emerge from poorly-understood or ambiguously-
> communicated bad ideas.  So our loose, imprecise,
> context-sensitive, ambiguous, complex languages may be
> helpful because they give us room to misunderstand
> each other's bad ideas.  People are forced to think
> "what did he mean?" (the extra work you're talking about)
> and come up with an answer that, hopefully, makes sense -
> in some cases even if what was actually meant makes less
> sense.  So maybe our difficulties communicating contribute
> to the iterative refinement of ideas.

I agree. There is interesting work on this topic, some of it in
education. I'm sure I can track down a PDF if you're interested.
From: Hakon Alstadheim
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432010A2.5070302@alstadheim.priv.no>
Sashank Varma wrote:
> Ray Dillinger wrote:
> 
>>I have a favorite hypothesis, but won't go much into it
>>here as it's pure speculation.  I think that a lot of
>>good ideas emerge from poorly-understood or ambiguously-
>>communicated bad ideas.  So our loose, imprecise,
>>context-sensitive, ambiguous, complex languages may be
>>helpful because they give us room to misunderstand
>>each other's bad ideas.  People are forced to think
>>"what did he mean?" (the extra work you're talking about)
>>and come up with an answer that, hopefully, makes sense -
>>in some cases even if what was actually meant makes less
>>sense.  So maybe our difficulties communicating contribute
>>to the iterative refinement of ideas.
> 
> I agree. There is interesting work on this topic, some of it in
> education. I'm sure I can track down a PDF if you're interested.
> 

This *may* be a good effect of bad communication. Cold comfort I say.

One *bad* effect is that people don't ever expect to understand fully 
all details of what they see/hear so they^H^H^H *I* listen 
half-attentively for the gist of what is said, extrapolate and then I 
am only ever able to hear what I expect, i.e. stuff I already know. This 
phenomenon is, I believe, one major reason that "the more things change, 
the more they stay the same", Greenspunning, and all manner of 
reinventing of wheels. I think accepting fuzzy communication makes this 
bad effect grow steadily worse. We get an endless stream of people 
weaned on "C/C++" that have a needlessly hard time grasping lisp, 
because people don't believe that it is possible to *listen*. They 
believe they have to guess at what they are hearing, and hence only hear 
permutations of the Algol paradigm of programming, and start hating 
parentheses.

P.S: I would like to thank Erik Naggum for planting this idea in my head 
over the course of many rants. This may be what he intended, or not. 
Anyway, thanks Erik.
From: Peter Seibel
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m28xy77ffl.fsf@gigamonkeys.com>
Hakon Alstadheim <·····@alstadheim.priv.no> writes:

> Sashank Varma wrote:
>> Ray Dillinger wrote:
>> 
>>>I have a favorite hypothesis, but won't go much into it
>>>here as it's pure speculation.  I think that a lot of
>>>good ideas emerge from poorly-understood or ambiguously-
>>>communicated bad ideas.  So our loose, imprecise,
>>>context-sensitive, ambiguous, complex languages may be
>>>helpful because they give us room to misunderstand
>>>each other's bad ideas.  People are forced to think
>>>"what did he mean?" (the extra work you're talking about)
>>>and come up with an answer that, hopefully, makes sense -
>>>in some cases even if what was actually meant makes less
>>>sense.  So maybe our difficulties communicating contribute
>>>to the iterative refinement of ideas.
>> I agree. There is interesting work on this topic, some of it in
>> education. I'm sure I can track down a PDF if you're interested.
>> 
>
> This *may* be a good effect of bad communication. Cold comfort I say.
>
> One *bad* effect is that people don't ever expect to understand fully
> all details of what they see/hear so they^H^H^H *I* listen
> half-attentively for the gist of what is said, extrapolate and then
> I am only ever able to hear what I expect, i.e. stuff I already
> know. This phenomenon is, I believe, one major reason that "the more
> things change, the more they stay the same", Greenspunning, and all
> manner of reinventing of wheels. I think accepting fuzzy communication
> makes this bad effect grow steadily worse. We get an endless stream of
> people weaned on "C/C++" that have a needlessly hard time grasping
> lisp, because people don't believe that it is possible to
> *listen*. They believe they have to guess at what they are hearing,
> and hence only hear permutations of the Algol paradigm of programming,
> and start hating parentheses.
>
> P.S: I would like to thank Erik Naggum for planting this idea in my
> head over the course of many rants. This may be what he intended, or
> not. Anyway, thanks Erik.

Here's a guy who didn't talk for 17 years, for reasons similar to the
ones you cite:

  <http://www.grist.org/news/maindish/2005/05/10/hertsgaard-francis/>

Actually comes across as a pretty down-to-earth guy in that interview.

-Peter

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Marco Antoniotti
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <YNKTe.78$DJ5.75826@typhoon.nyu.edu>
Ray Dillinger wrote:
> I remember that learning Perl (As of Perl 4, the last
> version I seriously knew) *felt* a lot more like learning
> a natural language than a programming language.  I
> recognized lots of constructions that were clearly just
> copied from the complexities of natural languages.  There
> was a forest of syntax and special cases, but it did seem
> to have a sort of grammatical pattern with contexts for
> different kinds of programming ideas and the "same" word
> doing related things in different contexts.
> 
> I also had this feeling that there was a sort of "gestalt"
> understanding lurking in the background, as with human
> languages; Having had to study French for a whole year or
> two before actually starting to think in it, I felt like
> I'd reached a new sort of understanding...  and learning
> Perl felt like that kind of understanding was back there
> somewhere, even though I never really got it.
> 
> I sort of lost interest in Perl when I saw that Perl 5
> was a fairly radical redesign; if I'm going to invest that
> kind of effort learning a language, I want the language to
> be rock-solid stable in its structure. I'll wait until
> it's finished, *then* learn it again.
> 

There are *very cool* things one can do with Perl :)
http://www.csse.monash.edu.au/~damian/papers/HTML/Perligata.html :)

Cheers
--
marco
From: David Steuber
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87psrgfs6e.fsf@david-steuber.com>
Marco Antoniotti <·······@cs.nyu.edu> writes:

> There are *very cool* things one can do with Perl :)
> http://www.csse.monash.edu.au/~damian/papers/HTML/Perligata.html :)

There are also some rather mundane things you can do with Perl:

  http://www.david-steuber.com/snippets/Perl_Is_Not_Lisp/

Although I confess I am quite proud of that bit of code, incomplete
though it may be.

-- 
You should maybe check the chemical content of your breakfast
cereal. --- Bill Watterson
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uzmqow5e4.fsf@nhplace.com>
Ray Dillinger <····@sonic.net> writes:

[various interesting remarks elided.  a very interesting post, i just
 don't have a lot of time to reply at this moment so will just add a
 thought or two.]

> I have a favorite hypothesis, but won't go much into it
> here as it's pure speculation.  I think that a lot of
> good ideas emerge from poorly-understood or ambiguously-
> communicated bad ideas.  So our loose, imprecise,
> context-sensitive, ambiguous, complex languages may be
> helpful because they give us room to misunderstand
> each other's bad ideas.  People are forced to think
> "what did he mean?" (the extra work you're talking about)
> and come up with an answer that, hopefully, makes sense -
> in some cases even if what was actually meant makes less
> sense.  So maybe our difficulties communicating contribute
> to the iterative refinement of ideas.

I also think the imprecision of language allows people to agree at
all.  e.g., consider things like "peace treaties", "constitutions",
and whatnot.  If the Bill of Rights had to be specified unambiguously,
it probably wouldn't have been written.  (Something that is almost
certainly relevant in the debate over whether "strict constructionism"
is an appropriate judicial theory.)  I don't think they would ever
have been written if language were more precise.

Also, if you assume that neural net programming is at least an
approximation to an understanding of how our brains work, then it's
pretty clear that an individual's history will necessarily lead to a
degree of overprecision in the definition of each word in each
individual's mind, and consequently none of us would be able to
understand one another if we insisted on a rigid alignment of
boundaries.  What I said was "tasty" or "pretty" or "fair" simply
cannot, by any practical means, turn out to be what you mean to more
than a certain degree of precision given our differing histories.

Having done so just yesterday, it's funny that I should have occasion
to again quote (or, at least, vaguely paraphrase) the late Prof. Bill
Martin from the computational linguistics course I took from him at
MIT years ago, but I think it an important observation: he more than
once emphasized that we have designed
language with the aim of understanding it.  This may seem like an
obvious constraint, but I think it's not.  I think what he was getting
at is that a proper understanding of meaning must be one that takes
into account our basic brain mechanisms, not one that fights
them. Because language has to bootstrap itself in the early years, and
if the human brain can't make sense of whatever rules we give to it,
we'd never get started communicating.

But having then a workable language, we're free of course to describe
things in it that go beyond the bounds of what the bootstrap language
constraints would allow.  But that doesn't mean we're always
well-advised to do so.

Probably it's both the case that we have pattern matching and parallel
hardware that is capable of being involved (which accounts for some of
the structural complexity of what we see) and it's also the case that
we have only a certain capacity to do easy processing, such that the
remarks that Sashank Varma made about how complexity complicates
things also come quickly into play as a problem grows beyond our
natural hardware's ability to glance at and parse something.  Patterns
that result in idiomatic uses probably work well with only slight
training, while patterns that require complex dynamic understanding
probably don't work well.

> Programming languages cannot yet benefit from that kind
> of ambiguity, so whether or not human languages do isn't
> really germane to the discussion at hand.

And perhaps inherently cannot, unless they become neural net based
or otherwise fuzzy/heuristic.  That is, the present style of programming
is precision-oriented in a way that is almost antithetical to this 
notion of "meaning".
From: lin8080
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4320A362.2C23BB24@freenet.de>
Ray Dillinger schrieb:

> I have a favorite hypothesis, but won't go much into it
> here as it's pure speculation.  I think that a lot of
> good ideas emerge from poorly-understood or ambiguously-
> communicated bad ideas.  So our loose, imprecise,
> context-sensitive, ambiguous, complex languages may be
> helpful because they give us room to misunderstand
> each other's bad ideas.  People are forced to think
> "what did he mean?" (the extra work you're talking about)
> and come up with an answer that, hopefully, makes sense -
> in some cases even if what was actually meant makes less
> sense.  So maybe our difficulties communicating contribute
> to the iterative refinement of ideas.

So, I can share your hypothesis. Suppose a beginner who only half
understands something explained to you his ideas/thoughts of how it
works; what would you think, knowing that he is saying stupid things?
And a maybe-nice idea is gone ...

stefan


For example:
When I was a 7-year-old boy, my parents allowed me to visit the
elderly neighbor. This man owned the only TV in the village. One day
I asked him to rewind the TV contents, because I wanted to see my
favorite series again (it was Flipper). And I won't forget the look
he gave me over my misunderstanding.
But this man kindly asked me to explain how I thought a TV works.
(That is, you buy a TV together with its contents and can watch a
limited sort of films; that's why there are so many different
models.) So he explained to me what radio, wireless transmission,
long-distance traffic and, finally, live TV really are. (And I
entered my first invisible world that day.) For weeks afterwards my
brain boiled over with half-understood things and what one can do
with them (frequency waves around the world and the hardware to make
them visible).
So, I can share your hypothesis...
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87r7c0dxj4.fsf@thalassa.informatimago.com>
"Sashank Varma" <············@yahoo.com> writes:

> Ray Dillinger wrote:
>> Peter Seibel wrote:
>> > Another part of me suspects that Larry Wall may be on to
>> > something when he says folks like different things to look different,
>> > thus all the different bits of syntax in Perl. (Certainly within Lisp
>> > there's evidence that special purpose syntax is occasionally
>> > considered useful, at least by some folks; c.f. the standard reader
>> > macros, different syntax for different kinds of numbers, LOOP,
>> > FORMAT.) If Wall's view is correct, then that means that Lisp's
>> > minimal syntax is a trade-off--we accept a uniform, and thus less
>> > intuitive (per Wall) syntax because of the many advantages it gives us
>> > in other respects (macros, ease of manipulation in editors, etc.) But
>> > that would also mean that if one were designing a domain-specific
>> > language that wasn't necessarily going to support, say, macros, then
>> > maybe the tradeoff would turn against s-exp syntax.
>> >
>> > So, does anyone know of any actual research into these issues--maybe
>> > by psychologists or human factors folks?
>>
>> Well, I'd say Larry Wall counts; he's originally a
>> (human) linguist who got into programming sort of
>> "sideways," if I recall correctly.
>
> As a psychologist who studies language comprehension, I've always
> viewed Wall's claim as more of a linguistic (or rather, a Wall-ian)
> slogan than a scientific claim.
>
> I don't know of any evidence that more syntactic variation leads to
> better comprehension. However, there is plenty of evidence that the
> more complex a syntactic construction, the more difficult it is to
> process. That is, it takes longer to comprehend, is understood less
> accurately, elicits more activation in the language areas of the brain,
> and is impaired in normal adults with small working memory capacity
> and in patients with damage to language areas. In fact, there is
> evidence that syntactic ambiguity is difficult to process. If there are
> several structural interpretations of a word sequence, having to choose
> between them has negative behavioral and brain consequences.


You should revise your terminology; I think there's some confusion.

In natural languages, what you call syntax reduces to a few
typographical characters:  , ; : . ! ? ' " ( ) -
and their use is actually very simple.  I'd call this "special characters",
and it's only a part of the _lexical_ structure of the written languages.

I don't think there's a natural tendency to add new special characters
to natural languages.  

On the other hand, a natural language may or may not have a complex
syntax, what is known as its grammar.  But for the most part, the
complexity of the grammars of natural languages doesn't matter: people
have a semantic understanding; they routinely produce and understand
"ungrammatical" sentences.  This is very important because there are a
lot of grammatical sentences that don't make sense anyway, and a lot of
grammatical sentences that make sense but for which the syntactic tree
doesn't help (they're syntactically ambiguous).


So do we want to speak about a tendency of some programmers (and
mathematicians?) to gratuitously add special characters to their
notations, or do we want to speak about the syntax?

Lisp has an additional syntax layer above the s-expr tree, as some
have noted previously.

(cl:if a b c d e) is ungrammatical, even if it's a well formed s-expression.

The lisp grammar rule is:

if --> '(' 'cl:if' condition then-form ')' .
if --> '(' 'cl:if' condition then-form else-form ')' .
(modulo the text vs. s-expr aspect).


The C grammar rule isn't much more complex:

if --> 'if' '(' expression_list ')' then_statement .
if --> 'if' '(' expression_list ')' then_statement 'else' else_statement .


I don't know whether, globally, C or perl or lisp has a more complex or
a simpler grammar.  But what we do know is that it's harder to type C
or perl, to find all these baroque special characters needed to type
these languages.  And what for, if the abstract tree in the end is the
same or very similar?


Another data point: I note, if I'm not misled, that the first thing
APL programmers do is to define names for their numerous
glyph-operators to make their programs readable and modifiable.
Well, in lisp all the operators already have readable names.


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Sashank Varma
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126156830.237505.131180@g14g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:

> "Sashank Varma" <············@yahoo.com> writes:
> > I don't know of any evidence that more syntactic variation leads to
> > better comprehension. However, there is plenty of evidence that the
> > more complex a syntactic construction, the more difficult it is to
> > process. That is, it takes longer to comprehend, is understood less
> > accurately, elicits more activation in the language areas of the brain,
> > and is impaired in normal adults with small working memory capacity
> > and in patients with damage to language areas. In fact, there is
> > evidence that syntactic ambiguity is difficult to process. If there are
> > several structural interpretations of a word sequence, having to choose
> > between them has negative behavioral and brain consequences.
>
> You should revise your terminology; I think there's some confusion.

I'm not using programming language terminology. I'm using linguistic
terminology. Is this the confusion?

> In natural languages, what you call syntax reduces to a few
> typographical characters:  , ; : . ! ? ' " ( ) -
> and their use is actually very simple.  I'd call this "special characters",

I would not call these characters syntax. I'd call them punctuation.

> and it's only a part of the _lexical_ structure of the written languages.

Actually, I'd say only the apostrophe is part of the lexical structure
of written natural language.

> On the other hand, a natural language may or may not have a complex
> syntax, what is known as its grammar.

It's commonly accepted that natural languages have syntactic structures
that can be described by grammars, whether formal or informal. A small
minority disagree with this, to be sure.

> But for the most part, the
> complexity of the grammars of natural languages doesn't matter: people
> have a semantic understanding; they routinely produce and understand
> "ungrammatical" sentences.  This is very important because there are a
> lot of grammatical sentences that don't make sense anyway, and a lot of
> grammatical sentences that make sense but for which the syntactic tree
> doesn't help (they're syntactically ambiguous).

This is true to a point. No one is arguing that syntax is a sufficient
account of language, only that it is a necessary part. Here are two
examples you might find interesting. The first is due to our very own
Kent Pitman (paraphrasing a linguistics teacher of his): You don't need
syntax to state semantically obvious things. For example, the
grammatical "Mouse eats cheese." and the ungrammatical "Eats mouse
cheese." are equally comprehensible sentences. However, you *do* need
syntax to state semantically surprising things, such as "Cheese eats
mouse."

The second example is related in Haiman (1985, p.4): "Jakobson drew
attention to the iconicity of Caesar's famous, and typical, 'veni,
vidi, vici', 'I came, I saw, I conquered'; Eric Kellerman (ms)
points out that the latter sequence is emphatically not synonymous with
'I saw, I conquered, I came', which 'receives a different, but no
less iconic, interpretation'."

Haiman, J. (1985). Natural syntax: Iconicity and erosion. Cambridge:
Cambridge University Press.
From: Alan Crowe
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <86fyselzsi.fsf@cawtech.freeserve.co.uk>
Pascal Bourguignon
> The C grammar rule isn't much more complex:
>
> if --> 'if' '(' expression_list ')' then_statement .
> if --> 'if' '(' expression_list ')' then_statement 'else' else_statement .

There is a point lurking here that has long bothered
me. Programming in assembler, there are two basic cliches:

1)Some instructions are OPTIONAL and may be skipped

         TEST ...
         COND-JUMP skip
         THIS
         THAT
  skip   ....

2)Two sequences of instructions are ALTERNATIVES

          TEST ...
          COND-JUMP skip
          THIS
          THAT
          JUMP rejoin
  skip    CALC
          COMP
  rejoin  ...

It seems natural for a high level language to have an OPT
construct corresponding to the first case, and an ALT
construct corresponding to the second, 

OPT boolean-expression MAYBE then_statement

ALT boolean-expression EITHER then_statement OR else_statement

The tradition is to use IF for both OPT and ALT, leading to
the dangling else problem: Does

IF a THEN IF b THEN c ELSE d 

mean

OPT a MAYBE
  ALT b EITHER c OR d

or does it mean

ALT a EITHER 
        OPT b MAYBE c
      OR d

Typically the dangling else is resolved in theory by an
operator precedence rule, so that additional brackets are
sometimes needed and sometimes not.  This approach
seems to have failed, and in practice style guides recommend
using additional brackets in both cases in order to make the
meaning clearer.

Kernighan and Ritchie give a grammar for C, suitable for
input to an automatic parser-generator, and note:
 
   It has only one conflict, generated by the if-else
   ambiguity

Peter Seibel asks about the origin of the drive for complex
syntax. The if-else ambiguity provides an interesting
example of it. if-then and if-then-else rather than
opt-maybe and alt-either-or have a sacred status. The
if-else ambiguity is portrayed as a deep fact of computer
science, not an error in language design. 

Why do people love flaws and complications? What holds them
back from finishing their work, doing away with if, and
removing the last remaining conflict from a language
grammar?

Alan Crowe
Edinburgh
Scotland
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87zmql959f.fsf@thalassa.informatimago.com>
Alan Crowe <····@cawtech.freeserve.co.uk> writes:
> OPT boolean-expression MAYBE then_statement
> ALT boolean-expression EITHER then_statement OR else_statement
> [...]
> Typically the dangling else is resolved in theory by an
> operator precedence rule, so that additional brackets are
> some times needed, and some times not needed. This approach
> seems to have failed and in practise style guides recommend
> using additional brackets in both case in order to make the
> meaning clearer.
> [...]
> Why do people love flaws and complications? What holds them
> back from finishing their work, doing away with if, and
> removing the last remaining conflict from a language
> grammar?

Well, in lisp, there's IF and WHEN or UNLESS.

-- 
"Specifications are for the weak and timid!"
From: Alan Crowe
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <86wtlpgrel.fsf@cawtech.freeserve.co.uk>
Pascal Bourguignon wrote:
> Well, in lisp, there's IF and WHEN or UNLESS.

No, Lisp is different again. Lisp avoids if-else ambiguity
by being fully parenthesised. Without the parenthesis

IF a IF b c d

could mean `IF a WHEN b c d' or `WHEN a IF b c d'
(which is still ambiguous :-()
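
With the parentheses written out the two readings are spelled
differently, so the question never arises:

  (if a (if b c d))      ; D is the else-branch of the inner IF
  (if a (if b c) d)      ; D is the else-branch of the outer IF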

Current style guidance is to freely use IF for both if-then
and if-then-else, and to reserve WHEN for optional side-effects.
This does make me wonder - would it be better to use WHEN
for all if-then's and IF exclusively for if-then-else?

I used to be interested in language design. My language was
going to have OPT and ALT (conceptually; I never fixed on
the choice of keywords, perhaps "if then" and "as either or"
is good.) Then I discovered Common Lisp. (Perhaps I should
say rediscovered CL, but that is a long story.) It provided
adequate solutions to the problems, with the languages I
knew, that motivated my interest in language design. It
also opened new horizons.

A software project has many documents. Request for
proposal. Proposal. Contract. Requirements
analysis. Specification. User manual. User training course
materials. Design documents. Modularisation plan. Module
interface specifications. Mathematical analysis of numerical
routines. Use cases. Unit Tests. 

The more formal the document, the greater the scope for
machine assistance in its preparation. As we approach the
hardware the processing of the documents becomes fully
automated with the final stages, compilation of HLL source
to assembler listing, assembly of assembler listing into
object file, linking and loading of object files into the
core image, proceeding without human intervention.

In the context of the design of computer programming
languages, the concept of syntax is the idea that a certain
intermediate level document, called the program source, is
to be privileged as a sacred text to be prepared manually.

Notice that there are two opposing ideas here. One is the
resigned acceptance that we are ignorant and unable to
create the tools to generate program source
automatically. This leads to a system, such as CL's defmacro,
designed to support an opportunistic approach: spotting
the cases where we can automate the generation of source code
and making the most of these lucky breaks. (And of course to
the notion of making one's own luck.)
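
A toy illustration of such a lucky break (the macro and the file
name are invented for the example): having spotted that every
logging form repeats the same WITH-OPEN-FILE boilerplate, we let
a macro generate that part of the source for us.

    (defmacro with-log-file ((stream path) &body body)
      `(with-open-file (,stream ,path
                        :direction :output
                        :if-exists :append
                        :if-does-not-exist :create)
         ,@body))

    ;; usage
    (with-log-file (log "/tmp/run.log")
      (format log "run started~%"))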

The other idea is "syntax", a system of abbreviations and
shortcuts tuned to a fingers-on-keyboard approach to the
program source. The notion of automatic processing of the
source format is rejected, leading to a system, such as Perl,
in which "source filters always break":

http://search.cpan.org/~chips/Perl6-Subs-0.05/lib/Perl6/Subs.pm

     This module is a source filter. Source filters always
     break. For example, the breakage caused by parameter
     names that turn into Perl quoting operators when their
     sigils are stripped may never be fixed.

From this perspective, the difference between CL and other
languages is sociological and organizational. Are some
documents reserved for manual processing and off-limits to
automation? DEFMACRO violates the natural order. Instead of
turning source into object, it turns notation into source,
getting above itself and undertaking activities reserved to
mankind. Perhaps the real obstacle to the widespread
adoption of CL is that the language is inherently
blasphemous.

Alan Crowe
Edinburgh
Scotland
From: Rob Warnock
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <hIydnVozm-k4f7neRVn-rg@speakeasy.net>
Alan Crowe  <····@cawtech.freeserve.co.uk> wrote:
+---------------
| Why do people love flaws and complications? What holds them
| back from finishing their work, doing away with if, and
| removing the last remaining conflict from a language grammar?
+---------------

See <http://www.dreamsongs.com/WorseIsBetter.html> for one
explanation...


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Stefan Nobis
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87ek809fp5.fsf@snobis.de>
"Sashank Varma" <············@yahoo.com> writes:

> I don't know of any evidence that more syntactic variation leads
> to better comprehension. However, there is plenty of evidence
> that the more complex a syntactic construction, the more
> difficult it is to process.

As you mention this, Chinese comes to mind: it has rather complex
syntax, but it seems they are not too happy with it, because since
(IIRC) the 1950s the syntax has been simplified (for example compare the
traditional and the simplified sign for "dragon").

IIRC, in most languages the irregular verbs are becoming fewer [argh,
this sounds completely wrong -- how do you say this in correct
English?]. Also, variations of languages are not as common today as
some decades ago.

My (uneducated) view is that languages tend to become simpler, so I
would say a complex syntax is not really winning.

-- 
Stefan.
From: Frederic Beal
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <slrndi00ko.cf5.beal@clipper.ens.fr>
On 2005-09-08, Stefan Nobis <······@gmx.de> wrote:
> "Sashank Varma" <············@yahoo.com> writes:
>
>> I don't know of any evidence that more syntactic variation leads
>> to better comprehension. However, there is plenty of evidence
>> that the more complex a syntactic construction, the more
>> difficult it is to process.
>
> As you mention this, Chinese comes to mind: it has rather complex
> syntax, but it seems they are not too happy with it, because since
> (IIRC) the 1950s the syntax has been simplified (for example compare the
> traditional and the simplified sign for "dragon").

No, no, signs are not syntax (I mean, if I decided to write first using
only capital letters and then only lowercase letters, you wouldn't say the
difference is in the syntax - the difference is in the shape of the
characters, which has nothing to do with grammar.)

> My (uneducated) view is that languages tend to become simpler, so I
> would say a complex syntax is not really winning.

From the fact that languages are becoming simpler, one should not infer that
languages should become simpler or that simple languages are winning.

-- 
 Frederic
From: Lars Brinkhoff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <8564tcrlx0.fsf@junk.nocrew.org>
Stefan Nobis <······@gmx.de> writes:
> Chinese comes to mind: it has rather complex syntax, but it seems they
> are not too happy with it, because since (IIRC) the 1950s the syntax
> has been simplified (for example compare the traditional and the
> simplified sign for "dragon").

Does the composition of individual characters qualify as "syntax"?
From my (incomplete) understanding of Chinese grammar, I'd say it's
quite simple.
From: jayessay
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m31x3z4l31.fsf@rigel.goldenthreadtech.com>
"Sashank Varma" <············@yahoo.com> writes:

> Ray Dillinger wrote:
> >
> > Well, I'd say Larry Wall counts; he's originally a
> > (human) linguist who got into programming sort of
> > "sideways," if I recall correctly.
> 
> As a psychologist who studies language comprehension, I've always
> viewed Wall's claim as more of a linguistic (or rather, a Wall-ian)
> slogan than a scientific claim.
> 
> I don't know of any evidence that more syntactic variation leads to
> better comprehension. However, there is plenty of evidence that the
> more complex a syntactic construction, the more difficult it is to
> process. That is, it takes longer to comprehend, is understood less
> accurately, elicits more activation in the language areas of the brain,
> and is  impaired in normal adults with small working memory capacity
> and in patients with damage to langugage areas. In fact, there is
> evidence that syntactic ambiguity is difficult to process. If there are
> several structural interpretations of a word sequence, having to choose
> between them has negative behavioral and brain consequences.

I wonder what that Harrop guy would think about this?  No wait, he
thinks people like you are the equivalent of "creation scientists",
and that his own beliefs, while stunningly naive, are important
"objective" observations.  Go figure.


/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uk6htbhom.fsf@nhplace.com>
Peter Seibel <·····@gigamonkeys.com> writes:

> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax.

I suspect people just want to reduce everything to a solved problem
... whether or not that "reduction" loses valuable/essential content
in the process.

People have learned that programming involves learning syntax and are
confused when something doesn't match the mold.  It doesn't seem
syntactically like the solution they were expecting and so they fight
it.  Like if you go to the doctor looking for a pill and he tells you to
exercise or eat right instead... Sometimes those don't seem like the
kind of "medicine" you expect a doctor to offer you.  Lisp and its
syntax is, in this metaphor, the acupuncture of the syntax remedy world.
And whether you can use it depends not only on how open-minded you are,
but on what your medical plan is prepared to acknowledge as legitimate
treatment.

I have more to say on this, but I'll spare you tonight.  The following
post is related in my mind because of the contained story about 
Prof. Bose, though it makes some other useful points as well:

http://groups.google.com/group/comp.lang.lisp/msg/b81cfb3ee2e166f6
From: Stefan Nobis
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87br35jp7d.fsf@snobis.de>
Peter Seibel <·····@gigamonkeys.com> writes:

> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate

Be assured, there is at least one person new to Lisp who likes the
simple syntax -- and before Lisp I learned languages like C++,
Ada, Perl, and Ruby, and I always preferred C++ over Java.

The syntax of Lisp, to which I became accustomed in only a few hours,
is just beautiful.

> So, does anyone know of any actual research into these issues--maybe
> by psychologists or human factors folks?

No, sorry. I'm interested in these things too; I've always wondered
why linguists and psychologists are missing from language design
teams (as far as I know).

-- 
Stefan.
From: Aleksander Nabaglo
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <dfmfe6$mcr$1@srv.cyf-kr.edu.pl>
!

> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
I feel discomfort seeing many))))))).
(Maybe (defining matching { and } as equivalent to ( and )
      {would (greatly (help
             {visual perception (?)
                            })) }))

-- 
Aleksander
.
From: ······@earthlink.net
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126103577.008795.31470@g49g2000cwa.googlegroups.com>
Aleksander Nabaglo wrote:
> I feel discomfort seeing many))))))).

Interlisp allowed [ as an open paren with ] meaning "enough close
parens to match back to the most recent enclosing [".
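
For example (from memory, so the details may be off), the ] in

    [SETQ X (CONS (CAR Y) (CDR Z]

closes the two inner parens and the [ in one stroke, so it reads
the same as

    (SETQ X (CONS (CAR Y) (CDR Z)))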

-andy
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <874q8whkjd.fsf@thalassa.informatimago.com>
Aleksander Nabaglo <·@ap.krakow.pl> writes:
>> In my travels among the LispNots I constantly run into people--even
>> new programmers--who *want* elaborate syntax. (I.e. more elaborate
>> than parens and prefix notation.) One part of me thinks that maybe
>> this is due to the pervasive influence of non-Lisp languages; even
> I feel discomfort seeing many))))))).
> (Maybe (defining matching { and } as equivalent to ( and )
>       {would (greatly (help
>              {visual perception (?)
>                             })) }))

It's dumb.  It forces you to give more attention to parentheses than needed.
Now you have to match ([{ vs )]}  
What if there's a mismatch like above?
Do you know what the reader will do?
Are you puzzled by the mismatch?
Why are you spending time and neurons on this trivial stuff?


Free your mind, forget C's [], {}, () and use only lisp's () !


-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

Nobody can fix the economy.  Nobody can be trusted with their finger
on the button.  Nobody's perfect.  VOTE FOR NOBODY.
From: Aleksander Nabaglo
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <dfp11m$hev$1@srv.cyf-kr.edu.pl>
!

>> (Maybe (defining matching { and } as equivalent to ( and )
>>       {would (greatly (help
>>              {visual perception (?)
>>                             })) }))
> 
> It's dumb.  It forces you to give more attention to parentheses than needed.
> Now you have to match ([{ vs )]}  
Alternating )})})} at every level doesn't help;
alternating only at every third or fourth level does.

> What if there's a mismatch like above?
> Do you know what the reader will do?
The reader would catch the programmer's errors.

> Are you puzzled by the mismatch?
I instantly perceived that something was wrong before consciously
understanding what exactly was going on.

> Why are you spending time and neurons on this trivial stuff?
I would like to see the structure of nested do's

(do ((   (do-in-init-form ...) (do-in-step-form ...)
(do-in-result-form) (do-in-statement (( 
(do-in-initform-of-do-in-statement ((  ...

without the help of syntax highlighting.

> Free your mind, forget C's [], {}, () and use only lisp's () !
Easy to say, harder to make mind self made free(mind);

-- 
Aleksander
.
From: Wade Humeniuk
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <NYNTe.166380$wr.37464@clgrps12>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp. Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl. (Certainly within Lisp
> there's evidence that special purpose syntax is occasionally
> considered useful, at least by some folks; c.f. the standard reader
> macros, diffferent syntax for different kinds of numbers, LOOP,
> FORMAT.) If Wall's view is correct, than that means that Lisp's
> minimal syntax is a trade-off--we accept a uniform, and thus less
> intuitive (per Wall) syntax because of the many advantages is gives us
> in other respects (macros, ease of manipulation in editors, etc.) But
> that would also mean that if one were designing a domain-specific
> langauge that wasn't necessarily going to support, say, macros then
> maybe the tradeoff would turn against s-exp syntax.
> 
> So, does anyone know of any actual research into these issues--maybe
> by psychologists or human factors folks?
> 

I do not think this line of thinking is valid when it comes to
programming.  Programming in its present form has to be unambiguous,
essentially because the program has to be written so that compilers can
translate it to bits.  Just as modern aircraft are like but
unlike birds, so too do computer languages have to be like
but unlike human languages.  To move ahead in an objective
scientific fashion one has to dump the psychological
baggage.  To assume that the best programming language
will somehow be modelled after human thinking is
folly.  Humans are not the center of the universe.
I think the view of computing as creating a machine
"in our image" is flawed.  One of the reasons I like
Lisp is that it has an alien objective nature, and that is
probably why many hate it.

Wade
From: Robert Maas, see http://tinyurl.com/uh3t
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <REM-2005sep07-004@Yahoo.Com>
> From: Wade Humeniuk <··················@telus.net>
> Programming in its present form has to be unambiguous, ...

I agree. But note that the purpose of programming is to tell the
computer exactly how to solve a problem by using exactly an algorithm
that some human invented in the first place. For this purpose, only a
very precise unambiguous computer programming language suffices.
Witness the many times a slave disobeys its master, an employee
disobeys its boss, simply because of a misunderstanding of what the
command was. See later for an alternative goal with different best
language.

> To assume that the best programming language will somehow be modelled
> after human thinking is folly.

I agree, if the purpose is as stated above. But on the other hand,
designing a system that can conceive new ideas that no human thought of
first might require a different model. A human or other living system is
one model, not yet realized effectively in A.I. software. Genetic
algorithms are another model, which have already achieved good results in
the narrow fields implemented to date. Maybe a
"programming" language which uses genetic algorithms internally would
be useful for solving a wide range of optimization problems. You write
a set of declarations that state what the starting configuration is,
what kinds of transformations are allowed, and what computed value
needs to be maximized. You then run it a while and see what it reports
it discovered. If it is generating totally wrong stuff, that's a bug,
and you fix it by adding a new declaration to forbid whatever wrong
direction it was exploring (whatever class of mutations it was
mistakenly generating). Perhaps there could be a nice GUI which
summarized the directions it was exploring, and you could click on part
of that display to discourage exploration in that direction, sort of
like trimming a growing plant except that the g.a. system is smart
enough to reflect the trimming action back to what caused it and
negotiate with the programmer what specifically needs to be changed in
the program to avoid needing that particular trimming ever again.
(Wouldn't it be nice if ants were like that? You find ants coming into
your kitchen just once, spray poison where they are coming in, and the
ant colony changes its exploration algorithm to never again explore
into your kitchen.)
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uvf1cw4yr.fsf@nhplace.com>
Wade Humeniuk <··················@telus.net> writes:

> [...] To assume that the best programming language
> will somehow be modelled after human thinking is
> folly.  Humans are not the center of the universe. [...]

This depends some on the purpose of programming.

Programs and programming languages are designed for particular purposes
and are only rightly judged good or bad against those purposes.

Programs will surely be easier to read if they harmonize with the hardware
facilities humans have for parsing.

e.g., they should be sufficiently simple at the token level for the
human eye to parse, yet sufficiently dense that the human eye is not
forced to scan linearly to take everything in when it could just as
well have recognized things in a more compact way.  (this explains why
octal, decimal, or hex is better than binary for most purposes.  i
don't know of studies, but i'm sure it must be easily shown that binary
is too sparse for the human brain to easily parse under normal
circumstances, while base 36 is perhaps too dense to meaningfully
interpret.  and surely base 1000 would require so many tokens that humans
would have a hard time.)

Simplicity is good, but excessive simplicity is not helpful. That's why 
again we don't write stuff in straight binary.  We encode things into 
characters.  We don't even write text in Esperanto if we can help it, we
opt for something like English.  But we don't go the other extreme and try
to use Base64-encoded English either...

There are programming languages that interact only in binary, and protocols
between them can often work fine that way.  There are also boundaries where
programs but not people will try to interpret what's going through, and in
those cases, things like IP packet structure (network) or XML (textual piping)
are examples of structured formats that work well.

> I think the view of computing as creating a machine
> "in our image" is flawed.

Depends on what you plan to do with it.

> One of the reasons I like
> Lisp is that it has an alien objective nature, and that is
> probably why many hate it.

Funny, one of the reasons I like it is that it maps so neatly into 
the kinds of conceptual groupings that I imagine either programs or
machines to like, and it also is a neat notation that is both accessible
to humans and machines.  Rather than saying it's alien, I might prefer to
say it's more like a negotiated compromise between man and machine. ;)
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005090809093316807%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-07 23:10:37 -0400, Wade Humeniuk 
<··················@telus.net> said:

> To assume that the best programming language
> will somehow be modelled after human thinking is
> folly.  Humans are not the center of the universe.

Best for what purpose?

There is no objective best:

Q: Why does sh*t smell so bad?
A: If you were a fly it would taste like candy.

Since programming languages are used by programmers, and programmers 
are still human beings, then, as far as programming language design is 
concerned, human beings *are* the center of the universe.

The best programming language for human beings is one that leverages 
human beings' formidable skills in natural language, not one that works 
against our natural language expectations. This is part of the reason 
why lisp macros are so powerful - they allow the straightforward 
extension of the language to meet natural language expectations in the 
problem domain.

Maybe one day, when we have a full blown human level AI, computers will 
be programmers. Then it would be correct to say - wrt programming - 
that humans are not the center of the universe. But we are not there 
yet by a long shot.

regards
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4320484f$1@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-07 23:10:37 -0400, Wade Humeniuk 
> <··················@telus.net> said:
> 
>> To assume that the best programming language
>> will somehow be modelled after human thinking is
>> folly.  Humans are not the center of the universe.
> 
> 
> Best for what purpose?
> 
> There is no objective best:
> 
> Q: Why does sh*t smell so bad?
> A: If you were a fly it would taste like candy.
> 
> Since programming languages are used by programmers, and programmers are 
> still human beings, then, as far as programming language design is 
> concerned, human beings *are* the center of the universe.
> 
> The best programming language for human beings is one that leverages 
> human beings' formidable skills in natural language, not one that works 
> against our natural language expectations. This is part of the reason 
> why lisp macros are so powerful - they allow the straightforward 
> extension of the language to meet natural language expectations in the 
> problem domain.

I would agree except that I would argue that the programming language's 
value lies in leveraging the human's ability to reason abstractly 
(mathematically, if you will), i.e. to program. I do not see where natural
languages come into it at all.

I am beginning to wonder if the more or less nonsensical parallels
people try to draw between programming and natural languages are just a
consequence of the fact that we happened to call it programming 
_languages_? Would mathematicians argue about how mathematical notation 
could leverage human abilities in math by being closer to natural 
languages had it been called mathematical languages?


Björn
From: Ray Dillinger
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <KcZTe.12667$p%3.51774@typhoon.sonic.net>
Björn Lindberg wrote:

> I am beginning to wonder if the more or less nonsensical parallels
> people try to draw between programming and natural languages are just a
> consequence of the fact that we happened to call it programming 
> _languages_? Would mathematicians argue about how mathematical notation 
> could leverage human abilities in math by being closer to natural 
> languages had it been called mathematical languages?

In fact I have observed mathematicians having long and heated
arguments about just that topic.

				Bear
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005090823123975249%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-08 10:18:55 -0400, Björn Lindberg <·····@runa.se> said:

> I am beginning to wonder if the more or less nonsensical parallels
> people try to draw between programming and natural languages are just a
> consequence of the fact that we happened to call it programming 
> _languages_?

The parallels are not by and large nonsensical. We do most of our 
reasoning in what is known to linguists as "mentalese." Mentalese is 
*not* natural language. However, because we learn to speak and listen 
early in life and continue to do so daily for the rest of our lives, 
almost all of us are so adept at translating between mentalese and one 
or more natural languages that it often seems to us that we actually 
think in language.

If a programming medium is to be accessible to most people it should
leverage our principal medium (natural language) for communicating 
thought (mentalese). That is why we speak of programming "languages" 
and why language design has drawn heavily and should continue to draw 
heavily on natural languages.

regards
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <43213013$1@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-08 10:18:55 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> I am beginning to wonder if the more or less nonsensical parallels
>> people try to draw between programming and natural languages are just a
>> consequence of the fact that we happened to call it programming 
>> _languages_?
> 
> 
> The parallels are not by and large nonsensical. We do most of our 
> reasoning in what is known to linguists as "mentalese." Mentalese is 
> *not* natural language. However, because we learn to speak and listen 
> early in life and continue to do so daily for the rest of our lives, 
> almost all of us are so adept at translating between mentalese and one 
> or more natural languages that it often seems to us that we actually 
> think in language.
> 
> If a programming medium is to be accessible to most people it should
> leverage our principal medium (natural language) for communicating 
> thought (mentalese).

I didn't have a programming medium 'accessible to most people' in my 
mind. Rather, a programming medium for experts.

> That is why we speak of programming "languages" and 
> why language design has drawn heavily and should continue to draw 
> heavily on natural languages.

Really? Can you give some examples of where programming languages have 
successfully drawn on natural languages? I'll add that I consider COBOL, 
SQL and Perl to be good examples of where such experiments failed.


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005090908564250073%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-09 02:47:46 -0400, Björn Lindberg <·····@runa.se> said:

> I didn't have a programming medium 'accessible to most people' in my 
> mind. Rather, a programming medium for experts.

I think you misunderstand me. Even "experts" are more adept at 
translating mentalese to natural language than to anything else. This 
is universally true of human beings who have normal language 
capabilities. Simply put, there is such a large amount of cortical 
resources genetically programmed to deal with natural language, and we 
begin learning and using it so early in life that it is no surprise 
that natural language is the communications medium we are by far most 
fluent in. Here we are in c.l.l, among "experts," and much, even most 
of our discussion is conducted in natural language, not lisp code. 
Programming media should leverage this hard wired, well practiced 
natural language capability, not work against it.

What you are arguing for is a programmer priesthood. You want a world 
where only people who are experts in programming can program. This 
would exclude for example experts in chemistry writing software to do 
research in chemistry, or biologists in biology, etc. Given the 
historical trend toward wider, not more narrow computer use the idea 
that we should retroactively make programming less accessible to domain
experts is a non-starter.

As for examples of computer languages that leverage natural language, I 
have already pointed out that common lisp macros leverage our natural 
language abilities by allowing the extension of common lisp to resemble 
natural language constructs in the problem domain.

The failure of such pseudo-natural-language programming languages as 
AppleScript comes of their giving the superficial appearance that they 
are as tolerant of variation and error as a human listener when in fact 
they are just as strict in many ways as fortran or C.
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4321a39d$1@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-09 02:47:46 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> I didn't have a programming medium 'accessible to most people' in my 
>> mind. Rather, a programming medium for experts.
> 
> 
> I think you misunderstand me. Even "experts" are more adept at 
> translating mentalese to natural language than to anything else. This is 
> universally true of human beings who have normal language capabilities. 
> Simply put, there is such a large amount of cortical resources 
> genetically programmed to deal with natural language, and we begin 
> learning and using it so early in life that it is no surprise that 
> natural language is the communications medium we are by far most fluent 
> in.

I know and agree with that. However, I see nothing to say that that 
ability transfers to programming. Programming languages and natural 
languages are used to express vastly different things. The difficulties 
I experience when learning a new natural language do not have anything
in common with the difficulties I have expressing myself programmatically.

> Here we are in c.l.l, among "experts," and much, even most of our 
> discussion is conducted in natural language, not lisp code. Programming 
> media should leverage this hard wired, well practiced natural language 
> capability, not work against it.

I have a feeling programming media are better designed to leverage other
abilities.

> What you are arguing for is a programmer priesthood. You want a world 
> where only people who are experts in programming can program. This would 
> exclude for example experts in chemistry writing software to do research 
> in chemistry, or biologists in biology, etc. Given the historical trend 
> toward wider, not more narrow computer use the idea that we should 
> retroactively make programming less accessible to domain experts is a
> non-starter.

I am not *arguing for* a programmer priesthood. But I suspect that 
reality is such that programming requires a certain unique expertise. As 
such, expert programming is and will be reserved for experts. Of course 
there is nothing prohibiting us from having limited, beginner-friendly ways
of programming as well, and we do.

> As for examples of computer languages that leverage natural language, I 
> have already pointed out that common lisp macros leverage our natural 
> language abilities by allowing the extension of common lisp to resemble 
> natural language constructs in the problem domain.

Did you? I must have missed that. In any case, if it is LOOP that you
are thinking of, the strength of LOOP does not lie in its resembling
natural language. It lies in what LOOP lets you do in terms of
expressing complex iteration. Many people prefer the ITERATE macro, with
similar capabilities and s-expression syntax. I probably would too if I
could be bothered to learn it, but LOOP happens to be good enough and in
the standard.
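
For the record, the same iteration both ways (the ITERATE version
is from memory, so treat it as a sketch; it assumes the ITERATE
package is loaded):

    ;; LOOP, from the standard
    (loop for x in xs
          when (evenp x)
            collect (* x x))

    ;; ITERATE, with s-expression clauses
    (iter (for x in xs)
          (when (evenp x)
            (collect (* x x))))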

> The failure of such pseudo-natural-language programming languages as 
> AppleScript comes of their giving the superficial appearance that they 
> are as tolerant of variation and error as a human listener when in fact 
> they are just as strict in many ways as fortran or C.

I am not familiar with AppleScript, so let's talk about SQL instead. SQL 
was intended as a natural-language syntax for relational algebra. Maybe
(I don't know) SQL actually makes it a tiny bit easier for some people 
who don't know relational algebra to construct basic queries, but for 
the power user SQL is nothing but a headache. The irregular, complex
syntax makes different types of queries hard to remember. It is hard to 
compose. It can be difficult to see how it maps to relational algebra, 
which after all is what really happens.


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <200509091355258930%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-09 11:00:44 -0400, Björn Lindberg <·····@runa.se> said:

> I am not *arguing for* a programmer priesthood. But I suspect that 
> reality is such that programming requires a certain unique expertise. 
> As such, expert programming is and will be reserved for experts. Of 
> course there is nothing prohibiting us from having limited,
> beginner-friendly ways of programming as well, and we do.

Which is why I like common lisp. It allows expert programmers to 
program expertly but also allows the creation of domain specific 
embedded languages so that domain experts can program as well. This is 
also my fundamental objection to languages that force a certain 
paradigm (e.g. functional) on the user - they make it very difficult to 
extend the language in a way that is idiomatic to domain experts.

I further suspect that as time goes on, more and more of the issues
that require programming expertise to deal with now (e.g., threading) 
will be provided by languages or operating systems[1] in such a way 
that people who are domain experts but by no means expert programmers 
will be able to safely use them. This will continue the trend toward a 
layered approach where the user layer need know little or nothing about
the lower layers if they are implemented properly (ask the average Java 
programmer about memory management). This parallels the transition from 
CLI to GUI, only it is taking place a generation later - broader
computer use by a much larger group of people with less computer 
expertise.

Expert programmers will of course be needed to build this 
infrastructure but it will be used more by non-expert programmers just 
as most of the people who use a Java garbage collector today would have 
no idea how to write a gc themselves.

regards


[1] though not as add on libraries apparently - see:
<http://lambda-the-ultimate.org/node/view/950>
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oct4oF5bplfU1@individual.net>
Björn Lindberg wrote:
>> If a programming medium is to be accessible to most people it should
>> leverage our principal medium (natural language) for communicating 
>> thought (mentalese).
> 
> I didn't have a programming medium 'accessible to most people' in my 
> mind. Rather, a programming medium for experts.

Right, like we don't want natural languages for kids, but real natural 
languages that can express a lot (even good, interesting prose or poetry).

Interestingly, most adults learn foreign languages just fine if they 
work on it.

>> That is why we speak of programming "languages" and why language 
>> design has drawn heavily and should continue to draw heavily on 
>> natural languages.
> 
> Really? Can you give some examples of where programming languages have 
> successfully drawn on natural languages? I'll add that I consider COBOL, 
> SQL and Perl to be good examples of where such experiments failed.

Add AppleScript there :-/

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Thomas A. Russ
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <ymiacimdj49.fsf@sevak.isi.edu>
Ulrich Hobelmann <···········@web.de> writes:

> >> That is why we speak of programming "languages" and why language 
> >> design has drawn heavily and should continue to draw heavily on 
> >> natural languages.
> > 
> > Really? Can you give some examples of where programming languages have 
> > successfully drawn on natural languages? I'll add that I consider COBOL, 
> > SQL and Perl to be good examples of where such experiments failed.
> 
> Add AppleScript there :-/

Although PostScript, being a stack-based, postfix language, must surely
have been inspired by German, nicht?

-- 
Thomas A. Russ,  USC/Information Sciences Institute
From: mikel
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <NLJUe.25666$z6.35364897@news.sisna.com>
Ulrich Hobelmann wrote:
> Björn Lindberg wrote:
> 
>>> If a programming medium is to be accessible to most people it should
>>> leverage our principal medium (natural language) for communicating 
>>> thought (mentalese).
>>
>>
>> I didn't have a programming medium 'accessible to most people' in my 
>> mind. Rather, a programming medium for experts.
> 
> 
> Right, like we don't want natural languages for kids, but real natural 
> languages that can express a lot (even good, interesting prose or poetry).
> 
> Interestingly, most adults learn foreign languages just fine if they 
> work on it.
> 
>>> That is why we speak of programming "languages" and why language 
>>> design has drawn heavily and should continue to draw heavily on 
>>> natural languages.
>>
>>
>> Really? Can you give some examples of where programming languages have 
>> successfully drawn on natural languages? I'll add that I consider 
>> COBOL, SQL and Perl to be good examples of where such experiments failed.
> 
> 
> Add AppleScript there :-/

But AppleScript is essentially a network protocol dressed up in a
programming language costume.
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <u7jdpb0bl.fsf@nhplace.com>
Björn Lindberg <·····@runa.se> writes:

> > That is why we speak of programming "languages" and why language
> > design has drawn heavily and should continue to draw heavily on
> > natural languages.
> 
> Really? Can you give some examples of where programming languages have
> successfully drawn on natural languages? I'll add that I consider
> COBOL, SQL and Perl to be good examples of where such experiments
> failed.

I think that's a pretty bold claim.

First, none of those languages can be objectively said to be failures.

Second, you haven't identified the parts of the language that tried to
draw on natural language nor analyzed the effect of having done so
against a control of what would happen if you didn't do this.
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4322f743$1@news.cadence.com>
Kent M Pitman wrote:
> Björn Lindberg <·····@runa.se> writes:
> 
> 
>>>That is why we speak of programming "languages" and why language
>>>design has drawn heavily and should continue to draw heavily on
>>>natural languages.
>>
>>Really? Can you give some examples of where programming languages have
>>successfully drawn on natural languages? I'll add that I consider
>>COBOL, SQL and Perl to be good examples of where such experiments
>>failed.
> 
> 
> I think that's a pretty bold claim.
> 
> First, none of those languages can be objectively said to be failures.

It depends on how you define failure I guess. I consider Perl to be a 
failure because it makes people bad programmers and encourages bad 
programming.

> Second, you haven't identified the parts of the language that tried to
> draw on natural language nor analyzed the effect of having done so
> against a control of what would happen if you didn't do this.

Right. No one has done this to my knowledge, which means that neither I 
nor someone of the opposite opinion can point to any such studies. My 
opinion is based on two things: Firstly, my own experience of different 
languages. I consider Lisp to be much closer to "the right way" to move 
forward than Perl, for example. And the properties of Lisp that make me
think so do not share any similarities with natural language. The
second thing is the fact that I have never ever read or seen anything 
even remotely convincing with regards to emulating natural language in 
programming language design. Someone brought up Larry Wall. I have read 
what he has to say, and what he considers to be successful natural 
language influences on his programming language and the corresponding 
features in Perl, and I daringly judge them to be complete garbage.

I am open to be proven wrong of course, but so far I have not seen 
anything leading me to the conclusion that trying to find similarities 
between natural and programming languages is the way to create better 
programming languages. On the contrary, I think that source of 
inspiration seems to result in bad languages.


Björn
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uhdcsn4h3.fsf@nhplace.com>
Björn Lindberg <·····@runa.se> writes:

> I am open to be proven wrong of course, but so far I have not seen
> anything leading me to the conclusion that trying to find similarities
> between natural and programming languages is the way to create better
> programming languages. On the contrary, I think that source of
> inspiration seems to result in bad languages.

Before I think about affirming or denying what you say, I'll want you
to make a testable claim.
From: Espen Vestre
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m1slwahi4s.fsf@mordac.netfonds.no>
Björn Lindberg <·····@runa.se> writes:

> right way" to move forward than Perl, for example. And the properties
> of Lisp that make me think so do not share any similarities with
> natural language. 

I don't agree. It's much easier to read lisp as natural language than
it is to read perl code (even good code, not only the WORN code).

I helped my daughter with her math homework the other day by showing
her how to implement simple functions to generate prime numbers,
and noticed how easy it is to write lisp code that is pretty readable
for non-lispers (or not-yet-lispers ;)).
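
Not her actual homework, of course, but something along these lines:

    (defun primep (n)
      (and (> n 1)
           (loop for d from 2 to (isqrt n)
                 never (zerop (mod n d)))))

    (defun primes-below (limit)
      (loop for n from 2 below limit
            when (primep n) collect n))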

> I am open to be proven wrong of course, but so far I have not seen
> anything leading me to the conclusion that trying to find similarities
> between natural and programming languages is the way to create better
> programming languages. On the contrary, I think that source of
> inspiration seems to result in bad languages.

My favourite example of a bad language inspired by natural language is
SQL. But then, SQL was designed for human CLI interaction and not
really for programming.
-- 
  (espen)
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3olj8nF6i0miU2@individual.net>
Espen Vestre wrote:
> My favourite example of a bad language inspired by natural language is
> SQL. But then, SQL was designed for human CLI interaction and not
> really for programming.

But unfortunately there's no really good alternative around, AFAIK.  An
SQL with an orthogonal structure and less verbosity would be great.  I
would want to develop one such system in Lisp if I believed anyone would
buy it in a SQL-dominated world, where everybody uses different SQL
implementations that can't even all parse the same standard SQL query
(i.e. don't run Oracle queries on MySQL).
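
Just to make the idea concrete, a deliberately tiny and purely
hypothetical sketch (nothing but string generation; the function
name and the columns are invented):

    (defun emit-select (columns table &key where)
      (format nil "SELECT ~{~A~^, ~} FROM ~A~@[ WHERE ~A~]"
              columns table where))

    (emit-select '("name" "salary") "employees" :where "salary > 50000")
    ;; => "SELECT name, salary FROM employees WHERE salary > 50000"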

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432a7961@news.cadence.com>
Espen Vestre wrote:
> Björn Lindberg <·····@runa.se> writes:
> 
> 
>>right way" to move forward than Perl, for example. And the properties
>>of Lisp that make me think so do not share any similarities with
>>natural language. 
> 
> 
> I don't agree. It's much easier to read lisp as natural language than
> it is to read perl code (even good code, not only the WORN code).

Of course. The similarities Larry Wall imagines between Perl and a
natural language are at a different level. I believe you need to look at
Perl as a *foreign* natural language that you have to learn from
scratch. Or better yet, don't look.

> I helped my daughter with her math homework the other day by showing
> her how to implement simple functions to generate prime numbers,
> and noticed how easy it is to write lisp code that is pretty readable
> for non-lispers (or not-yet-lispers ;)).
> 
> 
>>I am open to be proven wrong of course, but so far I have not seen
>>anything leading me to the conclusion that trying to find similarities
>>between natural and programming languages is the way to create better
>>programming languages. On the contrary, I think that source of
>>inspiration seems to result in bad languages.
> 
> 
> My favourite example of a bad language inspired by natural language is
> SQL. But then, SQL was designed for human CLI interaction and not
> really for programming.

That is true. The failure of SQL is that it became the standard 
relational database language while being utterly inadequate for 
programmatic use.


Björn
From: lin8080
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4320AA53.F4A3E2DE@freenet.de>
Raffael Cavallaro schrieb:

> Maybe one day, when we have a full blown human level AI, computers will
> be programmers. Then it would be correct to say - wrt programming -
> that humans are not the center of the universe. But we are not there
> yet by a long shot.

I mean, this is exactly the point. There are programs today that are
able to interpret what a human says. Naturally, the next step is to tell
the computer what you want it to do. This way no programming language
is necessary. It would be interesting to know what language such a
program is written in.

stefan
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <ubr31b1hk.fsf@nhplace.com>
Raffael Cavallaro <················@pas-d'espam-s'il-vous-plait-mac.com> writes:

> Since programming languages are used by programmers, and programmers
> are still human beings, then, as far as programming language design is
> concerned, human beings *are* the center of the universe.
> 
> The best programming language for human beings is one that leverages
> human beings' formidable skills in natural language, not one that
> works against our natural language expectations.

I disagree with the first paragraph above, but agree with the second.

Languages enable communication.  In human<->human languages, you might
claim humans are at the center.  (Meta-nit: I've always had trouble
with this particular use of language.  There is "nothing" at the center,
and there are two people, one at each side.  Averages and other collectives
can create the notion of a center, but that's another matter.  Ah well.)

In computer languages, the function is sometimes human<->human (in
design, teaching, etc.) but is mostly human<->computer, again with
"nothing" at the center.  (Meta-continued: I guess you could claim
there's a cyborg in the middle, but if you do, you see even more
clearly the problem I raised parenthetically in the prior paragraph.)

Lest you think I'm just rambling for the sake of rambling (which I am
probably in fact doing, but you need not think it), the implications
of this are really that good computer languages must work in the
intersection of the two sets.  Neither must they fight with the
computer nor with the person.  Moreover, since the computer and the
person judge "goodness" in different ways, the metric of judgment for
comparing them is necessarily multidimensional (a vector of "good for
computer" and "good for human") and as a natural consequence of its
two dimensionality, hard to order linearly, making discussions of
goodness and badness inherently a lot of fun because, of course, there
can be no right answer.

> This is part of the
> reason why lisp macros are so powerful - they allow the
> straightforward extension of the language to meet natural language
> expectations in the problem domain.

And useful/powerful tools for the human to understand how the machine
will perceive such abstracted requests.
 
> Maybe one day, when we have a full blown human level AI, computers
> will be programmers. Then it would be correct to say - wrt programming
> - that humans are not the center of the universe. But we are not there
> yet by a long shot.

Although at that point, curiously, it will no longer be likely that
the computer will use a different metric than people.  And it's likely
the compromise will have been, for better or worse, to make the
computer's understanding "fuzzier", permitting tolerance/flexibility
but also as a consequence permitting error and/or obfuscating any
ability to get a clear view of how the machine is perceiving what we
write.  Rather than reviewing code, we'll be giving it a Turing
Test...
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091112305616807%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-10 07:34:07 -0400, Kent M Pitman <······@nhplace.com> said:

> the implications
> of this are really that good computer languages must work in the
> intersection of the two sets.  Neither must they fight with the
> computer nor with the person.
> 
My nit to pick with this is that, since we design and make the 
computers, we can change them, so it is possible to modify the 
computers and/or our interface to them (programming languages) while it 
is not possible to change the innate cognitive abilities of human 
beings. That is why I characterized the human half of this interaction 
as being the "center."

Of course, practically speaking, it is difficult to modify the
computer world as well - it is to a large extent a given - we have to
deal with existing machine architectures, protocols, APIs, languages, 
etc. - so your point is well taken.


regards
From: Wade Humeniuk
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <KEZUe.224724$9A2.134672@edtnps89>
Raffael Cavallaro wrote:

> 
> Since programming languages are used by programmers, and programmers are 
> still human beings, then, as far as programming language design is 
> concerned, human beings *are* the center of the universe.
> 

I am just jumping into the middle (have to reply to somewhere).

Real progress is made when we do not put ourselves at the center
of the universe.  The realization that the Earth, then the Sun
are not the center of the universe "woke" us up and allowed us to make
real substantive progress.  Progress by its definition means that
we realize something new.  If one has the premise that human beings,
their thought patterns, neural organization, etc
are the ideal model, you get stagnation.  To move ahead we
have to embrace ideas and discoveries that are "outside" of
how humans are built.  This is why the scientific method is so
powerful: it allows exploration outside of vague subjective
views of the world.  The scientific method entails that a scientist
in some ways puts his/her importance below the importance of the
method.  It is not the human being which is important but
the method.  With programming language design, it is language
design that is important, not how humans implement language.

> The best programming language for human beings is one that leverages 
> human beings' formidable skills in natural language, not one that works 
> against our natural language expectations. This is part of the reason 
> why lisp macros are so powerful - they allow the straightforward 
> extension of the language to meet natural language expectations in the 
> problem domain.
> 

I disagree.  A common complaint you hear from Lispers is "Lisp has
wrecked me, I cannot seem to like or want to program in any other
languages".  The ideas and implementations of Lisp have altered
many people's way of looking at programming.  I would not say that
Lisp is a natural language, and so the unnatural has changed many
a Lisp programmer (for the better).


> Maybe one day, when we have a full blown human level AI, computers will 
> be programmers. Then it would be correct to say - wrt programming - that 
> humans are not the center of the universe. But we are not there yet by a 
> long shot.
> 

I think programming languages have shown us just how ambiguous human
beings are.  Humans simply cannot say what they really mean, nor do they
know what they really mean.  The order of the day is wild hand-waving.

Here is a really good draft by Marvin Minsky talking about Consciousness,
and how human beings have not been dealing with it.

http://web.media.mit.edu/~minsky/E4/eb4.html

Wade
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091117044375249%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-11 13:17:30 -0400, Wade Humeniuk 
<··················@telus.net> said:

> If one has the premise that human beings,
> their thought patterns, neural organization, etc
> are the ideal model, you get stagnation.

When speaking of communication with human beings we need to remember 
that innate human language-acquisition abilities can only change on
geological time scales (i.e., over thousands or tens of thousands of
years, due to evolution). Human languages
themselves only change significantly on time scales of many decades
or centuries. So the "stagnation" is simply a reflection of the fact 
that over time scales of the order of a  human lifetime, human 
communications abilities *are* fixed by genetics and existing natural 
languages.

Computer hardware and software are not fixed over such short time 
scales. The language we discuss here is still younger than one human 
lifespan and has undergone a great deal of change in that comparatively 
short time. Therefore progress in this field comes from creating new 
computer systems with interfaces that more closely fit innate human 
language abilities and existing natural languages because these latter 
are effectively fixed, not by trying to get people to communicate more 
like the current crop of computer hardware, operating systems and 
network protocols.

Again, as a practical matter, engineers *do* need to communicate with 
current hardware, OSes and protocols, but that should happen at the 
infrastructure level and be abstracted as much as possible from the 
higher level languages in which we commonly program.

regards
From: Hakon Alstadheim
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <7jlVe.5509$qE.1208212@juliett.dax.net>
Wade Humeniuk wrote:
> Raffael Cavallaro wrote:
> 
>>
>> Since programming languages are used by programmers, and programmers 
>> are still human beings, then, as far as programming language design is 
>> concerned, human beings *are* the center of the universe.
>>
> Real progress is made when we do not put ourselves at the center
> of the universe.
A programming language is not a theory of the universe. It is a tool. A
5-tonne axe would chop wood more effectively, but most people prefer an
axe they can lift. A tool must afford use, preferably several different
kinds of use. A knife is a good example. To be able to cut cured meat,
carve wood, scrape paint and a million other things, I use my knife. The
knife I choose will be designed so I can hold it in my hand, AND with
the physics of whatever I might want to cut in mind. A knife has two
ends, the handle and the blade.

* The handle should not be designed primarily to hold the blade but to 
fit in my hand. Just like a computer language should be designed 
primarily so that I can express whatever I am able to conceive.

The other end, i.e. the blade or the binary output, should be designed
for working on the target (platform or piece of wood).

This is the essence of an HLL.

>> The best programming language for human beings is one that leverages 
>> human beings' formidable skills in natural language

> I disagree.  A common complaint you hear from Lispers is "Lisp has
> wrecked me, I cannot seem to like or want to program in any other
> languages".  The ideas and implementations of Lisp have altered
> many people's way of looking at programming.  I would not say that
> Lisp is a natural language, and so the unnatural has changed many
> a Lisp programmer (for the better).
> 

That can be interpreted the other way as well. After using a tool that
fits your hand, you will no longer stand for using a knife with no 
handle. Lisp is natural in that it affords a lot of different ways of 
modelling the real world, just like our natural language does.
From: Alain Picard
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87psre1ssl.fsf@memetrics.com>
Wade Humeniuk <··················@telus.net> writes:

> I disagree.  A common complaint you hear from Lispers is "Lisp has
> wrecked me, I cannot seem to like or want to program in any other
> languages".  The ideas and implementations of Lisp have altered
> many people's way of looking at programming.  I would not say that
> Lisp is a natural language, and so the unnatural has changed many
> a Lisp programmer (for the better).

This seems akin to "physical intuition".  Apparently, if you ask a
bunch of (non technical) college students about the flight of an arrow
as it leaves the bow, the overwhelming majority believe the arrow
speeds up during its flight towards the target.  This is apparently
the "natural", albeit wrong, belief.

It's only after several years of studying Newtonian mechanics
that it becomes "obvious" that the speed is greatest as it
leaves the bow, slows as it arcs upwards, gains in speed again
(though not as much as initially, due to air friction) etc etc.

Maybe learning Lisp allows people to develop some sort of "programming
intuition" which lets them reason faster and more correctly about
programming problems and ideas.

The fact that Newtonian mechanics is "unnatural" doesn't mean
that people can't become "fluent" in it; after this is accomplished,
it feels most natural indeed.  So perhaps we should not be afraid
of making our computer languages unnatural; we should just worry
about making them "right".  
From: Joe Marshall
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <d5nben4r.fsf@alum.mit.edu>
Alain Picard <············@memetrics.com> writes:

> This seems akin to "physical intuition".  Apparently, if you ask a
> bunch of (non technical) college students about the flight of an arrow
> as it leaves the bow, the overwhelming majority believe the arrow
> speeds up during its flight towards the target.  This is apparently
> the "natural", albeit wrong, belief.
>
> It's only after several years of studying Newtonian mechanics
> that it becomes "obvious" that the speed is greatest as it
> leaves the bow, slows as it arcs upwards, gains in speed again
> (though not as much as initially, due to air friction) etc etc.
>
> Maybe learning Lisp allows people to develop some sort of "programming
> intuition" which lets them reason faster and more correctly about
> programming problems and ideas.

This sounds quite plausible.

Another issue may be that having a bunch of syntax gives you something
to occupy yourself with while you are thinking about the problem.
When you program in Java, for example, you need to write several lines
of `boilerplate' before you actually address the problem.  While you
are typing away in the foreground, you have a moment or two to think
about the approach to the problem you are trying to solve.  In Lisp,
you are immediately faced with the problem (well, there is that
initial open paren that you have to type).
From: Jeff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126190062.542675.5250@g47g2000cwa.googlegroups.com>
Another 'drive to [complex] syntax' is its divisive property of
separating those who understand it from those who don't. I'm only being
half facetious - this is one of the most powerful drivers there is.
Think of the priesthood that once ruled Babylon with the knowledge of
the multiplication table in base 60. 

-jeff
From: jayessay
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m364tb4la5.fsf@rigel.goldenthreadtech.com>
Peter Seibel <·····@gigamonkeys.com> writes:

> So, does anyone know of any actual research into these issues--maybe
> by psychologists or human factors folks?

I pointed this out just a bit ago, but it was in response to that
JHarrop guy (the one who thinks complex syntax is good and toy
benchmarks are meaningful), so it was probably kill filed.  Anyway
these people are probably a decent place to start.

http://www.ppig.org/



/Jon

-- 
'j' - a n t h o n y at romeo/charley/november com
From: Rob Thorpe
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126212946.670050.313050@o13g2000cwo.googlegroups.com>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp.

I think people have to be told why it's different.  Explain that the
language is written as a tree, meaning code trees are simply a subset of
data trees.  (And so Lisp is, roughly, a first-order data type in Lisp.)
Then explain why this means that expanding expressions, as macros do, is
not that difficult to understand.
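
To make the code-as-tree point concrete, here is a minimal Common Lisp
sketch (the UNLESS* macro and *TREE* variable are invented for
illustration): the same list is a data tree or a code tree depending on
whether you evaluate it, and a macro is just a function from one tree to
another.

    (defparameter *tree* '(+ 1 (* 2 3)))  ; a data tree: symbols and numbers
    (eval *tree*)                         ; the same tree, treated as code => 7

    ;; A toy re-implementation of UNLESS: it just rewrites one tree
    ;; into another before the compiler ever sees it.
    (defmacro unless* (test &body body)
      `(if (not ,test) (progn ,@body)))

    (macroexpand-1 '(unless* (zerop n) (print n)))
    ;; => (IF (NOT (ZEROP N)) (PROGN (PRINT N)))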

> Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl.

I think they call on slightly different skills.  In Perl you have to
work to make the syntax legible when you come back to it, avoid
compressing huge amounts of info into tiny spaces, and avoid confusing
parts of odd expressions - or using things others may not even
understand.  In Lisp you have only one problem, but it's big: how to
make the separation of one form amongst others clear amongst all the
parens.  Indentation and an indenting editor seem to help the most.
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3obsfvF58lr2U1@individual.net>
Rob Thorpe wrote:
> Peter Seibel wrote:
>> In my travels among the LispNots I constantly run into people--even
>> new programmers--who *want* elaborate syntax. (I.e. more elaborate
>> than parens and prefix notation.) One part of me thinks that maybe
>> this is due to the pervasive influence of non-Lisp languages; even
>> non-programmers have an idea what languages should look like and it's
>> not Lisp.
> 
> I think people have to be told why it's different.  Explain that the
> language is written as a tree, meaning code trees are simply a subset of
> data trees.  (And so Lisp is, roughly, a first-order data type in Lisp.)
> Then explain why this means that expanding expressions, as macros do, is
> not that difficult to understand.

Well, at least my friends have a really strong basic anti-Lisp attitude, 
and probably other people too.  When they hear "Lisp" it doesn't matter 
what you tell them, their ears close up...

I think you'd do better to start with a new revolutionary technology you 
envision: programs that are free from syntax and instead represented as 
XML.  Then you tell them what cool conversions this allows.  Only later 
do you tell them that Lisp provides all of that in a human-readable (and 
writable) way, with a standard library and fast implementations, and 
that it exists now and has existed for decades.

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Rob Thorpe
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126262135.649325.190620@g49g2000cwa.googlegroups.com>
Ulrich Hobelmann wrote:
> Rob Thorpe wrote:
> > Peter Seibel wrote:
> >> In my travels among the LispNots I constantly run into people--even
> >> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> >> than parens and prefix notation.) One part of me thinks that maybe
> >> this is due to the pervasive influence of non-Lisp languages; even
> >> non-programmers have an idea what languages should look like and it's
> >> not Lisp.
> >
> > I think people have to be told why it's different.  Explain that the
> > language is written as a tree, meaning code trees are simply a subset of
> > data trees.  (And so Lisp is, roughly, a first-order data type in Lisp.)
> > Then explain why this means that expanding expressions, as macros do, is
> > not that difficult to understand.
>
> Well, at least my friends have a really strong basic anti-Lisp attitude,
> and probably other people too.  When they hear "Lisp" it doesn't matter
> what you tell them, their ears close up...

That happens.  I sometimes have a fairly strong anti-Lisp attitude
myself when there is going to be no one else but me who knows how to
maintain a program if it's in Lisp.

> I think you'd do better to start with a new revolutionary technology you
> envision: programs that are free from syntax and instead represented as
> XML.  Then you tell them what cool conversions this allows.  Only later
> do you tell them that Lisp provides all of that in a human-readable (and
> writable) way, with a standard library and fast implementations, and
> that it exists now and has existed for decades.

Interesting, shall we turn all the curly-parens to pointy ones?  "<"
That might do the trick?  Nah, we should just release a very good
album, it's worked for the Rolling Stones.

More seriously, a while ago I spoke to someone using XML to store
configuration information.  I explained to him how it could be done
using s-expressions, and how they would be simpler and shorter even if a
non-Lisp language was used to read them.  He explained to me that although
the parsing is very simple, the verification of the data would be
difficult in the language he was using, which was a problem I hadn't
thought of.
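
For what it's worth, here is a small Common Lisp sketch of that
configuration idea (the keys and values are invented): the "parser" is
just READ, and the kind of validation that was the sticking point in the
other language is only a few lines here.

    (defparameter *config-text*
      "((host \"db.example.com\") (port 5432) (use-ssl t))")

    (defun read-config (string)
      ;; READ does all the parsing work.
      (with-input-from-string (in string)
        (read in)))

    (defun validate-config (config)
      "Check that every entry is a two-element list with a known key."
      (let ((known '(host port use-ssl)))
        (every (lambda (entry)
                 (and (listp entry)
                      (= (length entry) 2)
                      (member (first entry) known)))
               config)))

    ;; (validate-config (read-config *config-text*)) => a true value
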
From: Paul F. Dietz
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <b4adnQaE1PhDQL_eRVn-qg@dls.net>
Ulrich Hobelmann wrote:

> I think you'd do better to start with a new revolutionary technology you 
> envision: programs that are free from syntax and instead represented as 
> XML.

I'm encountering many people who recognize XML as lemming technology,
but perhaps that's just the circles I move in.

	Paul
From: Tayssir John Gabbour
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126253969.754678.160060@g47g2000cwa.googlegroups.com>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp. Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl. (Certainly within Lisp
> there's evidence that special purpose syntax is occasionally
> considered useful, at least by some folks; c.f. the standard reader
> macros, diffferent syntax for different kinds of numbers, LOOP,
> FORMAT.) If Wall's view is correct, than that means that Lisp's
> minimal syntax is a trade-off--we accept a uniform, and thus less
> intuitive (per Wall) syntax because of the many advantages is gives us
> in other respects (macros, ease of manipulation in editors, etc.) But
> that would also mean that if one were designing a domain-specific
> langauge that wasn't necessarily going to support, say, macros then
> maybe the tradeoff would turn against s-exp syntax.
>
> So, does anyone know of any actual research into these issues--maybe
> by psychologists or human factors folks?

I've been toying with the idea of asking Noam Chomsky his thoughts on
the matter, as last weekend he seemed to be familiar with Lisp. (No
surprise, as he works at MIT.) Or you could email him, chomsky at mit's
edu domain.

But I'd hope such studies would discuss XML/HTML/XML-manipulation languages,
as they're very Lisp-like and many people even want to write full-blown
languages in them (with the disappointing conclusion that breaking down
the wall of verbosity to make this feasible would likely give you
Lisp). XML/HTML are unprecedentedly popular, cutting across user
demographics.


Tayssir
From: Förster vom Silberwald
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126529292.091628.178590@g43g2000cwa.googlegroups.com>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp.

The Clean language had a similar problem: some people hated it for not
being "syntactically more idiosyncratic".

OCaml users like their overloaded syntax and would never touch a
language which seems clear and ordered.

Schneewittchen
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oljceF6i0miU3@individual.net>
Förster vom Silberwald wrote:
> OCaml users like their overloaded syntax and would never touch a
> language which seems clear and ordered.

Isn't OCaml the ML dialect that uses different syntaxes for integer and 
FP math?

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Neelakantan Krishnaswami
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <slrndibfg5.naf.neelk@gs3106.sp.cs.cmu.edu>
In article <··············@individual.net>, Ulrich Hobelmann wrote:
> Förster vom Silberwald wrote:
>> OCaml users like their overloaded syntax and would never touch a
>> language which seems clear and ordered.
> 
> Isn't OCaml the ML dialect that uses different syntaxes for integer and 
> FP math?

Yes, but that's one of its good ideas. The reason is that tacking a
little overloading onto ML is not a good idea; you need to do a
comprehensive job (like Haskell's typeclasses) if you want something
that's not a PITA.

Coming back on topic for this ng and this thread: 

I once talked briefly with Dave MacQueen, and he told me that the
reason that ML's syntax is so odd is that the people who invented it
were a bunch of logicians who didn't really know much about parsing.

See, the first implementations of ML were on top of (pre CL) Lisp
systems, and the way they implemented the parser was by calling READ
to tokenize the input stream, and then they looked at each incoming
symbol, and dispatched on it by looking up a parser function that was
attached to the symbol's plist, and calling that on the rest of the
token stream.  This resulted in several linguistic oddities.

First, you could only ever use a token to parse in one way. For
example, the fact that Ocaml uses commas to separate elements of a
tuple, and semicolons to separate elements of a list, originates from
this decision. (For the past fifteen or so years, Caml has used a
parser generator, though.)

Secondly, many ML constructs (such as pattern matching) don't have
closing brackets, because the parsing function would eat some of the
stream and then it would be done. This led to numerous shift/reduce
conflicts when people tried to re-cast the language as a declarative
grammar to be fed to a parser generator.
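
A rough Common Lisp sketch of that dispatch scheme, as I read the
description above (nothing to do with the actual ML sources; IFF,
PARSE-IF, PARSE-NEXT and the PARSER property are invented names): each
keyword symbol carries a parser function on its plist, and the driver
pops a token and hands the rest of the stream to whatever parser it
finds there.

    ;; Deliberately simplistic: sub-expressions are assumed to be single
    ;; tokens, and there is no closing bracket -- the parser just stops
    ;; once it has eaten what it wants, exactly as described above.
    (defun parse-if (tokens)
      (let* ((test (pop tokens))
             (then (progn (pop tokens) (pop tokens)))   ; skip THEN, take token
             (else (progn (pop tokens) (pop tokens))))  ; skip ELSE, take token
        (values `(if ,test ,then ,else) tokens)))

    ;; IFF stands in for ML's "if" keyword.
    (setf (get 'iff 'parser) #'parse-if)

    (defun parse-next (tokens)
      "Dispatch on the first token via its plist-attached parser, if any."
      (let* ((tok (pop tokens))
             (parser (and (symbolp tok) (get tok 'parser))))
        (if parser
            (funcall parser tokens)
            (values tok tokens))))

    ;; (parse-next '(iff x then 1 else 2))  =>  (IF X 1 2), NIL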


-- 
Neel Krishnaswami
·····@cs.cmu.edu
From: Rob Warnock
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <muydncn30p5w87veRVn-vw@speakeasy.net>
Neelakantan Krishnaswami  <·····@cs.cmu.edu> wrote:
+---------------
| Ulrich Hobelmann wrote:
| > Isn't OCaml the ML dialect that uses different syntaxes
| > for integer and FP math?
| 
| Yes, but that's one of its good ideas.
+---------------

And not original. BLISS (1971) did it too. Storage was typeless
(and hence so were variables, which were only names for locations);
only operators were typed. E.g. [using ":=" here instead of the
original Teletype left-arrow, which got stolen by ASCII "_"]:

	I := .I + (2 * .J);

versus:

	X := .X FADR (2.34 FMPR .Y);

The problem was that you could easily shoot yourself in the foot
if you used integer ops on floating values or v-v.


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: ···@itasoftware.com
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126621647.034073.7910@g43g2000cwa.googlegroups.com>
Rob Warnock wrote:
> And not original. BLISS (1971) did it too. Storage was typeless
> (and hence so were variables, which were only names for locations);
> only operators were typed. E.g. [using ":=" here instead of the
> original Teletype left-arrow, which got stolen by ASCII "_"]:
>
> 	I := .I + (2 * .J);
>
> versus:
>
> 	X := .X FADR (2.34 FMPR .Y);
>
> The problem was that you could easily shoot yourself in the foot
> if you used integer ops on floating values or v-v.

Actually the bigger problem was all the .s.  Period was the dereference
operator.  Since the BLISS compiler was highly optimizing, it was very
happy to take code like:

      IF A EQL B THEN ...

notice that the address of A was never the same as the address of B,
and silently delete the entire IF block.  Many of us who used BLISS at
DEC got in the habit of going over the compiler assembly listing
looking for deleted code before doing any other debugging.

The folklore among engineers circa 1979 was that every . in a BLISS
program cost DEC $10.  It didn't help that the compiler authors
resisted supporting deleted code warnings on the grounds that you might
have intended to do that.

Dan Pierson
From: Rob Warnock
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <xOmdnQLwY-FNdLveRVn-1w@speakeasy.net>
<···@itasoftware.com> wrote:
+---------------
| Rob Warnock wrote [re BLISS]:
| > 	I := .I + (2 * .J);
| > versus:
| > 	X := .X FADR (2.34 FMPR .Y);
| > The problem was that you could easily shoot yourself in the foot
| > if you used integer ops on floating values or v-v.
| 
| Actually the bigger problem was all the .s.  Period was the
| dereference operator.
+---------------

Yes, since variable names were *always* addresses [well, pointers],
just like in assembler. So "I := .I + (2 * .J);" could be translated
directly into:

    MOVE  T0, I
    MOVE  T1, J
    IMULI T1, 2
    ADD   T0, T1
    MOVEM T0, I

and "I = J + 2;" [in C, "I = &J + 2;"] was:

    MOVEI T0, J
    ADDI  T0, 2
    MOVEM T0, I

+---------------
| Since the BLISS compiler was highly optimizing...
+---------------

Yup. It actually would emit this instead of my first example above:

    MOVE  T0, J
    IMULI T0, 2
    ADDM  T0, I

+---------------
| it was very happy to take code like:
|       IF A EQL B THEN ...
| notice that the address of A was never the same as the address of B,
| and silently delete the entire IF block. ... It didn't help that the
| compiler authors resisted supporting deleted code warnings on the
| grounds that you might have intended to do that.
+---------------

Actually, IIRC it was more that the common style for writing STRUCTUREs
(a kind of macro that a BLISS user wrote to define the algorithm used
when a variable was "subscripted", e.g., "A[.I, .J+2]") in systems
code used lots of subsidiary macros for fields, e.g., "P = .P[NEXT]"
where "NEXT" might be a macro that expanded into "0,3,18,18", which
inside the STRUCTURE ended up fetching the left-hand-side of word 3
of the block pointed to by P. Those STRUCTUREs were often written as
CASE expressions that dispatched on the various small constants of the
field-selector macro expansions, the result being that there was *LOTS*
of dead code all over the place that needed to be eliminated!!

...which the BLISS compiler was very good at, but the dead code
elimination didn't occur until *long* after the macro expansion
was done & gone, so how was the compiler to know which bits of dead
code were "normal" (e.g., the massive amounts generated by the above
STRUCTURE + CASE + selector macros idiom) and which were due to users'
mistakes?!?
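
A loose Common Lisp analogy to that idiom, for readers who don't know
BLISS (STRUCT-REF and the field names are invented, and this is not
BLISS semantics): the accessor macro expands into a CASE whose key is a
compile-time constant, so all but one branch is dead code that the
compiler is expected to throw away.

    (defmacro struct-ref (block field)
      "FIELD is a literal keyword, so the expansion dispatches on a constant."
      `(case ,field
         (:next (aref ,block 0))
         (:prev (aref ,block 1))
         (:data (aref ,block 2))))

    ;; (struct-ref node :next) expands into
    ;;   (CASE :NEXT (:NEXT (AREF NODE 0)) (:PREV ...) (:DATA ...))
    ;; leaving two branches that can never run -- "normal" dead code,
    ;; much like the STRUCTURE + CASE + selector-macro expansions above.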


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: wildwood
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126632841.203243.76670@g49g2000cwa.googlegroups.com>
Peter Seibel wrote:
> In my travels among the LispNots I constantly run into people--even
> new programmers--who *want* elaborate syntax. (I.e. more elaborate
> than parens and prefix notation.) One part of me thinks that maybe
> this is due to the pervasive influence of non-Lisp languages; even
> non-programmers have an idea what languages should look like and it's
> not Lisp. Another part of me suspects that Larry Wall may be on to
> something when he says folks like different things to look different,
> thus all the different bits of syntax in Perl...
> ...But
> that would also mean that if one were designing a domain-specific
> langauge that wasn't necessarily going to support, say, macros then
> maybe the tradeoff would turn against s-exp syntax.
>

I'm still relatively new to Lisp, but I think there's some connection
between newbies' desire for a lot of syntax, and the creation of DSLs
as a standard Lisp development technique.  Let's see if I can put it
into words.

When learning something new, I need to be able to manage the complexity
somehow.  When learning a programming language, that usually means that
I look at the discrete building blocks of the language.  For Perl, that
meant getting the hang of the $/@/%/& details, control flow, and I/O.
For Java, that meant getting a feel for objects and types.

Looking at Lisp from this perspective, the basic building blocks appear
to be list/cons/car/cdr, and a metric ton of parentheses.  And, from
this perspective, the language looks pretty useless.

I think one reason people want more syntax is, people want the
structure of the language to tell them how to solve their problem.
People expect a Java solution to involve OO design, and so they know
they're on the right track if they're figuring out what objects to use.

People who have no experience with Lisp don't know what a Lisp solution
to their problem would look like, and they can't tell what it should
look like from Lisp's syntax, and that causes cognitive discomfort.  It
takes people a while to grok that a Lisp solution could look like an
entirely new, special-purpose language.  All that uncertainty makes
them crave more restrictions - after all, if there was more restrictive
syntax, they feel like they could have a better picture of the solution
in their head.

My two cents.

--David Brandt
From: LuisGLopez
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126647849.990664.220870@g49g2000cwa.googlegroups.com>
Hi!

I'm an extreme newbie to Lisp (as those of you who have suffered my code
in c.l.l can testify), and wildwood's words make a lot of sense to me.

wildwood wrote:

> I think one reason people want more syntax is, people want the
> structure of the language to tell them how to solve their problem.
[...]
>
> People who have no experience with Lisp don't know what a Lisp solution
> to their problem would look like

If I understood you right, I think that's the point. The more
restrictions a common person has, the easier it is for him to develop
anything. It's like in every art: if your teacher tells you to create a
piano sonata with this or that restriction, you may feel constrained at
first, but then you see that you can start sooner. If you leave all
teachers and begin to create all alone, you feel the emptiness of the
white sheet of paper quite... well, I lack the term (please excuse my
English, by the way).

I think Lisp gives you about the most freedom you can get (at least, of
the languages I know), and that is great for a grown-up programmer (Hey!
I'm not saying "old"! ;)), but for a beginner it can be intimidating (I
think that was the word I was seeking before).

Luis.
From: Robert Uhl
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m3hdcoh9f5.fsf@4dv.net>
Here's another thought about why syntax is nice: we are, I think, better
at recognising glyphs than words.  So the glyph + is better than the
word 'plus', and a glyph sequence [x] is better than the word 'aref.'
Perhaps I'm wrong, though.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
If God had intended for us to use French units He would have had 10 disciples.
From: ······@earthlink.net
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126659869.193993.175320@f14g2000cwb.googlegroups.com>
> Here's another thought about why syntax is nice: we are, I think, better
> at recognising glyphs than words.  So the glyph + is better than the
> word 'plus', and a glyph sequence [x] is better than the word 'aref.'
> Perhaps I'm wrong, though.

That's correct as far as it goes, but there can't be enough glyphs, and
no one[1] can keep track of how the glyphs interact.  Plus, most
operations, including addition, are not binary.

It shouldn't be surprising that rules that work for +-/* don't work
well when you've got 1000x as many operators, most with different
arity.

-andy

[1] APL folks can, but no one likes their rule.
From: LuisGLopez
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126663868.655445.326020@g47g2000cwa.googlegroups.com>
Pascal: In my modest opinion, your analysis is absolutely brilliant. I
didn't think of it that way; we are "trapped" in an old paradigm, and we
should create programming languages with some kind of "computer-like"
paradigm; I mean, trying to take advantage of the computer's
idiosyncrasies. That is, if I understood you correctly.
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3op4jlF745bsU1@individual.net>
Robert Uhl wrote:
> Here's another thought about why syntax is nice: we are, I think, better
> at recognising glyphs than words.  So the glyph + is better than the
> word 'plus', and a glyph sequence [x] is better than the word 'aref.'
> Perhaps I'm wrong, though.

Here's another thought about why syntax is nice: we are, I think, better 
at inventing ever new excuses not to learn new things. So the sun 
orbiting around the earth is better than the earth orbiting around the 
sun, and caveman painting is better than communicating via words on 
newsgroups. Perhaps I'm wrong, though.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: LuisGLopez
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126654163.689978.177820@z14g2000cwz.googlegroups.com>
Pascal Costanza wrote:

> sun, and caveman painting is better than communicating via words on
> newsgroups.

(Well, in fact, I think a lot can be said for painting against posting
in usenet... :))

Seriously... Although I think it's true that humans are lazy, I also
think that we all love novelties. New things are attractive by
themselves. So I don't think that we don't learn new things because
they are new.

So, you may wonder *why* we don't learn new things? Don't look at me...
of course I don't know! :-)

Luis.
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3op8cuF74c14U1@individual.net>
LuisGLopez wrote:

> So, you may wonder *why* we don't learn new things? Don't look at me...
> of course I don't know! :-)

I think Alan Kay is right with the characterization that we are still in 
the very early phase of computer science, and we are still trying just 
to imitate what we already know instead of trying to figure out the 
possibilities. Essentially, we are imitating just paper. It will take 
some considerable time until we make use of the full potential of computers.

I recall him stating that it took about 200 years after the invention of
the letterpress before people started to make use of the actual
possibilities of books. He identifies that point in time as when the first
book was published that actually had page numbers. This allowed people 
to develop much more comprehensive arguments and theories. Until then, 
books only imitated what people already knew - speech. Essentially, 
thoughts couldn't be more complicated than one page. After adding page 
numbers, people started to be able to refer to different parts of the 
same book, which is a big improvement. And this actually led to 
development of new and more sophisticated theories about all aspects of 
our world.

There was a stage when, for example, music software tried to find ways 
to take advantage of the input and output devices that computers have. 
You could actually draw sound envelopes, etc., on the screen by using a 
mouse. Nowadays, typical mainstream music software has replaced these 
ideas with imitations of slide controls, etc., that you can typically 
find on mixing desks. It's apparently easier to learn them, because they 
look and behave in familiar ways. But they are actually much worse to 
use, because you cannot touch and grab them with your hands. Instead, 
you have to make funny movements with your mouse in order to get the 
effects you want, and they are never as precise as the real thing.

In a similar vein, the text we type in text editors is just represented 
in a two-dimensional fashion, so is just an imitation of paper, and all 
the discussion about syntax is about imitating tricks to make text more 
efficient to work with on paper. This is completely backwards, and 
doesn't really work as well as on paper. On paper, you can just add a 
few more lines to a single glyph to make it a different one. If you use 
a pencil, you can even easily delete things you made and get back to a 
previous stage. Two-dimensional text on the screen can never reach the 
same flexibility in that regard, and typing special characters is 
typically a pain, because you have to use shift, control, alt, meta, 
hyper, whatever combinations to get them. On the other hand, Lisp code 
is much easier to work with because you can easily select large chunks 
of connected code and easily move in and out of s-expressions. And if 
you dare to take a deeper look, you will notice that Lisp code is 
actually not just two-dimensional text but that it is a 
multi-dimensional structure that consists of symbols, lists and atoms 
which are all objects with identity that you can manipulate and generate 
in macros, which allows you to embed code in your programs that you 
could have never typed manually. I don't think it's ideal; there is 
(probably lots of) room for improvement, but worrying about imitations 
of stuff optimized for being used on paper is not going to get us 
forward in any sense. At least not in my humble opinion.

People are able to adapt to new ways of doing things. The majority of 
things we have to deal with in everyday life would have been completely 
alien and unintuitive (and actually scary) to people just 100 years ago.
Douglas Engelbart has given the following example in a talk: Do you 
really think that it's a sane idea to bet your life on driving in a 
vehicle at 50 mph, trying to get on a highway, and all you have 
available to make sure that nothing bad happens is a small mirror of 
about the size of your hand?

If people want to waste their time on wondering which kinds of orderings 
of brackets, braces, parentheses, semicolons, colons, dots, commas, 
dollars, ampersands, at signs, backward slashes, forward slashes, dashes, 
underscores, etc., are more "natural", they can of course go ahead and 
do so. But I am just not interested.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091402044450073%raffaelcavallaro@pasdespamsilvousplaitmaccom>
BTW I've just finished your paper on ContextL - really interesting 
stuff, very impressive.

On 2005-09-13 20:15:25 -0400, Pascal Costanza <··@p-cos.net> said:

> I think Alan Kay is right with the characterization that we are still 
> in the very early phase of computer science, and we are still trying 
> just to imitate what we already know instead of trying to figure out 
> the possibilities.

Or maybe we imitate what we already know with machines of our own 
design because we can change these machines and the interface to these 
machines much more easily and quickly than we can change the genetic 
endowment and cognitive history of our species. What is the device with 
a microprocessor in it most commonly used by people? The cell phone. 
Why? Because it requires little or no change in the way people have 
communicated for the last 50,000 years. You just pick it up and talk.

> I recall him stating that it took about 200 years after the invention 
> of the letterpress before people started to make use of the actual 
> possibilities of books. He identifies that point in time as when the 
> first book was published that actually had page numbers. This allowed people 
> to develop much more comprehensive arguments and theories. Until then, 
> books only imitated what people already knew - speech. Essentially, 
> thoughts couldn't be more complicated than one page.

Preliterate peoples are more than capable of sustaining arguments of 
more than a page in length. If anything, their memories are better than 
those of literate people. Remember that the whole Odyssey was routinely 
recited from memory. Just because Alan Kay said it doesn't make it 
true. Before Gutenberg we have astronomical works of the Chinese and 
the Maya, the dialogs of Plato, Archimedes' proofs, the works of Indian 
and Arabic mathematicians - many of these much more than a page and 
quite complex.

> After adding page numbers, people started to be able to refer to 
> different parts of the same book, which is a big improvement. And this 
> actually led to development of new and more sophisticated theories 
> about all aspects of our world.

The great impact of the printing press was that it allowed mass 
production and portability (printed books could be smaller), which 
allowed information to spread more quickly. Page numbering was not why 
printed books revolutionized human communication; rapid dissemination of 
large quantities of information was. Instead of just talking with your 
neighbor you could now hear from scores of people from around the 
planet and across time.

> People are able to adapt to new ways of doing things.

And yet with all of these novel possibilities, your whole post consists 
of natural language not very different from how you might have spoken it 
in a conversation. Why? Because people are absolute wizards at natural 
language, since our recent evolutionary past consists almost entirely of 
life in small illiterate groups. We have brains specialized for it and 
engage in continual practice of it from infancy onward. As a result we 
excel at thinking about the world in the natural language pattern of 
noun-verb-preposition (not necessarily in that order, of course). Since 
we have the circuitry and practiced skills for it, it seems like computer 
interfaces ought to leverage that paradigm. I heard about one once that 
was very much like spoken language, what was it called...? Oh yeah - 
Smalltalk. It was invented by this guy called Alan Kay...

   ... of course he wrote it in lisp. ;^)

The future of computer interfaces should be the next step beyond 
Smalltalk. Even more like natural language. But it should also be able 
to go in the other direction when needed, down toward the machine. That 
means it should probably be developed in an extensible language. Sounds 
like another job for lisp.

regards
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oq482F6o57cU1@individual.net>
Raffael Cavallaro wrote:
> Or maybe we imitate what we already know with machines of our own design 
> because we can change these machines and the interface to these machines 
> much more easily and quickly than we can change the genetic endowment 
> and cognitive history of our species. What is the device with a 
> microprocessor in it most commonly used by people? The cell phone. Why? 
> Because it requires little or no change in the way people have 
> communicated for the last 50,000 years. You just pick it up and talk.

Then why do people waste their time typing those PITA SMS messages, on 
phones that have nothing but number keys?  (I don't.  Talking is much 
faster/easier as it reuses my builtin talking skills.)

> The great impact of the printing press was that it allowed mass 
> production and portability (printed books could be smaller), which 
> allowed information to spread more quickly. Page numbering was not why 
> printed books revolutionized human communication; rapid dissemination of 
> large quantities of information was. Instead of just talking with your 
> neighbor you could now hear from scores of people from around the planet 
> and across time.

No, the printing revolution came not with printing, but with movable 
individual letters.  Before then, only the Bible was printed.  This is 
like the idea of code generation vs having programmable macros ;)

> > People are able to adapt to new ways of doing things.
>
> And yet with all of these novel possibilities, your whole post consists 
> of natural language not very different from how you might have spoken it 
> in a conversation. Why? Because people are absolute wizards at natural 
> language, since our recent evolutionary past consists almost entirely of 
> life in small illiterate groups. We have brains specialized for it and 

And for maybe ten years now, human language has been degrading quickly 
(for lots of people), because terse Perl-like syntax is easier to type on 
Instant Messenger or SMS than real language.  OK, hip-hop might be 
reversing that somewhat.

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oq9shF7929cU1@individual.net>
Raffael Cavallaro wrote:

> I heard about one once that 
> was very much like spoken language, what was it called...? Oh yeah - 
> Smalltalk. It was invented by this guy called Alan Kay...
> 
>   ... of course he wrote it in lisp. ;^)

Well, Smalltalk _is_ a Lisp dialect... ;)

> The future of computer interfaces should be the next step beyond 
> Smalltalk. Even more like natural language. But it should also be able 
> to go in the other direction when needed, down toward the machine. That 
> means it should probably be developed in an extensible language. Sounds 
> like another job for lisp.

Maybe you're right, maybe Alan Kay is wrong, maybe I am wrong. However, 
_that_ is the kind of topic that should be discussed, i.e. how to move 
things forward, not whether some accidental syntactic conventions are 
better than others.

I think that Common Lisp (more so than, say, Scheme) is in a good 
position to be the basis of something more interesting. Since the 
syntactic elements have identity (think: object identity), it should be 
possible to build an MVC-style framework on top that gives you different 
views on the code.
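
One way to picture that, as a very rough sketch (VIEW-AS-SEXP and
VIEW-AS-OUTLINE are invented names, nothing like a real MVC framework):
the same list structure can be rendered as an s-expression or walked
into some other presentation, without the code ever being plain text in
between.

    (defparameter *form* '(defun square (x) (* x x)))

    (defun view-as-sexp (form)
      ;; One "view": the familiar printed s-expression.
      (write-to-string form :pretty t))

    (defun view-as-outline (form &optional (depth 0))
      ;; Another "view" of the same objects: atoms indented by nesting depth.
      (if (listp form)
          (dolist (sub form)
            (view-as-outline sub (1+ depth)))
          (format t "~&~a~a"
                  (make-string (* 2 depth) :initial-element #\Space)
                  form)))

    ;; (view-as-sexp *form*)    => "(DEFUN SQUARE (X) (* X X))"
    ;; (view-as-outline *form*) prints DEFUN, SQUARE, X, *, X, X,
    ;;                          each indented by its nesting depth.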

See http://www.ida.liu.se/~robwe/research-interests.html - especially 
Figure 1 - for some inspiration. The idea is that any kind of object can 
be used as literal data - something that Common Lisp already provides - 
and has a practical user interface. (Not just numbers and strings, so to 
speak.)


Pascal

P.S.: Note that I have used a hyperlink in this posting, something that 
definitely goes beyond natural, spoken language. ;)

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091411034943658%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-14 05:46:57 -0400, Pascal Costanza <··@p-cos.net> said:

> maybe Alan Kay is wrong, maybe I am wrong.

No, I think *you* are right and are implementing exactly what I'm 
talking about, but possibly you don't see it yet in that light.

In natural language, almost every utterance can only be understood in 
its given context. With ContextL you have innovated by bringing lisp 
closer to the way we use natural language. Your innovation does not 
consist in creating a new mode of thinking or expression that is closer 
to the way the underlying machine works, but consists instead in 
creating a new interface that allows us to think about problems and 
express their solutions in a way that is closer to our natural language 
strengths.

I agree with you that this is *not* simply an issue of surface syntax - 
it makes less difference which surface syntax can be argued to be 
"closer" to natural language than which interface has semantics that 
play to our natural language strengths. This is the problem with 
AppleScript - it has a surface syntax that fools the naive user into 
believing that it also has the full range of natural language semantics 
when in fact it does not.

ContextL has semantics - context sensitivity - that play to our natural 
language strengths. Could the surface syntax be made more 
natural-language-like? Probably, but this is less important than having 
natural language semantics. The surface syntax can be added later.

regards
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <43285d57$1@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-14 05:46:57 -0400, Pascal Costanza <··@p-cos.net> said:
> 
>> maybe Alan Kay is wrong, maybe I am wrong.
> 
> 
> No, I think *you* are right and are implementing exactly what I'm 
> talking about, but possibly you don't see it yet in that light.
> 
> In natural language, almost every utterance can only be understood in 
> its given context. With ContextL you have innovated by bringing lisp 
> closer to the way we use natural language. Your innovation does not 
> consist in creating a new mode of thinking or expression that is closer 
> to the way the underlying machine works, but consists instead in creating 
> a new interface that allows us to think about problems and express their 
> solutions in a way that is closer to our natural language strengths.
> 
> I agree with you that this is *not* simply an issue of surface syntax - 
> it makes less difference which surface syntax can be argued to be 
> "closer" to natural language than which interface has semantics that 
> play to our natural language strengths. This is the problem with 
> AppleScript - it has a surface syntax that fools the naive user into 
> believing that it also has the full range of natural language semantics 
> when in fact it does not.
> 
> ContextL has semantics - context sensitivity - that play to our natural 
> language strengths. Could the surface syntax be made more 
> natural-language-like? Probably, but this is less important than having 
> natural language semantics. The surface syntax can be added later.

Oh, come on. Everything we *do* is context dependent. Even organisms 
that lack speech are somehow able to act context dependently. Or to put 
it another way: The contexts of ContextL are dynamic contexts. As such 
they have more to do with how the program *runs* than with how it looks 
in text (i.e. the language part).

If you are going to define all human thinking as "natural language" then 
we may be in closer agreement than I thought, but that seems like a 
fairly uninteresting definition to me, because then all programming 
languages are natural language based by definition since we created them.


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091414314211272%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-14 13:26:47 -0400, Björn Lindberg <·····@runa.se> said:

> Oh, come on. Everything we *do* is context dependent. Even organisms 
> that lack speech are somehow able to act context dependently.

You are arguing my point. It is computers that are usually unable to 
deal with different contexts. Historically one has had to specify 
pretty much everything to get them to do what one wants. Stating that 
non-human animals act in a context dependent way is simply saying that 
living organisms are not like computers in this regard. I have no 
argument with that.
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <43293890@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-14 13:26:47 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> Oh, come on. Everything we *do* is context dependent. Even organisms 
>> that lack speech are somehow able to act context dependently.
> 
> 
> You are arguing my point. It is computers that are usually unable to 
> deal with different contexts. Historically one has had to specify pretty 
> much everything to get them to do what one wants. Stating that non-human 
> animals act in a context dependent way is simply saying that living 
> organisms are not like computers in this regard. I have no argument with 
> that.

Your original claim was that programming languages need to be similar to 
natural languages. That is a much more specific claim than saying that 
computer systems should be suited to human use, which seems to be what 
you are saying now. The latter claim I can certainly agree with, but it 
is trivial.


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091511414827544%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-15 05:02:07 -0400, Björn Lindberg <·····@runa.se> said:

> Your original claim was that programming languages need to be similar 
> to natural languages. That is a much more specific claim than saying 
> that computer systems should be suited to human use, which seems to be 
> what you are saying now.

"computer systems should be suited to human use" is a vague statement 
that encompases both ends of the discussion here, so no, it is not what 
I am saying.

The discussion here takes for granted that "computer systems should be 
suited to human use." The question is by moving in which direction:

A. Human-computer interfaces will progress if they become more like 
human communication - i.e., more like natural language.

B. Human-computer interfaces will progress if we ignore natural 
language and devise new modes of communication more suited to dealing 
with computers. These new modes of communication will become "suited to 
human use" because people are cognitively flexible and will learn and 
adapt to these new modes of communication.

I have been arguing for A. I think people are far less cognitively 
flexible than once thought (i.e., we are *not* blank slates). We have 
definite cognitive strengths and biases and we should play to them by 
making HCIs more like natural language.

Most recently I have argued that ContextL falls under the description 
of A rather than B - it adds natural-language-like features to CLOS. 
The feature it adds - contexts - makes computer systems more "suited to 
human use" because context is familiar to people from their well 
practiced and innate natural language communication skills. Contexts do 
not make computer systems more "suited to human use" by adding 
something unfamiliar, something more computer-like than 
natural-language-like, which people will have to learn and adjust to.

regards
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ou8rcF7pbtrU1@individual.net>
Raffael Cavallaro wrote:
> On 2005-09-15 05:02:07 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> Your original claim was that programming languages need to be similar 
>> to natural languages. That is a much more specific claim than saying 
>> that computer systems should be suited to human use, which seems to be 
>> what you are saying now.
> 
> "computer systems should be suited to human use" is a vague statement 
> that encompases both ends of the discussion here, so no, it is not what 
> I am saying.
> 
> The discussion here takes for granted that "computer systems should be 
> suited to human use." The question is by moving in which direction:
> 
> A. Human-computer interfaces will progress if they become more like 
> human communication - i.e., more like natural language.
> 
> B. Human-computer interfaces will progress if we ignore natural language 
> and devise new modes of communication more suited to dealing with 
> computers. These new modes of communication will become "suited to human 
> use" because people are cognitively flexible and will learn and adapt to 
> these new modes of communication.

Thanks for detailing your position. I think that both will probably play 
a role. Note that there are people who are strong in thinking textually 
and others who are strong in thinking visually. They will probably have 
different preferences. Just to mention one dimension.

I think it's just too early to tell what the ideal means of interaction 
with computers will be, and we probably won't live long enough to see 
the "final" widely accepted results. Therefore, it's good to explore 
different directions, and not to dismiss any of them, at least not 
without further thought. (I just don't like an unreflective acceptance of 
the status quo simply because it is the status quo.)

> Most recently I have argued that ContextL falls under the description of 
> A rather than B - it adds natural-language-like features to CLOS. The 
> feature it adds - contexts - makes computer systems more "suited to 
> human use" because context is familiar to people from their well 
> practiced and innate natural language communication skills. Contexts do 
> not make computer systems more "suited to human use" by adding something 
> unfamiliar, something more computer-like than natural-language-like, 
> which people will have to learn and adjust to.

Indeed, I haven't seen ContextL in that light, and I am also skeptical 
that it is actually like you describe it. Humans are good at guessing 
context. There was an interview with the writer of the Alien movie where 
he explains that he deliberately made the viewers just bump into already 
ongoing conversations between the characters in the beginning of the 
movie, without providing any background information on the characters 
and the setting. Yet, one is able to quickly get a good feel for the 
personalities and the tensions among them (which are important for the 
story).

One form of artistic human expression is jokes. They typically work by 
putting something (a person, a situation, a statement, a word, etc.) in 
a context in which it doesn't really fit. So they prepare an expectation 
in the audience, and finally take a turn into a completely different 
direction.

Example: What's the difference between an object-oriented methodologist 
and a terrorist? You can negotiate with a terrorist. (Very old one, I 
know. ;)

This joke works because "negotiating with a methodologist" and 
"negotiating with a terrorist" mean very different things. It ultimately 
works because the situation with a methodologist is described as worse 
than the situation with a terrorist, which is completely unexpected (but 
resonates at least with some audiences who actually experienced 
"negotiations with methodologists").

Even in the much simpler case of someone slipping on a banana peel, it 
still can sometimes make people laugh because, when that joke is used, 
it typically happens to someone you wouldn't expect it to happen to (or 
who you would want it to happen to).

These are probably not the best examples, but they illustrate the point 
that even in very (and deliberately so) ambiguous cases, people are 
still able to discern the actual meaning, and I doubt that computers can 
have the same ability. (Do computers understand jokes?)

In ContextL, things are different. You have to explicitly provide the 
context, and it is always unambiguous what the context is, which is 
different from natural language where you can talk about something 
without providing the context to make other people understand what you 
are talking about. Maybe ContextL is another step to getting there, but 
it's still a long way, as far as I can see.
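
To make the "explicit context" point concrete, here is a minimal sketch
along the lines of the constructs described in the ContextL paper
(DEFLAYER, DEFINE-LAYERED-FUNCTION, DEFINE-LAYERED-METHOD with an
:IN-LAYER qualifier, WITH-ACTIVE-LAYERS; the GREET example itself is
invented, and the exact signatures should be checked against the
ContextL distribution): the context never has to be guessed, because it
is activated explicitly and visibly in the code.

    (deflayer formal)

    (define-layered-function greet (name))

    ;; Default behaviour, active in the root layer.
    (define-layered-method greet (name)
      (format nil "Hi, ~a!" name))

    ;; Behaviour that only applies while the FORMAL layer is active.
    (define-layered-method greet :in-layer formal (name)
      (format nil "Good evening, ~a." name))

    (greet "Pascal")              ; => "Hi, Pascal!"
    (with-active-layers (formal)
      (greet "Pascal"))           ; => "Good evening, Pascal."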

Having said that, some people at our lab are starting to explore how 
automatic / semi-automatic reasoning about context could be added to a 
context-oriented approach. So maybe we can get closer to what you seem 
to have in mind. It's certainly an interesting perspective. Thanks for that.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091520583450878%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-15 17:53:46 -0400, Pascal Costanza <··@p-cos.net> said:

> In ContextL, things are different. You have to explicitly provide the 
> context, and it is always unambiguous what the context is, which is 
> different from natural language where you can talk about something 
> without providing the context to make other people understand what you 
> are talking about. Maybe ContextL is another step to getting there, but 
> it's still a long way, as far as I can see.
> 
> Having said that, some people at our lab are starting to explore how 
> automatic / semi-automatic reasoning about context could be added to a 
> context-oriented approach. So maybe we can get closer to what you seem 
> to have in mind. It's certainly an interesting perspective. Thanks for 
> that.

Yes, that was one direction I thought you were heading when I read your 
article. It seems only natural that one could use ContextL by having an 
even higher level system automatically do the activation of different 
layers based on changing program state, external inputs, passing time, 
etc. I'm  glad to hear that you have colleagues already working on it. 
Once again, really interesting work you're doing.

regards
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432a7715@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-15 05:02:07 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> Your original claim was that programming languages need to be similar 
>> to natural languages. That is a much more specific claim than saying 
>> that computer systems should be suited to human use, which seems to be 
>> what you are saying now.
> 
> 
> "computer systems should be suited to human use" is a vague statement 
> that encompases both ends of the discussion here, so no, it is not what 
> I am saying.
> 
> The discussion here takes for granted that "computer systems should be 
> suited to human use." The question is by moving in which direction:
> 
> A. Human-computer interfaces will progress if they become more like 
> human communication - i.e., more like natural language.
> 
> B. Human-computer interfaces will progress if we ignore natural language 
> and devise new modes of communication more suited to dealing with 
> computers. These new modes of communication will become "suited to human 
> use" because people are cognitively flexible and will learn and adapt to 
> these new modes of communication.
> 
> I have been arguing for A. I think people are far less cognitively 
> flexible than once thought (i.e., we are *not* blank slates). We have 
> definite cognitive strengths and biases and we should play to them by 
> making HCIs more like natural language.

I can still only read your claim as saying that HCIs should be targeted 
towards human use. I do not see where you make the link to natural 
languages. When I program, I am communicating something to the computer 
that is not at all like what I communicate using natural language. If I 
had to communicate the same thing to a human I would likely also prefer 
some special notation or similar, because the things you express with 
natural language and therefore what natural language is suited to 
express are not the same things that you want to express in programming 
a computer.

When I program in Lisp I see it as a way to move beyond the limitations 
of language. When I look at a Lisp program I do not see text, I see a 
structured object far removed from any imaginable natural language 
description of the same thing -- and beneficially so.

> Most recently I have argued that ContextL falls under the description of 
> A rather than B - it adds natural-language-like features to CLOS. The 
> feature it adds - contexts - makes computer systems more "suited to 
> human use" because context is familiar to people from their well 
> practiced and innate natural language communication skills. Contexts do 
> not make computer systems more "suited to human use" by adding something 
> unfamiliar, something more computer-like than natural-language-like, 
> which people will have to learn and adjust to.

When I start a program from a Unix terminal, it will inherit the 
environment from my shell. I can also specifically alter the environment 
for this program. It results in this program having a different 
environment that only it can see, and that ceases to exist when this 
program terminates. This is quite similar to dynamic variables in Lisp, 
which ContextL is also intimately tied to. Only in the case of the Unix 
system, there is no language involved. I agree with you that humans are 
able to act and think in a context dependent manner. I disagree that 
this ability comes from, or has anything to do with natural language.
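
The analogy can be shown in a few lines of plain Common Lisp (no
ContextL involved; *LOG-LEVEL* and CHILD are invented names): rebinding
a special variable affects everything called inside the binding, and the
old value comes back when the binding exits, much like an inherited
environment variable.

    (defvar *log-level* :info)

    (defun child ()
      ;; Sees whatever *LOG-LEVEL* its dynamic "parent" established.
      *log-level*)

    (child)                        ; => :INFO

    (let ((*log-level* :debug))    ; like exporting LOG_LEVEL=debug for a child
      (child))                     ; => :DEBUG

    (child)                        ; => :INFO again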


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091613571437709%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-16 03:41:08 -0400, Björn Lindberg <·····@runa.se> said:

> I agree with you that humans are able to act and think in a context 
> dependent manner.

It seems the only point you're making here is that our ability to act 
in a context dependent manner originated with our pre-verbal ancestors 
and did not originate with language. That is undoubtedly so. But we're 
not talking about *acting* here, we're talking about *symbolic 
communication*.

> I disagree that this ability comes from, or has anything to do with 
> natural language.

Context dependent *symbolic communication* comes from natural language. 
Our pre-verbal ancestors certainly acted in context dependent ways. 
They did *not* engage in context dependent symbolic communication.

The hallmark of the kind of symbolic communication typically required 
to interface with computers (as opposed to symbolic communication with 
fellow human beings a.k.a "natural language") is that computers don't 
"get" context - everything has to be explicitly spelled out for them.
From: Rob Warnock
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <ZqWdnSU15NWHmbPeRVn-tw@speakeasy.net>
Raffael Cavallaro <················@pas-d'espam-s'il-vous-plait-mac.com> wrote:
+---------------
| The hallmark of the kind of symbolic communication typically required 
| to interface with computers (as opposed to symbolic communication with 
| fellow human beings a.k.a "natural language") is that computers don't 
| "get" context - everything has to be explicitly spelled out for them.
+---------------

This was once true, but is no longer completely so. There has been
a *lot* of work in the past few years in the area of speaker-independent
voice recognition (SIVR) and context-dependent speech parsing. Some
of it comes under the rubric of "voice browsers" or "VoiceXML" with
stochastic Nth-order Markov language models ("N-Gram"), e.g.:

    http://www.w3.org/Voice/
    http://www.w3.org/TR/2004/REC-voicexml20-20040316/
    http://www.w3.org/TR/ngram-spec/

Considerable success has also been shown by research such as MIT's
Project "Oxygen":

    http://oxygen.lcs.mit.edu/Overview.html
    http://oxygen.lcs.mit.edu/Speech.html

I don't recall the phone number, but they had a demo running a while
back that was *quite* startling in the amount of "conversational
context" it could make use of. It could do stuff like this [I've
recalled it as best I can, and trimmed its responses heavily for
brevity here]:

    It: How may I help you?
    Me: What's the weather in Boston tomorrow afternoon?
    It: The weather for Boston, MA, for Monday afternoon,
	September 19th, is forecast to be mostly cloudy,
	with a slight chance of rain. ...
    Me: And the temperature?
    It: The temperature for Boston, MA, for Monday afternoon,
        September 19th, is forecast to be 65-75 degrees Fahrenheit.
    Me: What's that in Centigrade?
    It: 65-75 degrees Fahrenheit is 18-24 degrees Celsius.
    Me: How about for Dallas?
    It: The temperature for Dallas, TX, for Monday afternoon,
        September 19th, is forecast to be 29-32 degrees Celsius.
    Me: How about in the evening?
    It: The temperature for Dallas, TX, for Monday evening,
        September 19th, is forecast to be 21-24 degrees Celsius.
    Me: And the weather?
    It: The weather for Dallas, TX, for Monday evening,
	September 19th, is forecast to be clear, with visibility
	in excess of 5 miles.

...and so on. Oh, and if at that point you asked it about "Paris",
it would ask you back, "Do you mean Paris, TX, or Paris, France?"


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091823044216807%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-18 20:19:38 -0400, ····@rpw3.org (Rob Warnock) said:

> This was once true, but is no longer compeletly so. There has been
> a *lot* of work in the past few years in the area of speaker-independent
> voice recognition (SIVR) and context-dependent speech parsing.

No doubt. Unfortunately this is usually considered desirable for user 
applications, not so much so for programmers. Programming languages 
have not taken this route, and we see in this thread active resistance 
on the part of some programmers toward moving this way.

Since the original question of this thread arose in the context of
developing a system that allows domain experts (molecular biologists in
this case, I believe) to program, I think it's appropriate to point out
that there are in all likelihood at least an order of magnitude more
domain experts than programmers. These domain experts - chemists,
physicists, attorneys, physicians, archaeologists, etc. - are for the
most part quite bright and knowledgeable people and would gladly
program for themselves if programming languages were as user friendly
as the telephone weather system you just described. This is why I think
the future will almost certainly see a move toward more user-friendly,
natural-language-like programming systems.


wrt Peter's original question, I think that the "drive to syntax" can
be explained by the fact that people unfamiliar with a thing will tend
to judge it by its appearance. In that regard, a first impression of
Java is one of structure and order. A first impression of Lisp is one
of lack of structure. Non-programmers cannot know just by looking at
the surface syntax that there really is a great deal of structure in
the Lisp code - it just isn't in the surface syntax but rather in the
special forms, built-in functions and macros, nesting, indentation, etc.

Conversely, non-programmers cannot know whether the underlying
semantics of a language are broken simply by looking at the surface
syntax. They can only judge by their experience in other areas, where
ordered superficial appearance means order and structure, and amorphous
superficial appearance means lack of order and lack of structure.
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432ea2ef$1@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-16 03:41:08 -0400, Björn Lindberg <·····@runa.se> said:
> 
>> I agree with you that humans are able to act and think in a context 
>> dependent manner.
> 
> 
> It seems the only point you're making here is that our ability to act in 
> a context dependent manner originated with our pre-verbal ancestors and 
> did not originate with language. That is undoubtedly so. But we're not 
> talking about *acting* here, we're talking about *symbolic communication*.

My point may only be that I do not consider this ability to recognize 
context as having anything to do with natural language, which I 
perceived was your claim all along. But I also don't agree that context
is absent in computer systems or programming languages of today. Quite 
the contrary.

>> I disagree that this ability comes from, or has anything to do with 
>> natural language.
> 
> 
> Context dependent *symbolic communication* comes from natural language. 
> Our pre-verbal ancestors certainly acted in context dependent ways. They 
> did *not* engage in context dependent symbolic communication.

Only in so far as they did not communicate. Why is this an interesting 
discovery?


Björn
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091909073275249%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-19 07:37:19 -0400, Björn Lindberg <·····@runa.se> said:

>> Context dependent *symbolic communication* comes from natural language. 
>> Our pre-verbal ancestors certainly acted in context dependent ways. 
>> They did *not* engage in context dependent symbolic communication.
> 
> Only in so far as they did not communicate. Why is this an interesting 
> discovery?

Discovery? No one claimed it was a "discovery." Rather, I was pointing 
out that your bringing up how non Homo sapiens sapiens *acted* 50,000 
or more years ago is pretty much irrelevant to a discussion of the 
relationship between natural language and *communicating* with 
computers.

This whole discussion has been about what we, modern human beings, are 
familiar with - what comes more naturally to us. We are familiar with 
context in communication because of natural language in use now, today, 
not because of how our remote pre-verbal ancestors may have *acted* 
tens of thousands of years ago.

Here's one:

A: Typing is a good interface because people have evolved to have 
dextrous fingers.

B: Not so - our distant amphibian ancestors had fingers too.

A: Yes, our amphibian ancestors had fingers but they were not able to 
use them in the complex and dextrous way that we humans do. Dextrous 
finger manipulation is familiar to us from our recent human past 
because even though our amphibious ancestors had fingers they did not 
use them dextrously.

B: Only in so far as they did not play the piano.



Our distant non-verbal ancestors did not use language. Thus our 
familiarity with context dependent *communication* comes not from them, 
but from natural language. Therefore we should look to natural language 
for models of context dependent *communication* with computers that 
will be familiar to us, not to the *behavior* of ancient hominids (nor 
dogs, nor cats, nor bears).
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432fccc6@news.cadence.com>
Raffael Cavallaro wrote:
> On 2005-09-19 07:37:19 -0400, Björn Lindberg <·····@runa.se> said:
> 
>>> Context dependent *symbolic communication* comes from natural 
>>> language. Our pre-verbal ancestors certainly acted in context 
>>> dependent ways. They did *not* engage in context dependent symbolic 
>>> communication.
>>
>>
>> Only in so far as they did not communicate. Why is this an interesting 
>> discovery?
> 
> 
> Discovery? No one claimed it was a "discovery." Rather, I was pointing 
> out that your bringing up how non Homo sapiens sapiens *acted* 50,000 or 
> more years ago is pretty much irrelevant to a discussion of the 
> relationship between natural language and *communicating* with computers.
> 
> This whole discussion has been about what we, modern human beings, are 
> familiar with - what comes more naturally to us. We are familiar with 
> context in communication because of natural language in use now, today, 
> not because of how our remote pre-verbal ancestors may have *acted* tens 
> of thousands of years ago.
> 
> Here's one:
> 
> A: Typing is a good interface because people have evolved to have 
> dextrous fingers.
> 
> B: Not so - our distant amphibian ancestors had fingers too.
> 
> A: Yes, our amphibian ancestors had fingers but they were not able to 
> use them in the complex and dextrous way that we humans do. Dextrous 
> finger manipulation is familiar to us from our recent human past because 
> even though our amphibious ancestors had fingers they did not use them 
> dextrously.
> 
> B: Only in so far as they did not play the piano.
> 
> 
> 
> Our distant non-verbal ancestors did not use language. Thus our 
> familiarity with context dependent *communication* comes not from them, 
> but from natural language. Therefore we should look to natural language 
> for models of context dependent *communication* with computers that will 
> be familiar to us, not to the *behavior* of ancient hominids (nor dogs, 
> nor cats, nor bears).

Ok, we will probably not get further here. I consider the ability to
perceive context to be something independent of natural language.
Therefore, even if we create a computer interface based on contexts, it
does not constitute a loan from natural language. I have already
articulated elsewhere why I am skeptical of borrowing from natural
language in the design of programming languages, and in my view it has
not been done successfully so far.


Björn
From: Lars Brinkhoff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <8564t3ooan.fsf@junk.nocrew.org>
Pascal Costanza <··@p-cos.net> writes:
> See http://www.ida.liu.se/~robwe/research-interests.html -
> especially Figure 1 - for some inspiration. The idea is that any
> kind of object can be used as literal data - something that Common
> Lisp already provides - and has a practical user interface. (Not
> just numbers and strings, so to speak.)

Sounds like (CLIM-style) presentations meet structure editors.  Did
Interlisp or Genera do this?  I believe some Scheme implementation does.
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oqnonF745e9U1@individual.net>
Lars Brinkhoff wrote:
> Pascal Costanza <··@p-cos.net> writes:
> 
>>See http://www.ida.liu.se/~robwe/research-interests.html -
>>especially Figure 1 - for some inspiration. The idea is that any
>>kind of object can be used as literal data - something that Common
>>Lisp already provides - and has a practical user interface. (Not
>>just numbers and strings, so to speak.)
> 
> Sounds like (CLIM-style) presentations meet structure editors.  Did
> Interlisp or Genera do this?  I believe some Scheme implementation does.

I don't know, but would certainly be interested to hear more about this.

Which Scheme implementation?


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Lars Brinkhoff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <85wtljn2br.fsf@junk.nocrew.org>
Pascal Costanza wrote:
> Lars Brinkhoff wrote:
> > Pascal Costanza wrote:
> > > See http://www.ida.liu.se/~robwe/research-interests.html -
> > > especially Figure 1 - for some inspiration. The idea is that any
> > > kind of object can be used as literal data - something that
> > > Common Lisp already provides - and has a practical user
> > > interface.
> > Sounds like (CLIM-style) presentations meet structure editors.
> > Did Interlisp or Genera do this?  I believe some Scheme
> > implementation does.
> I don't know, but would certainly be interested to hear more about this.

So would I!  It sounds like one of those cool things a LispM should
be able to do, but I don't recall anyone mentioning such a feature.

> Which Scheme implementation?

I'm sorry, I don't remember.  I think I read about it in a DDJ article
a few years ago, but it could have been some other trade magazine.
From: Lars Brinkhoff
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <85slw7n1uh.fsf@junk.nocrew.org>
Lars Brinkhoff wrote:
> Pascal Costanza wrote:
> > Lars Brinkhoff wrote:
> > > Pascal Costanza wrote:
> > > > See http://www.ida.liu.se/~robwe/research-interests.html -
> > > > especially Figure 1 - for some inspiration. The idea is that any
> > > > kind of object can be used as literal data - something that
> > > > Common Lisp already provides - and has a practical user interface.
> > > I believe some Scheme implementation does [this].
> > Which Scheme implementation?

It doesn't allow any object as literal data, but it was probably
DrScheme:

http://www.plt-scheme.org/software/drscheme/tour/tour-Z-H-5.html
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ou9o3F7i3q6U1@individual.net>
Lars Brinkhoff wrote:
> Lars Brinkhoff wrote:
> 
>>Pascal Costanza wrote:
>>
>>>Lars Brinkhoff wrote:
>>>
>>>>Pascal Costanza wrote:
>>>>
>>>>>See http://www.ida.liu.se/~robwe/research-interests.html -
>>>>>especially Figure 1 - for some inspiration. The idea is that any
>>>>>kind of object can be used as literal data - something that
>>>>>Common Lisp already provides - and has a practical user interface.
>>>>
>>>>I believe some Scheme implementation does [this].
>>>
>>>Which Scheme implementation?
> 
> It doesn't allow any object as literal data, but it was probably
> DrScheme:
> 
> http://www.plt-scheme.org/software/drscheme/tour/tour-Z-H-5.html

Neat, but probably not quite the "real" thing.

For example, allowing any object as literal data could probably solve
the issue with defconstant in Common Lisp, which doesn't allow you to
change the value of a constant if the new value is not eql to the
previous one. Assume you have this:

(defconstant +a+ #.(make-array ...))

If it were possible to change the contents of the array created this
way, you wouldn't need to change its identity. An MVC-based user
interface on top could then allow you to make the changes conveniently.
As a next step, you could then provide all kinds of representations
that deviate from the then-internal s-expression representation (while
still allowing Lisp experts to work with that internal representation).
Some of those representations could, for example, be actual
mathematical notation (the full thing, not just some ASCII-based
simulation).
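
Here is a minimal sketch of the identity problem (the array dimensions
and contents below are purely illustrative):

  ;; First load: the #. form builds an array at read time and makes it
  ;; the value of the constant.
  (defconstant +a+ #.(make-array 3 :initial-element 0))

  ;; Reloading the same file re-reads the #. form and produces a fresh
  ;; array.  The new value is not EQL to the old one, so a conforming
  ;; implementation may signal an error or warn about redefining +A+.
  ;; If the contents of the original array could simply be edited in
  ;; place (say, through the MVC-style interface sketched above), its
  ;; identity, and therefore the constant, would never have to change.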

I guess that's also the idea behind intentional programming, except that 
they don't use Lisp as a starting point.

My impression from the Scheme link is that the pictures are probably 
"just" values, in the sense that they probably don't have object 
identity. And I don't think this would scale.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ovsulF7vqaqU1@individual.net>
Pascal Costanza wrote:
> Lars Brinkhoff wrote:
> 
>> Lars Brinkhoff wrote:
>>
>>> Pascal Costanza wrote:
>>>
>>>> Lars Brinkhoff wrote:
>>>>
>>>>> Pascal Costanza wrote:
>>>>>
>>>>>> See http://www.ida.liu.se/~robwe/research-interests.html -
>>>>>> especially Figure 1 - for some inspiration. The idea is that any
>>>>>> kind of object can be used as literal data - something that
>>>>>> Common Lisp already provides - and has a practical user interface.
>>>>>
>>>>> I believe some Scheme implementation does [this].
>>>>
>>>> Which Scheme implementation?
>>
>>
>> It doesn't allow any object as literal data, but it was probably
>> DrScheme:
>>
>> http://www.plt-scheme.org/software/drscheme/tour/tour-Z-H-5.html
[...]

> My impression from the Scheme link is that the pictures are probably 
> "just" values, in the sense that they probably don't have object 
> identity. And I don't think this would scale.

I have been notified via email that this is not correct.

The editor framework for PLT Scheme has a notion of "snip objects" which 
can present objects on screen. For example, image snips are (probably) 
the things you see in the link above. See 
http://download.plt-scheme.org/doc/299.400/html/mred/mred-Z-H-619.html#node_chap_8

Furthermore, the I/O framework can deal with "special values" that are 
non-serialized values that can be part of input/output streams (if I 
understand correctly). Since they are not serialized, they can have 
object identity. See 
http://download.plt-scheme.org/doc/299.400/html/mzscheme/mzscheme-Z-H-11.html#node_chap_11

I don't really understand this, since I have never used it, but it seems 
to me that those special values allow you to integrate the objects into 
source code that can then be presented via snip objects (or something 
along these lines).

I would still like to additionally see different views of source code 
that can present concepts in different ways. I think that would be 
worthwhile...


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: David Steuber
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87zmqfypbs.fsf@david-steuber.com>
Pascal Costanza <··@p-cos.net> writes:

> See http://www.ida.liu.se/~robwe/research-interests.html - especially
...
> P.S.: Note that I have used a hyperlink in this posting, something
> that definitely goes beyond natural, spoken language. ;)

No, you simply cited another work using a URL.  While a good news
client will recognize the URL and allow you to make a gesture to
retrieve the document in an appropriate viewer, that is not an
inherent function of the ascii text you posted.

Incidentally, I would like a narrower definition of what Lisp is.  It
doesn't seem right to me to claim that Smalltalk is a Lisp or that
Dylan is a Lisp.  If you do that, then perhaps Perl is also a Lisp and
I've already said elsewhere that it isn't.

I don't think the s-exp notation can easily be removed from Lisp
without breaking it.  Lisp code expressed as a Lisp data structure
that can be manipulated by the Lisp environment from read time to run
time seems to be a key property.  LISt Processing seems to be very
much a part of what Lisp is regardless of the other data structures
supported by Lisp.

Once you start down the path of irregular syntax, lexing will forever
dominate your language.  Consume you it will.

-- 
You should maybe check the chemical content of your breakfast
cereal. --- Bill Watterson
From: Thomas F. Burdick
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <xcvslw6keqv.fsf@conquest.OCF.Berkeley.EDU>
David Steuber <·····@david-steuber.com> writes:

> Incidentally, I would like a narrower definition of what Lisp is.  It
> doesn't seem right to me to claim that Smalltalk is a Lisp or that
> Dylan is a Lisp.  If you do that, then perhaps Perl is also a Lisp and
> I've already said elsewhere that it isn't.

A lisp is a language with symbols and anonymous functions, whose users
are familiar with continuations and the Y combinator, whether they use
them or think they're a good idea or not.  Lisp language communities
are plagued by constantly recurring discussions of special operating
systems and/or machines just for their language.

Cool, can I call ST a Lisp now?  :-)

[ But seriously, although I wouldn't call ST a Lisp, the languages are
  very closely related, and the language communities have very similar
  sets of values.  Not siblings, but ST and Lisp are more like one
  another than either is like anything else.  I call Smalltalkers our
  kissing cousins. ]

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | Free Mumia Abu-Jamal! |
     ,--'    _,'   | Abolish the racist    |
    /       /      | death penalty!        |
   (   -.  |       `-----------------------'
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <u7jdifgwm.fsf@nhplace.com>
David Steuber <·····@david-steuber.com> writes:

> Incidentally, I would like a narrower definition of what Lisp is.  It
> doesn't seem right to me to claim that Smalltalk is a Lisp or that
> Dylan is a Lisp.  If you do that, then perhaps Perl is also a Lisp and
> I've already said elsewhere that it isn't.

A chair is something that most would agree has legs and a back and can be
moved around.  Yet one can say to a bunch of campers in the woods when
they come across a clearing that contains only tree stumps and large 
boulders: "Pull up a chair."  The definition of a word changes according
to circumstance.

Countries draw battle lines among them, yet if Earth were threatened from
outer space, it is widely presumed that our commonalities as Terrans would
paper over our differences for long enough to address the problem.  Even the
famous "jelly donut" quote from Kennedy (Google: Kennedy "jelly donut")
shows that the definition of a word can change.

So having a narrower definition for a particular conversation--sure.
But a narrower definition for all purposes?  Uh, that would be
meaningless at best given that at least some of the enumerable
circumstances include "the ability to make sense of prior
conversations already on record", for which you're going to need a
broad definition. :)

But, beyond that, I again remind you that Dylan was something I think
few people would doubt was a Lisp when it started out.  It morphed
into something you would not recognize as a Lisp.  So you need to be
clear.  The 1992 Dylan manual from Apple shows code examples like:
 ? (define-method show-rest (a #rest b)
     (print a)
     (print b)
     #t)
 show-rest
 ? (show-rest 10 20 30 40)
 10
 (20 30 40)
 #t
 ? (show-rest 10)
 10
 ()
 #t
I think you will have a hard time getting a definition of Lisp so narrow
that this is not a Lisp, but that the other things you want to be Lisps are!

> I don't think the s-exp notation can easily be removed from Lisp
> without breaking it.  Lisp code expressed as a Lisp data structure
> that can be manipulated by the Lisp environment from read time to run
> time seems to be a key property.  LISt Processing seems to be very
> much a part of what Lisp is regardless of the other data structures
> supported by Lisp.

Similar claims were made by people who said that lists should be funcallable.
It used to be in older dialects that you could write programs like:

 (defun counter (n)
   (let ((cell '(0)))
     (car (rplaca cell (1+ (car cell))))))

Today, that won't work reliably in all Lisps.  Have we given up Lispness?
Or learned something about the value of separating pages that are mostly
unchanging onto pure pages so they can be shared?

In Maclisp, for example, macros would definitely do things like:

 (defun first macro (form) (rplaca form 'car))
 => FIRST
 (defun foo (x) (first x))
 => FOO
 (get 'foo 'expr)
 => (LAMBDA (X) (FIRST X))
 (foo '(A B C))
 => A
 (get 'foo 'expr)
 => (LAMBDA (X) (CAR X))

That's self-modifying code.  I think Maclisp might have even allowed (in
the interpreter, though I'm sure it wouldn't compile):

 (eval (subst '(undefined) 
              'shared-cell
              '(defun funkall (symbol)
                 (rplaca 'shared-cell symbol)
                 shared-cell)))

Now that was self-modifying code.  But it's hard to compile, and Maclisp
had remarkably different semantics for interpreted and compiled code as
a result.  Did we become non-Lisp to make things compile?

I mention this just to say that not all change is bad.  And while I
wholly agree that there are important things about Lisp, sometimes,
too, time has a way of teaching you that the things you thought were
important were not the really important things.

The extreme case is probably the Lisp Machine, which was originally an
exercise in hardware design.  And yet what I bet most of us miss is
not the hardware, it's the software.  That is, we who wish we still
had Lisp Machines would mostly gladly settle for stock hardware with
equivalently interesting software...  So somewhere along the way we
realized that what we thought we valued, we didn't.

> Once you start down the path of irregular syntax, lexing will forever
> dominate your language.  Consume you it will.

Now this is a more modest claim, and I think there's some virtue in it.
... even if you did have to resort to irregular syntax in order to make
your point. ;)
From: Duane Rettig
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4k6hintuk.fsf@franz.com>
Kent M Pitman <······@nhplace.com> writes:

> David Steuber <·····@david-steuber.com> writes:
>
>> Incidentally, I would like a narrower definition of what Lisp is.  It
>> doesn't seem right to me to claim that Smalltalk is a Lisp or that
>> Dylan is a Lisp.  If you do that, then perhaps Perl is also a Lisp and
>> I've already said elsewhere that it isn't.

I based my own categorizations on what the languages (i.e. their users)
say about themselves.  Dylan claims to come from lisp roots, and I can see
it in the definitions.  Smalltalk makes no such claim.

> A chair is something that most would agree has legs and a back and can be
> moved around.  Yet one can say to a bunch of campers in the woods when
> they come across a clearing that contains only tree stumps and large 
> boulders: "Pull up a chair."  The definition of a word changes according
> to circumstance.

It also changes when one is sitting in a leaderless board meeting, and
everybody is already sitting down....

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ou70gF78k0jU2@individual.net>
Duane Rettig wrote:

> I based my own categorizations on what the languages (i.e. their users)
> say about themselves.  Dylan claims to come from lisp roots, and I can see
> it in the definitions.  Smalltalk makes no such claim.

Alan Kay does (but Smalltalkers usually don't).


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Duane Rettig
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <4br2tr62s.fsf@franz.com>
Pascal Costanza <··@p-cos.net> writes:

> Duane Rettig wrote:
>
>> I based my own categorizations on what the languages (i.e. their users)
>> say about themselves.  Dylan claims to come from lisp roots, and I can see
>> it in the definitions.  Smalltalk makes no such claim.
>
> Alan Kay does (but Smalltalkers usually don't).

Hmm, do you have a reference for this?  The last time I heard him was
in 1998, when he spoke at an OOPSLA conference (I wasn't there - I just
heard the tape), where he mentioned that AMOP was the best-written book
of the decade, but I've never heard him say that ST came from Lisp.
Was that before or after '98?

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Raffael Cavallaro
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <2005091600444877923%raffaelcavallaro@pasdespamsilvousplaitmaccom>
On 2005-09-16 00:26:35 -0400, Duane Rettig <·····@franz.com> said:

> Pascal Costanza <··@p-cos.net> writes:
> 
>> Duane Rettig wrote:
>> 
>>> I based my own categorizations on what the languages (i.e. their users)
>>> say about themselves.  Dylan claims to come from lisp roots, and I can see
>>> it in the definitions.  Smalltalk makes no such claim.
>> 
>> Alan Kay does (but Smalltalkers usually don't).
> 
> Hmm, you have a reference for this?

Well Alan Kay acknowledges a great debt to lisp in his "The Early 
History of Smalltalk," especially from section 12 on:

<http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html#12>

for example:

"One little incident of LISP eauty happened when Allen Newell visited 
PARC with his theory of hierarchical thinking and was challenged to 
prove it. He was given a programming problem to solve while the 
protocol was collected. The problem was: given a list of items, produce 
a list consisteing of all of the odd indexed items followed by all of 
the even indexed items. Newel's internal programming langage resembple 
IPL-V in which pointers are manipulated explicitly, and he got into 
quite a struggle to do the program. In 2 seconds I wrote down:
oddsEvens(x) = append(odds(x), evens(x))
the statement of the problem in Landin's LISP syntax--and also the 
first part of the solution. Then a few seconds later:
where odds(x) = if null(x) v null(tl(x)) then x
                   else hd(x) & odds(ttl(x))
     evens(x) = if null(x) v null(tl(x)) then nil
                   else odds(tl(x))
This characteristic of writing down many solutions in declarative form 
and have them also be the programs is part of the appeal and beauty of 
this kind of language. Watching a famous guy much smarter than I 
struggle for more than 30 minutes to not quite solve the problem his 
way (there was a bug) made quite an impression. It brought home to me 
once again that "point of view is worth 80 IQ points." I wasn't smarter 
but I had a much better internal thinking tool to amplify my abilities. 
This incident and others like it made paramount that any tool for 
children should have great thinking patterns and deep beeauty 
"built-in.""

[sic] - cut and pasted so the typos are in the original html version.

regards
From: Duane Rettig
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <47jdhqzvq.fsf@franz.com>
Raffael Cavallaro <················@pas-d'espam-s'il-vous-plait-mac.com> writes:

> On 2005-09-16 00:26:35 -0400, Duane Rettig <·····@franz.com> said:
>
>> Pascal Costanza <··@p-cos.net> writes:
>>
>>> Duane Rettig wrote:
>>>
>>>> I based my own categorizations on what the languages (i.e. their users)
>>>> say about themselves.  Dylan claims to come from lisp roots, and I can see
>>>> it in the definitions.  Smalltalk makes no such claim.
>>> Alan Kay does (but Smalltalkers usually don't).
>> Hmm, you have a reference for this?
>
> Well Alan Kay acknowledges a great debt to lisp in his "The Early
> History of Smalltalk," especially from section 12 on:
>
> <http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html#12>

Thanks; this seems to be a good example of Kay's thinking.  However,
if you read it closely, you find (at least I do) what I've always
known, to a certain extent: that Alan Kay has always been greatly
influenced by Lisp, but he was always after something more - a few
phrases that appear in that document are "the perfect language" and
"Lisp done right".  He also based the most basic underlying features on
message passing, as you can see from the 6 main ideas (a sidebar about
halfway through the paper whose first item is "Everything is an object").

There is a lot to say for Kay being influenced by Lisp.  But I still
don't find writing like this to be compelling evidence that the author
of ST claims that it came from Lisp roots.  Perhaps there is another
location where he comes right out and says it...

-- 
Duane Rettig    ·····@franz.com    Franz Inc.  http://www.franz.com/
555 12th St., Suite 1450               http://www.555citycenter.com/
Oakland, Ca. 94607        Phone: (510) 452-2000; Fax: (510) 452-0182   
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ov9f9F7pn7fU1@individual.net>
Duane Rettig wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>Duane Rettig wrote:
>>
>>
>>>I based my own categorizations on what the languages (i.e. their users)
>>>say about themselves.  Dylan claims to come from lisp roots, and I can see
>>>it in the definitions.  Smalltalk makes no such claim.
>>
>>Alan Kay does (but Smalltalkers usually don't).
> 
> Hmm, you have a reference for this?  Last I heard him was 1998,
> when he spoke at an Oopsla conference (I wasn't there - I just heard
> the tape) where he mentioned that AMOP was the best written book
> of the decade,  but I've never heard him say that ST came from Lisp.
> Was that before or after '98?

I have two references:

"We had two ideas, really. One of them we got from Lisp: late binding. 
The other one was the idea of objects." - from 
http://acmqueue.com/modules.php?name=Content&pa=showpage&pid=273&page=2

"My original experiments with this architecture were done using a model 
I adapted from van Wijngaarten's and Wirth's "Generalization of Algol" 
and Wirth's Euler. Both of these were rather LISP-like but with a more 
conventional readable syntax. I didn't understand the monster LISP idea 
of tangible metalanguage then, but got kind of close with ideas about 
extensible languages draw from various sources, including Irons' IMP.

The second phase of this was to finally understand LISP and then using 
this understanding to make much nicer and smaller and more powerful and 
more late bound understructures. [...]

The original Smalltalk at Xerox PARC came out of the above." - from 
http://userpage.fu-berlin.de/~ram/pub/pub_jf47hxhHHt/doc_kay_oop_en

The first is from 2004, the second from 2003.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Harald Hanche-Olsen
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <pcowtli6sv8.fsf@shuttle.math.ntnu.no>
+ Kent M Pitman <······@nhplace.com>:

| David Steuber <·····@david-steuber.com> writes:
| 
| > Once you start down the path of irregular syntax, lexing will forever
| > dominate your language.  Consume you it will.
| 
| Now this is a more modest claim, and I think there's some virtue in it.
| ... even if you did have to resort to irregular syntax in order to make
| your point. ;)

Hm, I thought that was an obvious Yodaism, perhaps with the intent to
lend the statement an air of profundity.

-- 
* Harald Hanche-Olsen     <URL:http://www.math.ntnu.no/~hanche/>
- Debating gives most of us much more psychological satisfaction
  than thinking does: but it deprives us of whatever chance there is
  of getting closer to the truth.  -- C.P. Snow
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uu0glzyd1.fsf@nhplace.com>
Harald Hanche-Olsen <······@math.ntnu.no> writes:

> + Kent M Pitman <······@nhplace.com>:
> 
> | David Steuber <·····@david-steuber.com> writes:
> | 
> | > Once you start down the path of irregular syntax, lexing will forever
> | > dominate your language.  Consume you it will.
> | 
> | Now this is a more modest claim, and I think there's some virtue in it.
> | ... even if you did have to resort to irregular syntax in order to make
> | your point. ;)
> 
> Hm, I thought that was an obvious Yodaism, perhaps with the intent to
> lend the statement an air of profundity.

And I took it for that.  But I wanted to note that what distinguishes
Yoda is as much his use of syntax as his alleged profundity...
From: David Steuber
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <877jdggs5b.fsf@david-steuber.com>
Kent M Pitman <······@nhplace.com> writes:

> Harald Hanche-Olsen <······@math.ntnu.no> writes:
> 
> > + Kent M Pitman <······@nhplace.com>:
> > 
> > | David Steuber <·····@david-steuber.com> writes:
> > | 
> > | > Once you start down the path of irregular syntax, lexing will forever
> > | > dominate your language.  Consume you it will.
> > | 
> > | Now this is a more modest claim, and I think there's some virtue in it.
> > | ... even if you did have to resort to irregular syntax in order to make
> > | your point. ;)
> > 
> > Hm, I thought that was an obvious Yodaism, perhaps with the intent to
> > lend the statement an air of profundity.
> 
> And I took it for that.  But I wanted to note that what distinguishes
> Yoda is as much his use of syntax as his alleged profundity...

A Yodaism it was.  However, I offer as examples of the validity of the
statement C++, Objective-C++, Perl, and any language where the lack of
a comma changes everything (possibly beyond simply being a syntax
error).

Perl tries (and succeeds) to be terse.  Ironically I've been bitten by
leaving out commas in lists because I got used to not typing them in
my Lisp code.  When you think about it, there are plenty of cases in
Perl where the commas are redundant syntactical elements.  You could
even say the same thing for C:

printf ("%d is a number\n", 42);

Why is that comma required?  I suppose you might have a function in
there that returns a format string:

printf (func_returning_format_string(a,b,c), x,y,z);

I'm not saying all syntax is bad.  But some of it does seem
superfluous.

-- 
You should maybe check the chemical content of your breakfast
cereal. --- Bill Watterson
From: Kent M Pitman
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <uu0gk81sh.fsf@nhplace.com>
David Steuber <·····@david-steuber.com> writes:

> Perl tries (and succeeds) to be terse.  Ironically I've been bitten by
> leaving out commas in lists because I got used to not typing them in
> my Lisp code.  When you think about it, there are plenty of cases in
> Perl where the commas are redundant syntactical elements.  You could
> even say the same thing for C:

Heh.  Yeah. I fussed a bunch when "," was usurped in Maclisp for
use by backquote.  Prior to backquote, comma was whitespace and you
could write a list of ordered pairs as '(3,4) to mean '(3 4).  I thought
the loss of this notation was major sadness.

> printf ("%d is a number\n", 42);
> 
> Why is that comma required?

Remember what I quoted earlier from my class with Prof. Bill Martin,
recently also indirectly quoted by Sashank Varma and (mis)attributed
to me: Syntax is needed not to say what's obvious, but what's not
obvious.  The example Martin used to give was that if all you needed
to do was talk about what mice eat, you could just say "mice cheese
eat" or "eat mice cheese" and people would get your drift just from
the nouns.  But syntax is necessary for the clarity of expression that
comes from having "cheese eat mice" express something UNexpected,
because the syntax clearly says that the cheese is what is doing the
action and the mice are the object.

So working back from your example, what you want is not to say "why
is it required here?" but rather "what else could I say that might
highlight the usefulness of a separator?", which draws you to issues
of parse barriers--things that terminate parses.  

In Maclisp, |foo| was a symbol and if you did |foo||bar| you got
two symbols.  In CL, we made a change to make |foo||bar| be a single
symbol |foobar|.  We did not, incidentally, make the corresponding
change to "...", thank goodness.
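
For instance, at a Common Lisp REPL (just a quick illustration):

  (symbol-name '|foo||bar|)  ; => "foobar", i.e. a single symbol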

But C does have this "feature", which I sometimes use to introduce
newlines and get indentation. That is:

    "this string is too long, "
    "so i have continued it on a second line"

Consequently, the reason you need the comma in your example is,
by construction, to protect you from being confused by:

   printf("%d isn't a string\n" NINE);

where you might think you were going to get

   #define NINE 9

but you might instead have gotten

   #define NINE "it's an int.\n", 9

> I'm not saying all syntax is bad.  But some of it does seem
> superfluous.

Until you look around at the full space of things that _might_ be said.

RTF is full of this, if you've ever played around with it at the source
level.  It has some dorky token (I can't even remember any more what
the token is) that is required in all kinds of places and it seems to
serve no purpose.  But eventually I figured out that its purpose is to
make sure that certain unwanted parses don't happen.  You don't understand
it in isolation--you understand it as part of a cooperating system.
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3ou93lF7kounU1@individual.net>
David Steuber wrote:

> Pascal Costanza <··@p-cos.net> writes:
> 
>>See http://www.ida.liu.se/~robwe/research-interests.html - especially
> 
> ...
> 
>>P.S.: Note that I have used a hyperlink in this posting, something
>>that definitely goes beyond natural, spoken language. ;)
> 
> No, you simply cited another work using a URL.  While a good news
> client will recognize the URL and allow you to make a gesture to
> > retrieve the document in an appropriate viewer, that is not an
> inherent function of the ascii text you posted.

The meanings of the ascii texts we post are also not an inherent 
function of those texts. They are effectively just bit patterns that we 
need to interpret. They are (in this newsgroup) typically unintelligible 
to non-English speakers. Probably even more so to people who don't know 
the Latin alphabet.

But maybe I am missing something and you wanted to get at something else?


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: David Steuber
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <871x3ogrq9.fsf@david-steuber.com>
Pascal Costanza <··@p-cos.net> writes:

> David Steuber wrote:
> 
> > Pascal Costanza <··@p-cos.net> writes:
> >
> >>See http://www.ida.liu.se/~robwe/research-interests.html - especially
> > ...
> >
> >>P.S.: Note that I have used a hyperlink in this posting, something
> >>that definitely goes beyond natural, spoken language. ;)
> > No, you simply cited another work using a URL.  While a good news
> > client will recognize the URL and allow you to make a gesture to
> > retrieve the document in an appropriate viewer, that is not an
> > inherent function of the ascii text you posted.
> 
> The meanings of the ascii texts we post are also not an inherent
> function of those texts. They are effectively just bit patterns that
> we need to interpret. They are (in this newsgroup) typically
> unintelligible to non-English speakers. Probably even more so to
> people who don't know the Latin alphabet.
> 
> But maybe I am missing something and you wanted to get at something else?

All I was saying is that you cited another source much like anyone
would do in a paper or speech.  You did not go beyond natural speech
any more than a lawyer or scholar citing some other source to support
a point of view.

The underlying mechanics of how what you typed on your keyboard got
displayed on my screen are irrelevant.

-- 
You should maybe check the chemical content of your breakfast
cereal. --- Bill Watterson
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3pb5bhF9j24qU1@individual.net>
David Steuber wrote:

> All I was saying is that you cited another source much like anyone
> would do in a paper or speech.  You did not go beyond natural speech
> any more than a lawyer or scholar citing some other source to support
> a point of view.
> 
> The underlying mechanics of how what you typed on your keyboard got
> displayed on my screen are irrelevant.

Hm, I have been thinking about this for a while now, and I can't seem
to find a good response. I think it does go beyond natural language,
for example because you can click on the link and get immediately to
the site that is referenced. In spoken language you cannot do this; in
books you can only do it within a book; but with the internet you can
do it worldwide.

It's not a qualitative but a quantitative difference, maybe, but I think 
it matters.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87k6hk5q5w.fsf@thalassa.informatimago.com>
Pascal Costanza <··@p-cos.net> writes:
> If people want to waste their time on wondering which kinds of
> orderings of brackets, braces, parentheses, semicolons, colons, dots,
> commas, dollars, ampersands, at signs, backward slashes, forward
> slashes, dashes, underscores, etc., are more "natural", they can of
> course go ahead and do so. But I am just not interested.

Me neither, and that's why I settled on Lisp, after my juvenile search
for a "perfect" language (syntax).

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
-----BEGIN GEEK CODE BLOCK-----
Version: 3.12
GCS d? s++:++ a+ C+++ UL++++ P--- L+++ E+++ W++ N+++ o-- K- w--- 
O- M++ V PS PE++ Y++ PGP t+ 5+ X++ R !tv b+++ DI++++ D++ 
G e+++ h+ r-- z? 
------END GEEK CODE BLOCK------
From: Peter Seibel
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m2r7bro659.fsf@gigamonkeys.com>
Pascal Costanza <··@p-cos.net> writes:

> If people want to waste their time on wondering which kinds of
> orderings of brackets, braces, parentheses, semicolons, colons,
> dots, commas, dollars, ampersands, at signs, backward slashes,
> forward slashes, dashes, underscores, etc., are more "natural", they
> can of course go ahead and do so. But I am just not interested.

As the originator of this thread I somehow feel compelled to respond
to Pascal's disdain for the topic. (Though I'm sure Pascal didn't mean
anything personal by it, as we've always gotten along well.) And I
pretty much agree that the topic of bracket flavors and so on is pretty
uninteresting if the question is what the way forward to a better
general-purpose programming language looks like.

But that's not the only question. At the moment, I'm working on a
project whose goal is to enable a group of people (biologists) who
typically do not program to learn to program so they can get a grip on
the huge amount of data they have available to them as a consequence
of Moore's law and gene sequencing and so on. Now, if I could sit down
with each biologist individually and teach them to program I'm pretty
confident that teaching them a Lisp-based domain specific language
would not be too hard. It's sure a heck of a lot easier than the
wet-work procedures they do in their labs. However in order to do the
most good we need to provide a tool that can draw in more biologists
than I have time to sit down with individually so we need to make the
tool appealing to someone who's maybe experiencing it without a human
guide.

At any rate, in that context, some of the biologists we are working
with seem to have the "drive to syntax" that I'm talking about. They
find it easier to understand code (and believe other non-programmer
biologists will also find it easier to understand code) that has
syntax such as foo[i, j] for array access rather than (aref foo i j)
or (@ foo i j) or even, to steal an idea from Arc and, I think, Franz
Lisp, (foo i j). Now this may be because the biologists we're dealing
with have some programming experience thus already have the C brain
damage. But maybe not. Anyway, that was the motivation for my original
question.
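
For what it's worth, here is a minimal Common Lisp sketch of one way to
meet them partway at the reader level. The bracket form below is prefix
rather than the infix foo[i, j] they asked for, and none of this is
code from our actual project:

  ;; Make [a i j] read as (AREF A I J), following the standard
  ;; READ-DELIMITED-LIST idiom for bracketed syntax.
  (defun bracket-reader (stream char)
    (declare (ignore char))
    `(aref ,@(read-delimited-list #\] stream t)))

  (set-macro-character #\[ #'bracket-reader)
  (set-macro-character #\] (get-macro-character #\)) nil)

  ;; With that in effect, [table 1 2] is just another way of writing
  ;; (aref table 1 2), so (setf [table 1 2] 'x) also works.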

-Peter

P.S. Personally I don't think syntax is the hard thing about
programming, and I think it's quite likely that too much syntax--even
if they claim they like/want it--will decrease the probability of the
non-programmer biologists becoming programmers, as it will exacerbate
the *real* problems of programming down the line and make the overall
experience more frustrating. But it still seems worthwhile to
understand why these folks *want* it even if in the end I will refuse
to give it to them. ;-)

-- 
Peter Seibel           * ·····@gigamonkeys.com
Gigamonkeys Consulting * http://www.gigamonkeys.com/
Practical Common Lisp  * http://www.gigamonkeys.com/book/
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3orohpF7dl1qU1@individual.net>
Peter Seibel wrote:

> As the originator of this thread I somehow feel compelled to Pascal's
> disdain of the topic. (Though I'm sure Pascal didn't mean anything
> personal by it as we've always gotten along well.)

Did I sound that harsh? ;)

> But that's not the only question. At the moment, I'm working on a
> project who's goal is to enable a group of people (biologists) who
> typically do not program to learn to program so they can get a grip on
> the huge amount of data they have available to them as a consequence
> of Moore's law and gene sequencing and so on. Now, if I could sit down
> with each biologist individually and teach them to program I'm pretty
> confident that teaching them a Lisp-based domain specific language
> would not be too hard. It's sure a heck of a lot easier than the
> wet-work procedures they do in their labs. However in order to do the
> most good we need to provide a tool that can draw in more biologists
> than I have time to sit down with individually so we need to make the
> tool appealing to someone who's maybe experiencing it without a human
> guide.

I think that situation is comparable to the music software user 
interfaces I have briefly mentioned. You're dealing with people who have
already acquired certain skills and a certain toolset, including 
commonly used notations within their field. Of course it makes people 
more comfortable if they recognize something similar to what they 
already know. The need to invest time to learn something that is 
seemingly unrelated to what one actually wants to do must appear as a 
waste of time, at least in the beginning.

I think these things will only change over time, and probably need 
changes in education.


Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Robert Uhl
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m34q8nuzlm.fsf@4dv.net>
Pascal Costanza <··@p-cos.net> writes:
>
>> Here's another thought about why syntax is nice: we are, I think,
>> better at recognising glyphs than words.  So the glyph + is better
>> than the word 'plus', and a glyph sequence [x] is better than the
>> word 'aref.'  Perhaps I'm wrong, though.
>
> Here's another thought about why syntax is nice: we are, I think,
> better at inventing ever new excuses not to learn new things. So the
> sun orbiting around the earth is better than the earth orbiting around
> the sun, and caveman painting is better than communicating via words
> on newsgroups. Perhaps I'm wrong, though.

It was a serious idea, though, and one which you didn't really refute.
Obviously lots of folks like syntax and glyphs--I must admit that, as
much as I like Lisp, I sometimes miss foo[3] vs. (aref foo 3).

Syntax helps contain verbosity by giving phrase-shape a role to play;
glyphs help contain verbosity by condensing words into characters.
These are not necessarily bad things.

The advantages of code-as-lists appear to outweigh the advantages of
syntax & glyphs, but that doesn't mean that the latter have no
advantages.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
I believe in life, and I also believe in love, but the world in which
I live in keeps trying to prove me wrong.                 --P. Weller
From: Pascal Costanza
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oro3pF7f46fU1@individual.net>
Robert Uhl wrote:

> It was a serious idea, though, and one which you didn't really refute.

Sure, because it cannot be refuted. It's all based heavily on 
assumptions that can neither be proven nor disproven, at least not in an 
absolute sense. That's the case for both sides.

It just seems to me that some people try to devise explanations for
what may just as well be an accident, or caused by the force of habit.

Pascal

-- 
OOPSLA'05 tutorial on generic functions & the CLOS Metaobject Protocol
++++ see http://p-cos.net/oopsla05-tutorial.html for more details ++++
From: Christopher C. Stacy
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <ull1zs15b.fsf@news.dtpq.com>
Pascal Costanza <··@p-cos.net> writes:

> Robert Uhl wrote:
> 
> > It was a serious idea, though, and one which you didn't really refute.
> 
> Sure, because it cannot be refuted. It's all based heavily on
> assumptions that can neither be proven nor disproven, at least not in
> an absolute sense. That's the case for both sides.
> 
> It just seems to me that some people try to devise explanations for
> what as well may be just an accident, or caused by the force of habit.

I dunno...when I'm walking along and come across something
that looks like ($l=join("",<>))=~s/.*\n/index($`,$&)>=$[||print$&/ge;
I have to pause to wonder whether it was Intelligently Designed or not.



Oh -- sorry! Wrong flame group...
From: Joe Marshall
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <ek7qckpo.fsf@alum.mit.edu>
······@news.dtpq.com (Christopher C. Stacy) writes:

> Pascal Costanza <··@p-cos.net> writes:
>
>> Robert Uhl wrote:
>> 
>> > It was a serious idea, though, and one which you didn't really refute.
>> 
>> Sure, because it cannot be refuted. It's all based heavily on
>> assumptions that can neither be proven nor disproven, at least not in
>> an absolute sense. That's the case for both sides.
>> 
>> It just seems to me that some people try to devise explanations for
>> what as well may be just an accident, or caused by the force of habit.
>
> I dunno...when I'm walking along and come across something
> that looks like ($l=join("",<>))=~s/.*\n/index($`,$&)>=$[||print$&/ge;
> I have to pause to wonder whether it was Intelligently Designed or not.

That code does seem to exhibit irreducible complexity, but I have a real
problem using `intelligent' or `design' here.
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87oe6w5qeb.fsf@thalassa.informatimago.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:
> Here's another thought about why syntax is nice: we are, I think, better
> at recognising glyphs than words.  So the glyph + is better than the
> word 'plus', and a glyph sequence [x] is better than the word 'aref.'
> Perhaps I'm wrong, though.

The success of APL and Chinese proves it.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/
-----BEGIN GEEK CODE BLOCK-----
Version: 3.12
GCS d? s++:++ a+ C+++ UL++++ P--- L+++ E+++ W++ N+++ o-- K- w--- 
O- M++ V PS PE++ Y++ PGP t+ 5+ X++ R !tv b+++ DI++++ D++ 
G e+++ h+ r-- z? 
------END GEEK CODE BLOCK------
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oq4a5F6o57cU2@individual.net>
Pascal Bourguignon wrote:
> Robert Uhl <·········@NOSPAMgmail.com> writes:
>> Here's another thought about why syntax is nice: we are, I think, better
>> at recognising glyphs than words.  So the glyph + is better than the
>> word 'plus', and a glyph sequence [x] is better than the word 'aref.'
>> Perhaps I'm wrong, though.
> 
> The success of APL and Chinese proves it.

APL is dead (don't know about J), Chinese got simplified, and many 
people have trouble reading a Chinese newspaper...

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)
From: Robert Uhl
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m3zmqftksk.fsf@4dv.net>
Pascal Bourguignon <····@mouse-potato.com> writes:
>
>> Here's another thought about why syntax is nice: we are, I think,
>> better at recognising glyphs than words.  So the glyph + is better
>> than the word 'plus', and a glyph sequence [x] is better than the
>> word 'aref.'  Perhaps I'm wrong, though.
>
> The success of APL and Chinese proves it.

Even words are formed out of glyphs...

There's a place for glyphs, e.g. in math.  So is programming enough like
math that glyphs are useful, or different enough that they are a
hindrance?

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
There is no female Mozart because there is no female Jack the Ripper.
                                                    --Camille Paglia
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87slw7z3uv.fsf@thalassa.informatimago.com>
Robert Uhl <·········@NOSPAMgmail.com> writes:

> Pascal Bourguignon <····@mouse-potato.com> writes:
>>
>>> Here's another thought about why syntax is nice: we are, I think,
>>> better at recognising glyphs than words.  So the glyph + is better
>>> than the word 'plus', and a glyph sequence [x] is better than the
>>> word 'aref.'  Perhaps I'm wrong, though.
>>
>> The success of APL and Chinese proves it.
>
> Even words are formed out of glyphs...
>
> There's a place for glyphs, e.g. in math.  So is programming enough like
> math that glyphs are useful, or different enough that they are a
> hindrance?

It's a question of input devices.  If we had a Wacom tablet+stylus
instead of keyboard+mouse, or, now that LCDs are becoming common, a
horizontal desk-sized screen/tablet+stylus, more computers could use
gesture and glyph user interfaces.

http://www.itwriting.com/tablet.php
http://www.wacom.com/productinfo/intuos.cfm


Even better, Xerox prototyped user interfaces based on cameras and
images projected from above onto your normal paper and pen.  What you
write in your notebook is viewed, analyzed and integrated into the
database, and can be projected back, formatted and with additional
information.


But really, I prefer typing my programs on a keyboard to writing them
on paper.  I did that when learning ASM 360 and COBOL with pencil and
eraser, so I know which is better!

The only positive improvement I can envision in this domain is a
mind-reading computer that would send to (an) emacs the text and editing
commands I think of.


-- 
"Klingon function calls do not have "parameters" -- they have
"arguments" and they ALWAYS WIN THEM."
From: ······@earthlink.net
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126743681.716783.184720@g49g2000cwa.googlegroups.com>
Pascal Bourguignon wrote:
> It's a question of input devices.  If we had wacom tablet+stylus
> instead of keyboard+mouse, and even now that we LCD is becoming
> common, horizontal desk-sized screen/tablet+stylus,  more computers
> could use gesture and glyph user interfaces.

It's a question of input devices only if glyph input time is
significant.  I doubt that it is compared to "figuring out which
glyphs to write" time.  I suspect that both are dwarfed by "realizing
what was written" time.  (The latter can be a big part of "understanding
how it's wrong" time.)
From: Thomas F. Burdick
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <xcvvf12kf3h.fsf@conquest.OCF.Berkeley.EDU>
······@earthlink.net writes:

> Pascal Bourguignon wrote:
> > It's a question of input devices.  If we had wacom tablet+stylus
> > instead of keyboard+mouse, and even now that we LCD is becoming
> > common, horizontal desk-sized screen/tablet+stylus,  more computers
> > could use gesture and glyph user interfaces.
> 
> It's a question of input devices only if glyph input time is
> significant.  I doubt that it is compared to "figuring out which
> glyphs to write" time.  I suspect that both are dwarfed by "realizing
> what was written" time.  (The latter can be a big part of
> "understanding
> how it's wrong" time.)

Heh, I pity the fool that has to figure out whether I wrote gamma,
phi, v, y, or Upsilon, if I was in a hurry, with no clear context
where only one makes sense.

-- 
           /|_     .-----------------------.                        
         ,'  .\  / | Free Mumia Abu-Jamal! |
     ,--'    _,'   | Abolish the racist    |
    /       /      | death penalty!        |
   (   -.  |       `-----------------------'
   |     ) |                               
  (`-.  '--.)                              
   `. )----'                               
From: Philippe Lorin
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <432809f4$0$31689$626a14ce@news.free.fr>
Robert Uhl wrote:
> Here's another thought about why syntax is nice: we are, I think, better
> at recognising glyphs than words.  So the glyph + is better than the
> word 'plus', and a glyph sequence [x] is better than the word 'aref.'
> Perhaps I'm wrong, though.

It was already pointed out in this thread, but apparently it ought to be 
made clearer: the distinction between "+" and "plus" does not belong to 
syntax; neither does the evolution of Chinese characters. Syntax deals 
with the arrangement of symbols above the morphological level -- symbols 
we call "words" in linguistics, and "tokens" in computer science. Of 
course the line between morphology and syntax is not clear-cut, but 
characters do not even belong to the morphological level; they're at the 
same level as phonemes (when considering written language as an 
autonomous language).  Syntax is not about characters, in either 
linguistics or computer science -- unless you're willing to count 
token-formation rules as syntax (they're part of the formal grammar, 
although usually implemented separately); but there Lisp is not that 
different from other languages.

This may sound like nitpicking, but I think we have two very different 
issues here, when we say non-Lispers "want syntax": do they want funny 
characters, or do they want grammar rules?

I for one wanted funny characters. Not as in C's "!" or "||", but "{" 
and "[". These characters are useful because of their shape: they're our 
means of drawing boxes to enclose things in semi-linear writing. 
Variations like "(", "[" and "{" are just like color 
that says something about the box. Now I'm realizing that we can have 
this in Lisp without forcing it into the language, with the help of a 
proper editor. I'd still love real rectangles sometimes, but I'm not 
sure we could handle them with a keyboard.
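
For anyone who does want the brackets in the language itself rather than
just in the editor, a standard reader macro can supply them.  A minimal,
purely illustrative sketch -- not something I'm advocating, and note that
it gives [v 3] rather than v[3]:

;; Make [v 3] read as (AREF V 3).
(set-macro-character #\[
  (lambda (stream char)
    (declare (ignore char))
    (cons 'aref (read-delimited-list #\] stream t))))
;; Treat ] like ) so READ-DELIMITED-LIST stops at it.
(set-macro-character #\] (get-macro-character #\)))

;; (let ((v (vector 10 20 30))) [v 1])  => 20

Whether that's a good idea is a separate question; the point is only that
the box-drawing characters are cheap to get if you really want them.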

I think you're right that "+" is more easily read than "plus". This is 
not true of all funny characters, though; I think "+" has the advantage 
of being learned at school, and used everywhere in real life. But not 
many characters are that lucky, and few are linked to a meaning we can 
readily use in computer languages.

One often-neglected problem, beyond readability, is the communicability 
of the characters and how easily they can be memorized.  It's important 
to be able to speak a character, as well as write and type it.

I also wanted syntax like that I was used to in math: infix "+" for 
instance. I must admit it felt natural to me. Now I'm ashamed and I 
wonder why prefix isn't the preferred way ;) .

Some characters "jump" at the eye like no others. This is a big 
difference between letters and funny characters (the distinction 
possibly being relative to one's mother tongues and culture). Also, 
we're less likely to want to build identifiers out of funny characters 
(except isolated). Thus the density of information in things like C's:
*ptr
is possible only using funny characters built into the lexer. I find 
this more readable than something like:
(deref ptr)
Lisp had better have strong reasons for preferring the wordier form. 
Newbie though I am, I believe it does; this has been discussed at 
length here now and then.

Lisp looks like it has no structure.  This is uncomfortable for a 
programming language.  It's like Latin written entirely in majuscules with no 
spaces. We expect computer stuff to be organized, and organization to be 
represented by lines, dashes, bullets, arrows, boxes -- not words, 
because words are what we are trying to organize. You put your socks in 
a drawer, not a sock; and there's no need for meta-socks. In Lisp most 
things are words just like any other, or worse, mere positions, which we 
cannot see correctly until we know the words' meanings.

PS (more nitpicking): Robert, I suspect you're confusing "glyphs" and 
"characters". "Words" and "glyphs" are not at the same level; a word may 
be represented by a glyph; there is no opposition between the two.  The 
"+" you're talking about is not a "glyph"; it's simply a character.  A 
glyph is a drawing that can be interpreted as a character; the same 
character can be represented by various glyphs, for instance in 
different fonts or in the handwriting of different people.  Moreover, the 
same glyph can represent different characters in different contexts.  I 
cannot insert a glyph here since I don't know which font you're using to 
view this text, but chances are the following is rendered with a glyph 
that reads as the second letter of the roman alphabet to an English 
speaker, but as the third letter of the cyrillic alphabet to a Russian 
speaker: B. Two characters that have nothing in common.

Thus "aref", or "(aref x y)" is as much (not) a glyph sequence as "[x]".
From: Pascal Bourguignon
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <87oe6vc1fm.fsf@thalassa.informatimago.com>
Philippe Lorin <············@gmail.com> writes:
> but chances are the
> following is rendered with a glyph that reads as the second letter of
> the roman alphabet to an English speaker, but as the third letter of
> the cyrillic alphabet to a Russian speaker: B. Two characters that
> have nothing in common.

That would be B and В ;-) but you're right.

-- 
__Pascal Bourguignon__                     http://www.informatimago.com/

The world will now reboot.  don't bother saving your artefacts.
From: Robert Uhl
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <m3psrbtjc9.fsf@4dv.net>
Philippe Lorin <············@gmail.com> writes:
>
> It was already pointed out in this thread, but apparently it ought to
> be made clearer: the distinction between "+" and "plus" does not
> belong to syntax; neither does the evolution of Chinese characters.

Oh, you're certainly right.  I plead being tired, and also thinking of
the more general question, which is why languages with more syntax and
special operators than lisp seem to be so much more popular.

-- 
Robert Uhl <http://public.xdi.org/=ruhl>
There seems to be a feeling among the people who provide email
services that somehow people are clamouring for the increased file size
and ugly background pattern options that only MIME can provide.
                                                --David Shayne
From: Björn Lindberg
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <43293ae9$1@news.cadence.com>
Robert Uhl wrote:
> Philippe Lorin <············@gmail.com> writes:
> 
>>It was already pointed out in this thread, but apparently it ought to
>>be made clearer: the distinction between "+" and "plus" does not
>>belong to syntax; neither does the evolution of Chinese characters.
> 
> 
> Oh, you're certainly right.  I plead being tired, and also thinking of
> the more general question, which is why languages with more syntax and
> special operators than lisp seem to be so much more popular.

It is a historical accident. If the majority of popular languages had 
happened to be Lisps, people would still reject "different" syntax.

It is just the disadvantage of being an outsider: people notice your 
peculiarities and pick on you for having them.


Björn
From: Tayssir John Gabbour
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <1126696396.395917.136940@f14g2000cwb.googlegroups.com>
LuisGLopez wrote:
> wildwood wrote:
> > I think one reason people want more syntax is, people want the
> > structure of the language to tell them how to solve their problem.
> [...]
> >
> > People who have no experience with Lisp don't know what a Lisp solution
> > to their problem would look like
>
> If I understood you right, I think that's the point. The more
> restrictions a common person has, the easier for him to develop
> anything. It's like in every art: if your teacher tells you to create a
> piano sonata with this or that restriction, you may feel constrained at
> first, but then you see that you can start sooner. If you leave all
> teachers, and begin to create all alone, you feel the emptiness of the
> white sheet of paper quite... well, I lack the term (please excuse my
> english, by the way).

I think most people find computer languages -- and computers --
intimidating. If a language is considered extremely "powerful", perhaps
they think it must be EXTREMELY intimidating.

(idea thing thing)

Maybe Lisp should be taught not as intimidating, but as coming from a
time when people invented tools so powerful that they did not have to
intimidate.

You have to learn some constraints on the "ideas" (computational
limits, what comes with Common Lisp, what you have to download,
scoping), and the things (objects). But this can be done patiently.

Tayssir
From: Ulrich Hobelmann
Subject: Re: Where does the drive to syntax come from?
Date: 
Message-ID: <3oohtkF6u983U1@individual.net>
wildwood wrote:
> When learning something new, I need to be able to manage the complexity
> somehow.  When learning a programming language, that usually means that
> I look at the discrete building blocks of the language.  For Perl, that

Sure, but that's not all.  The basics of Lisp are easy to learn -- much 
easier than Java's, with its class and method scopes, public keywords, 
static variables ...

> meant getting the hang of the $/@/%/& details, control flow, and I/O.
> For Java, that meant getting a feel for objects and types.
> 
> Looking at Lisp from this perspective, the basic building blocks appear
> to be list/cons/car/cdr, and a metric ton of parentheses.  And, from
> this perspective, the language looks pretty useless.

That's what an undergrad Scheme course does to you.  Most people have 
never seen a Lisp with rich data types in action, therefore they think 
it can't exist.

> I think one reason people want more syntax is, people want the
> structure of the language to tell them how to solve their problem.
> People expect a Java solution to involve OO design, and so they know
> they're on the right track if they're figuring out what objects to use.

And Lisp involves functions with loops and recursion, yes, like almost 
all other languages, too.

> People who have no experience with Lisp don't know what a Lisp solution
> to their problem would look like, and they can't tell what it should
> look like from Lisp's syntax, and that causes cognitive discomfort.  It
> takes people a while to grok that a Lisp solution could look like an
> entirely new, special-purpose language.  All that uncertainty makes
> them crave more restrictions - after all, if there was more restrictive
> syntax, they feel like they could have a better picture of the solution
> in their head.

I think most people simply don't get symbolic interpretation.  Last week 
I picked up "A little Java, a few patterns" and "The little MLer" at the 
library, today "The little Schemer."  In the middle of the book, 
arithmetic expressions are suddenly introduced, after several chapters of 
lisp^Ht juggling.  Not that this was new to me, but it got me thinking 
again.  One major point of Lisp is macros, and the fact that all syntax 
is just syntax trees.  You say "the language looks pretty useless", while 
to a Lisper 
it contains everything there could ever be in a language.  This simply 
/isn't possible/ in Java, because you can't adapt the syntax to your 
needs (or you do incredibly unreadable stuff like in "A little Java...", 
working around the syntax all the time).
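
For instance -- a tiny sketch, with a starred name so it doesn't clash
with the standard macro -- because code is just a list, a macro can build
new code by consing up a list like any other data:

(defmacro unless* (test &body body)
  "Like UNLESS: evaluate BODY only when TEST is false."
  (list 'if test nil (cons 'progn body)))

;; (macroexpand-1 '(unless* (> 1 2) (print "reachable")))
;; => (IF (> 1 2) NIL (PROGN (PRINT "reachable")))

The expansion is an ordinary list that the macro consed together, and the
compiler never knows the difference.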

That is, things like LOOP are possible, or HTML generation that is much 
more concise than all the various template engines that people use with 
Java, or PHP.
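
To make the HTML point concrete, here is a deliberately naive sketch;
real libraries such as CL-WHO do much more, and SEXP->HTML is just a name
made up for this post:

;; Render a nested s-expression such as (ul (li "one") (li "two")) as an
;; HTML string.  No attributes, no escaping -- only enough to show the shape.
(defun sexp->html (form)
  (if (atom form)
      (princ-to-string form)
      (destructuring-bind (tag &rest children) form
        (format nil "<~(~a~)>~{~a~}</~(~a~)>"
                tag (mapcar #'sexp->html children) tag))))

;; (sexp->html '(ul (li "one") (li "two")))
;; => "<ul><li>one</li><li>two</li></ul>"

Because the page is just Lisp data, it can be generated, transformed, and
wrapped in macros like any other list.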

Restrictive syntax, as in concise languages like Perl or PHP, is just a 
premature short-term optimization that cuts you in the long term, when 
the built-in syntax isn't enough and you would need constructs like "new 
Car(new LeftWheel, new RightWheel, new Engine(new Fuel))" or whatever, 
which get unreadable quickly.  As has happened here during the past 
months, people point to one example in, say, OCaml, and say "look, it's 
shorter than Lisp."  Sure it is, because many operators aren't written 
out but are replaced by (built-in) single characters.  At a higher level, 
though, uniform syntax gives you uniform readability, no matter what 
structures you introduce.

-- 
My mouth says the words, my brain is thinking monstertrucks.
Joey (Friends)