From: Tom Lord
Subject: a quick slice through history
Date: 
Message-ID: <vp3sdr4fivpcfa@corp.supernews.com>
A slice through history.....

      Anton:

      > [Sussman and Steele worked from the implementation side,
      >  noticed the coincidence of implementation with actors, 
      >  and later made the connection back to Landin.]

      > Only after the Scheme implementation was done
      > did some of those connections become obvious.

      > Well, that's my theory and I'm sticking to it...  :)

It's a good theory -- and oh my gosh how it relates to the history of
math and computation.  Landin's paper was about a decade earlier than
Scheme and the first-hand accounts suggest the truth of the
delightful, implementation-driven discovery of a convergence with the
Actors model...  Occam suggests your theory is right.

In my view, Scheme was just a loud and unexpected explosion from down
the hall in a certain lab, after which everyone around turned and
asked "what the hell was that?!?!"

Let's look at some other related mid-70s developments, such as those
coming from Milne/Strachey/Scott, but in context.  I think the history
goes roughly like this:


  1) Euclid nails the axiomatic approach to math, at least for one
     specific example.  It's not clear that he is able to form an
     abstraction of the general idea and thus, for hundreds of years,
     the nature of the axiomatic approach is an unasked question.  It
     Just Works: Aristotle is God (or Devil, depending on who you ask).

  2) Leibniz and Newton come up with Calculus.

  3) Various 19th century mathematicians contemplating infinities
     inspired by calculus realize that they don't know what the hell
     they're talking about (even though they consistently get so much
     Right).  There's a crisis where people scramble to find a stable
     "foundation of mathematics".

  4) Hollerith invents a technology of "Population Management" that in
     years to come will have many successful spin-off technologies.
     The key thing is that he's a practical man who builds working
     machines and he seems to be on to something:  he's got some
     insight into what machines can do.  (He wasn't the first of
     course:  Babbage, at least, wins the abstract race.   But
     Hollerith's application of the general idea intersected with the
     need for Power to manage populations and so received greater 
     economic support.)

  5) Goedel (who will later flee a breakdown of civilization)
     famously proves that no consistent foundation of math can settle
     every question (worlds crumble inevitably, it seems) but more
     importantly he introduces a _technique_: mathematically modeling
     math as the formal manipulation of symbols.  Euclid is understood
     in a new light.

  6) Turing tries to help save civilization and, in the process, comes
     up with his universal model of physically realizable (at the
     macro-scale) computational devices.

     Church tries to help save mathematics and comes up with a
     calculus of formal symbol manipulation which he argues can be
     used as a foundation of all math.  We ought to notice that his
     chosen formalization is designed to be workable on pen and paper:
     an example of a macro-scale, physically-realizable computational
     device in action (a tiny worked reduction appears at the end of
     this item).  We also ought to notice that he's taking
     Goedel's clever hack and turning it into a first-class topic of
     discussion.  In response to the proof of the Church-Turing
     equivalence theorem, people really should have said "don't act so
     surprised."

     Zermelo-Fraenkel, gesturing back towards Euclid, come up with
     axiomatic set theory, their own proposed foundation.  Very chic,
     very retro.  Really, stylistically, the way very much math is
     done to this very day.  But the axiom of choice inflames the
     classical v. intuitionist divide.  Even though ZF are pretty far
     from concern about macro-scale, physically realizable
     computations, they still run gawd-smack-dab-into-the-middle of
     what is apparently the same issue as Goedel, Turing, and Church.

     Between the three (ZF, Turing, Church), Goedel is beginning to be
     understood in a new light, especially as relates to computation.
     It's going to take a few decades for people's eyes to fully
     adjust to the new light.

     Computational devices (from gear based to tube based) are cobbled
     together and manage to save civilization (for the moment), buying
     smart people some more time to think.  (The people wanting to
     destroy civilization were cobbling together their own versions.
     They lost.  This is one of many reasons to have some faith in the
     ideas of civilization and employing technology in its service.)
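
     To see what "formal symbol manipulation", workable on pen and
     paper, amounts to, here's a tiny worked example (mine, in Scheme
     notation rather than Church's): computation by textual
     substitution alone, no machine required.

        ;; Beta reduction: substitute the argument for the bound
        ;; variable, then repeat until only arithmetic is left.
        ((lambda (f) (f (f 3))) (lambda (x) (* x x)))
        ;; => ((lambda (x) (* x x)) ((lambda (x) (* x x)) 3))
        ;; => ((lambda (x) (* x x)) (* 3 3))
        ;; => (* (* 3 3) (* 3 3))
        ;; => 81

     Every step is a mechanical rewrite of symbols -- exactly the
     kind of thing a clerk (or a machine) can carry out.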

  7) Computing devices advance.  Sure, WWII gave us frozen TV dinners
     and tract housing and cheap radios and cheap autos and clock
     radios -- but it also gave us computers. Someone invents
     assembler and then autocoder and then "FORmula TRANslator".

  8) "Programming Languages" are the killer app and get huge amounts
      of intellectual attention.  (Late '50s ... late '60s, trailing
      off into the '70s.)

     People think to formalize and systematize programming languages 
     and we get, for example, Algol.   Theories of syntax and
     semantics begin to emerge.   Landin is in this mix.

     The movie "Desk Set" is filmed, leaving us a historical record
     with some clues about how computers were colliding with the
     general economy.   The ghost of Hollerith haunts.

  9) Affordable hardware impacts research labs in industry and at
     universities.   Hardware progress is notably lapping software
     progress, and departments take it for granted that they have to
     cobble together their own computing environments.

     Late '60s/early '70s: Some lucky goofballs in New Jersey invent C
     and unix.  Some lucky goofballs in Cambridge start beefing up
     lisp.  The department of defense, remembering the lessons of
     WWII, manages to get some funding in the direction of Multics, in
     order to get these problems behind us.

 10) The mid-'70s --- this period of time isn't yet fully appreciated
     for what happened then.   Oh my gosh.

     Milne/Strachey/Scott take the pass from Landin and friends and
     run for a touchdown --- taking Church's math to a whole new level
     and using it to model the meaning of programs and programming
     languages.

     Virtually simultaneously, Sussman/Steele, armed not with a huge
     _theoretical_ repertoire but with some mere notation borrowed
     (inaccurately) from Church and about 15 years of "hacker
     culture" (how to implement modulo quickly on a PDP-10 and things
     of that ilk) come up with Scheme.   At that point in history,
     they have a confused soup of vaguely theoretical ideas
     (ultimately stemming from the Algol lines of thought) and lots of
     specific ideas about, for example, the design space of candidates
     for `eval' or a compiler:  and they set out to make something
     minimal and cool.   Maybe (look at the rabbit thesis and the
     emphasis it puts on the reduction to a few primitives -- there's
     a small sketch of that reduction just below) they're at least
     _stylistically_ influenced by the mathematicians that worked on
     finding a solid foundation (see (6)) and _pragmatically_ they're
     influenced by the goal of bumming instructions and making a
     tractable system (see "Hackers: Heroes of the Computer
     Revolution").

     Rapidly, people start making connections.  The theory branch of
     Milne/Strachey/Scott and the practical work of
     Hewitt/Sussman/Steele -- all converge.

     In retrospect, you might say that Turing modeled computing
     machines and Church modeled computation and, back then, they
     figured out "Hey, in theory, these things converge."   Then in
     the mid-70s, the convergence happened.   Nobody could have
     prevented it: it's an essential though emergent property of the
     macro-scale universe.
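
     To make "reduction to a few primitives" concrete, here's a
     minimal sketch -- my illustration, not anything lifted from the
     rabbit thesis itself -- of how, given little more than `lambda'
     and application, other binding forms fall out as mere notation:

        ;; `let' is just sugar for applying a lambda:
        (let ((x 1) (y 2))
          (+ x y))

        ;; ... is the same computation, written primitively:
        ((lambda (x y) (+ x y))
         1 2)

     Both evaluate to 3; `let' adds nothing that `lambda' plus
     application can't already say.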


Space aliens program in a Scheme-like language.  I'm about as sure of
that as one could possibly be about a speculation concerning space
aliens.  It's roughly in the area of Scheme (and, yes, CL is roughly
in that area) that math and the emergent computational nature of
macro-scale physics converge.  Locality + determinism + computation =
Scheme.

-t

From: Gareth McCaughan
Subject: Re: a quick slice through history
Date: 
Message-ID: <87smlpsl65.fsf@g.mccaughan.ntlworld.com>
Tom Lord wrote:

>   6) Turing tries to help save civilization and, in the process, comes
>      up with his universal model of physically realizable (at the
>      macro-scale) computational devices.

Turing machines were introduced by a paper Turing wrote
some time before he had anything to do with WW2 crypto;
do you mean something else by "tries to help save
civilization"?

-- 
Gareth McCaughan
.sig under construc

From: Tom Lord
Subject: Re: a quick slice through history
Date: 
Message-ID: <vp64gr5lfcuda4@corp.supernews.com>
  Me:

  >> 6) Turing tries to help save civilization and, in the process,
  >>   comes up with his universal model of physically realizable (at
  >>   the macro-scale) computational devices.

  Gareth:

  > Turing machines were introduced by a paper Turing wrote some time
  > before he had anything to do with WW2 crypto; 

D'oh! 

  > do you mean something else by "tries to help save civilization"?

Nah.  I fixed my copy to say:


  6) Turing tries to come up with his universal model of physically
     realizable (at the macro-scale) computational devices.  (Later
     he goes on to try to help save civilization.)

Though, I dunno... "trying to help save civilization" isn't obviously
a _wrong_ explanation of his earlier work:  just not the one I
intended.

You know, another nit might be the word "tries".  Most people wouldn't
say that "Turing tries to come up with" but, instead, "Turing came up
with".

Is my perception correct here that there's a missing proof?  That
classical and relativistic mechanics permit only macro-scale,
informationally-connected devices that aren't super-Turing oracles?

We know that the lambda calculus and Turing machines are
computationally equivalent, and we think that's as good as it gets
because nobody can think of anything better (other than via quantum
computing) -- but is there a physics proof that says you can't do any
better than a Turing machine (in the classical or relativistic
universe)?

(By "informationally connected" I mean things we can get information
out of.  So, for example, that an oracle could exist beyond an event
horizon doesn't count.)

-t