From: Erann Gat
Subject: Computability in principle
Date: 
Message-ID: <gat-0411031328270001@k-137-79-50-101.jpl.nasa.gov>
The "Python from Wise Guy's Viewpoint" thread has had an extraordinarily
good run, but the sub-thread on formalisms has gotten far enough removed
from the subject line that I've decided to start a new thread and broach a
new but related topic.

Many formalist arguments rely on exhaustive enumeration of programs. 
Prunesquallor's "universal algorithm" is one such example.  This algorithm
exhaustively enumerates all programs and mechanically checks to see if
they satisfy a formal specification.  The argument has centered on the
question of whether checking against a specification is really feasible on
theoretical grounds.  My position has been that it is not: as a
consequence of the Goedel/Turing theorem(s) there are formal
specifications (like halting) that are not computable, and that therefore
any language provided for writing specifications must be either
uncomputable or incomplete.  I would have thought this to be a completely
uncontroversial point, but on usenet you never know.
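
For concreteness, such a universal algorithm has roughly the following
shape (a Lisp sketch only; SATISFIES-SPEC-P is a hypothetical checker, and
candidate programs are simply bit strings encoded as integers):

  (defun universal-search (satisfies-spec-p max-bits)
    ;; Enumerate every bit string of up to MAX-BITS bits and return the
    ;; first one that the (hypothetical) checker accepts.
    (loop for size from 1 to max-bits
          do (loop for candidate from 0 below (expt 2 size)
                   when (funcall satisfies-spec-p candidate size)
                     do (return-from universal-search
                          (values candidate size)))))

Whether a complete, computable SATISFIES-SPEC-P can exist at all is
exactly the point in dispute above.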

I wish here to broach a completely different argument against the
formalist proposition based on what can be computed in principle in our
physical universe.  The argument is very simple: if you turned the entire
universe into a computer and figured out how high it could count, the
answer turns out to be pretty small, only about 500 bits.  Here's the math:

There are about 10^80 particles in the known Universe.
The universe is about 10^18 seconds old.
The Planck time, the time that it takes light to travel the diameter of a
proton, believed to be the smallest possible time scale on which physical
processes can occur, is 10^-43 seconds.

So if you turned the entire universe into a computer and had it do nothing
but count, it could enumerate about 10^141 = 2^468 states.
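
As a quick sanity check, here is that arithmetic in a few lines of Lisp
(the constants are just the rough figures above):

  (let* ((particles   1d80)     ; ~10^80 particles
         (age-seconds 1d18)     ; ~10^18 seconds
         (planck-time 1d-43)    ; ~10^-43 seconds
         (states (* particles (/ age-seconds planck-time)))
         (bits   (/ (log states) (log 2d0))))
    (format t "about 2^~d states, i.e. a ~d-byte counter~%"
            (round bits) (round bits 8)))
  ;; prints: about 2^468 states, i.e. a 59-byte counter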

So it is impossible, even in principle given the physical constraints of
our universe, to enumerate programs larger than about sixty bytes, which
is less than a single line of text on a standard 80-character-wide
display.  The real practical constraint is, of course, much smaller than
this, probably by an order of magnitude, but what's a few bytes among
friends?

All programs of practical interest are vastly larger than 60 bytes.  I
submit therefore that any formalist program (I use the word here in the
sense of "Hilbert's program", not a computer program) based on exhaustive
enumeration of programs (in the computer-program sense this time) has no
hope of having any practical utility, and is therefore not worthy of
serious consideration.

(Following Wadler's argument that programs and proofs amount to the same
thing, the same argument applies to anything based on exhaustive
enumeration of proofs as well.)

E.

From: Joachim Durchholz
Subject: Re: Computability in principle
Date: 
Message-ID: <bo9auq$vm3$1@news.oberberg.net>
I agree that enumeration isn't a practical approach for most problems.

There's one class of exceptions where enumeration is still an option: 
namely programs with a rather small input, say, in the range of 10-20 bits.
The exact limit depends on the resources needed for checking each case, 
and the available processing power. The distributed processing power 
used to demonstrate the "crackability" of DES enumerated a search space 
of 2^56 values, so this provides us with a useful upper limit on how far 
enumeration can take us.

To state the point more clearly: for some N << 56, enumerating all
programs up to N bits in size isn't a promising approach since most
programs have an entropy of more than N bits, but enumerating all inputs 
up to N bits to prove a property of a given program might very well be 
useful.
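
For illustration, here is a minimal Lisp sketch of that second case; the
property being checked and the 16-bit limit are just placeholders:

  ;; Check a property of a given function over every input of N bits.
  (defun holds-for-all-inputs-p (property n-bits)
    (loop for x from 0 below (expt 2 n-bits)
          always (funcall property x)))

  ;; Example: confirm that LOGCOUNT agrees with a naive bit-by-bit count
  ;; on all 2^16 inputs.
  (holds-for-all-inputs-p
   (lambda (x) (= (logcount x)
                  (loop for i below 16 count (logbitp i x))))
   16)
  ;; => T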

Hope this is uncontroversial...

Regards,
Jo
From: Markus Mottl
Subject: Re: Computability in principle
Date: 
Message-ID: <bo9b9i$sse$1@bird.wu-wien.ac.at>
In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> So if you turned the entire universe into a computer and had it do nothing
> but count it could enumerate about 10^141 = 2^468 states.
[snip]
> All programs of practical interest are vastly larger than 60 bytes.  I
> submit therefore that any formalist program (I use the word here in the
> sense of "Hilbert's program", not a computer program) based on exhaustive
> enumeration of programs (in the computer-program sense this time) has no
> hope of having any practical utility, and is therefore not worthy of
> serious consideration.

Well, but you have forgotten one thing: our universe is not a classical
computer (maybe for a good reason? ;). By using quantum superpositions
you can vastly increase its capacity and represent all programs of a
given size simultaneously in the corresponding number of qbits.

Regards,
Markus Mottl

-- 
Markus Mottl          http://www.oefai.at/~markus          ······@oefai.at
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0411031529020001@k-137-79-50-101.jpl.nasa.gov>
In article <············@bird.wu-wien.ac.at>, "Markus Mottl"
<······@oefai.at> wrote:

> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> > So if you turned the entire universe into a computer and had it do nothing
> > but count it could enumerate about 10^141 = 2^468 states.
> [snip]
> > All programs of practical interest are vastly larger than 60 bytes.  I
> > submit therefore that any formalist program (I use the word here in the
> > sense of "Hilbert's program", not a computer program) based on exhaustive
> > enumeration of programs (in the computer-program sense this time) has no
> > hope of having any practical utility, and is therefore not worthy of
> > serious consideration.
> 
> Well, but you have forgotten one thing: our universe is not a classical
> computer (maybe for a good reason? ;). By using quantum superpositions
> you can vastly increase its capacity and represent all programs of a
> given size simultaneously in the corresponding number of qbits.

Quite true, and it's possible that advances in quantum computing will some
day invalidate my argument.  But that day is not yet here.

E.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fa88ba6$1@news.unimelb.edu.au>
···@jpl.nasa.gov (Erann Gat) writes:

>"Markus Mottl" <······@oefai.at> wrote:
>
>> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
>> > So if you turned the entire universe into a computer and had it do nothing
>> > but count it could enumerate about 10^141 = 2^468 states.
>> [snip]
>> > All programs of practical interest are vastly larger than 60 bytes.  I
>> > submit therefore that any formalist program (I use the word here in the
>> > sense of "Hilbert's program", not a computer program) based on exhaustive
>> > enumeration of programs (in the computer-program sense this time) has no
>> > hope of having any practical utility, and is therefore not worthy of
>> > serious consideration.
>> 
>> Well, but you have forgotten one thing: our universe is not a classical
>> computer (maybe for a good reason? ;). By using quantum superpositions
>> you can vastly increase its capacity and represent all programs of a
>> given size simultaneously in the corresponding number of qbits.
>
>Quite true, and it's possible that advances in quantum computing will some
>day invalidate my argument.  But that day is not yet here.

Your argument was about computability _in principle_ (see subject line),
not computability today.  Therefore your argument is already invalidated.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0411032211220001@192.168.1.51>
In article <··········@news.unimelb.edu.au>, Fergus Henderson
<···@cs.mu.oz.au> wrote:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> >"Markus Mottl" <······@oefai.at> wrote:
> >
> >> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> >> > So if you turned the entire universe into a computer and had it do nothing
> >> > but count it could enumerate about 10^141 = 2^468 states.
> >> [snip]
> >> > All programs of practical interest are vastly larger than 60 bytes.  I
> >> > submit therefore that any formalist program (I use the word here in the
> >> > sense of "Hilbert's program", not a computer program) based on exhaustive
> >> > enumeration of programs (in the computer-program sense this time) has no
> >> > hope of having any practical utility, and is therefore not worthy of
> >> > serious consideration.
> >> 
> >> Well, but you have forgotten one thing: our universe is not a classical
> >> computer (maybe for a good reason? ;). By using quantum superpositions
> >> you can vastly increase its capacity and represent all programs of a
> >> given size simultaneously in the corresponding number of qbits.
> >
> >Quite true, and it's possible that advances in quantum computing will some
> >day invalidate my argument.  But that day is not yet here.
> 
> Your argument was about computability _in principle_ (see subject line),
> not computability today. Therefore your argument is already invalided.

Not necessarily.

It is not yet clear whether quantum computation can really be made to
work.  It may be that the practical difficulty of keeping N qbits mutually
entangled grows exponentially with N, in which case the same argument may
well apply.  We just don't know yet.  Of course, if someone does manage to
build a quantum computer with a few hundred qbits then my argument will be
the last thing on anyone's mind, including mine.

E.
From: Markus Mottl
Subject: Re: Computability in principle
Date: 
Message-ID: <boanre$vci$1@bird.wu-wien.ac.at>
In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> It is not yet clear whether quantum computation can really be made to
> work.  It may be that the practical difficulty of keeping N qbits mutually
> entangled grows exponentially with N, in which case the same argument may
> well apply.

Indeed, I also fear that your argument might turn out to be true in
practice.  But well, as long as there is no evidence against large-scale
quantum computers, I'll retain some healthy optimism :-)

Regards,
Markus Mottl

-- 
Markus Mottl          http://www.oefai.at/~markus          ······@oefai.at
From: Markus Mottl
Subject: Re: Computability in principle
Date: 
Message-ID: <boap05$vci$3@bird.wu-wien.ac.at>
In comp.lang.functional Markus Mottl <······@oefai.at> wrote:
> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
>> It is not yet clear whether quantum computation can really be made to
>> work.  It may be that the practical difficulty of keeping N qbits mutually
>> entangled grows exponentially with N, in which case the same argument may
>> well apply.

> Indeed, I also fear that your argument might turn out to be true in
> practice.  But well, as long as there is no evidence against large-scale
> quantum computers, I'll retain some healthy optimism :-)

And of course: as Jerzy has pointed out, this is about maintaining
coherence, not entanglement. I supposed that this was what Erann meant.
Let's not get too fascist about terminology: except for Jerzy there are
probably hardly any hard-core physicists in here. Just because somebody
misuses a term from another branch of science doesn't mean that this
thread is meaningless. At least as long as we stick more closely to
computer science than to physics...

Regards,
Markus Mottl

-- 
Markus Mottl          http://www.oefai.at/~markus          ······@oefai.at
From: Anton van Straaten
Subject: Re: Computability in principle
Date: 
Message-ID: <3t7qb.10147$9M3.9813@newsread2.news.atl.earthlink.net>
Markus Mottl wrote:
> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> > It is not yet clear whether quantum computation can really be made to
> > work.  It may be that the practical difficulty of keeping N qbits mutually
> > entangled grows exponentially with N, in which case the same argument may
> > well apply.
>
> Indeed, I also fear that your argument might turn out to be true in
> practice.  But well, as long as there is no evidence against large-scale
> quantum computers, I'll retain some healthy optimism :-)

My suspicion is that quantum computers will indeed turn out to be
miraculous, with one minor drawback: when they produce a correct answer, it
will not necessarily appear in the same universe in which the question was
asked...

Anton
From: Aaron Denney
Subject: Re: Computability in principle
Date: 
Message-ID: <slrnbqj19u.d6c.wnoise@ofb.net>
["Followup-To:" header set to comp.lang.functional.]
On 2003-11-05, Anton van Straaten <·····@appsolutions.com> wrote:
> Markus Mottl wrote:
>> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
>> > It is not yet clear whether quantum computation can really be made to
>> > work.  It may be that the practical difficulty of keeping N qbits mutually
>> > entangled grows exponentially with N, in which case the same argument may
>> > well apply.
>>
>> Indeed, I also fear that your argument might turn out to be true in
>> practice.  But well, as long as there is no evidence against large-scale
>> quantum computers, I'll retain some healthy optimism :-)
>
> My suspicion is that quantum computers will indeed turn out to be
> miraculous, with one minor drawback: when they produce a correct answer, it
> will not necessarily appear in the same universe in which the question was
> asked...

Thinking of quantum computers as "harnessing the powers of multiple
universes" is not a fruitful way to think of their speed-ups over
classical computers.

-- 
Aaron Denney
-><-
From: Anton van Straaten
Subject: Re: Computability in principle
Date: 
Message-ID: <tPhqb.11872$Oo4.4010@newsread1.news.atl.earthlink.net>
Aaron Denney wrote:
> On 2003-11-05, Anton van Straaten <·····@appsolutions.com> wrote:
> > Markus Mottl wrote:
> >> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> >> > It is not yet clear whether quantum computation can really be made to
> >> > work.  It may be that the practical difficulty of keeping N qbits mutually
> >> > entangled grows exponentially with N, in which case the same argument may
> >> > well apply.
> >>
> >> Indeed, I also fear that your argument might turn out to be true in
> >> practice.  But well, as long as there is no evidence against large-scale
> >> quantum computers, I'll retain some healthy optimism :-)
> >
> > My suspicion is that quantum computers will indeed turn out to be
> > miraculous, with one minor drawback: when they produce a correct answer, it
> > will not necessarily appear in the same universe in which the question was
> > asked...
>
> Thinking of quantum computers as "harnessing the powers of multiple
> universes" is not a fruitful way to think of their speed-ups over
> classical computers.

I see that on some level, you understood my point.

Anton
From: Matthew Danish
Subject: Re: Computability in principle
Date: 
Message-ID: <20031106165352.GA20109@mapcar.org>
On Wed, Nov 05, 2003 at 02:05:51PM +0000, Anton van Straaten wrote:
> My suspicion is that quantum computers will indeed turn out to be
> miraculous, with one minor drawback: when they produce a correct answer, it
> will not necessarily appear in the same universe in which the question was
> asked...

In fact, as soon as the Answer is found, the Universe will instantly be
replaced with one that is far more complicated...

-- 
; Matthew Danish <·······@andrew.cmu.edu>
; OpenPGP public key: C24B6010 on keyring.debian.org
; Signed or encrypted mail welcome.
; "There is no dark side of the moon really; matter of fact, it's all dark."
From: Pascal Bourguignon
Subject: Re: Computability in principle
Date: 
Message-ID: <87d6c7pwdg.fsf@thalassa.informatimago.com>
···@jpl.nasa.gov (Erann Gat) writes:

> The "Python from Wise Guy's Viewpoint" thread has had an extraordinarily
> good run, but the sub-thread on formalisms has gotten far enough removed
> from the subject line that I've decided to start a new thread and broach a
> new but related topic.
> 
> Many formalist arguments rely on exhaustive enumeration of programs. 
> Prunesquallors "universal algorithm" is one such example.  This algorithm
> exhaustively enumerates all programs and mechanically checks to see if
> they satisfy a formal specification.  The argument has centered on the
> quesiton of whether checking against a specification is really feasable on
> theoretical grounds.  My position has been that it is not, that a
> consequence of the Goedel/Turing theorem(s) there are formal
> specifications (like halting) that are not computable, and that therefore
> any language provided for writing specifications must be either
> uncomputable or incomplete.  I would have thought this to be a completely
> uncontroversial point, but on usenet you never know.
> 
> I wish here to broach a completely different argument against the
> formalist proposition based on what can be computed in principle in our
> physical universe.  The argument is very simple: if you turned the entire
> universe into a computer and figured out how high it can count the answer
> turns out to be pretty small, only about 500 bits.  Here's the math:
> 
> There are about 10^80 particles in the known Universe.
> The universe is about 10^18 seconds old.
> The Planck time, the time that it takes light to travel the diameter of a
> proton, believed to be the smallest possible time scale on which physical
> processes can occur, is 10^-43 seconds.
> 
> So if you turned the entire universe into a computer and had it do nothing
> but count it could enumerate about 10^141 = 2^468 states.
> 
> So it is impossible even in principle given the physical constraints of
> our universe to enumerate programs larger than about sixty bytes, or
> shorter than a single line of text on a standard 80-character wide
> display.  The real practical constraint is, of course, much smaller than
> this, probably by an order of magnitude, but what's a few bytes among
> friends?
> 
> All programs of practical interest are vastly larger than 60 bytes.  I
> submit therefore that any formalist program (I use the word here in the
> sense of "Hilbert's program", not a computer program) based on exhaustive
> enumeration of programs (in the computer-program sense this time) has no
> hope of having any practical utility, and is therefore not worthy of
> serious consideration.
> 
> (Following Wadler's argument that programs and proofs amount to the same
> thing, the same argument applies to anything based on exhaustive
> enumeration of proofs as well.)
> 
> E.

A very good point. Thank you very much.

But I  would like to point  to genetic algorithms.  They  seem to give
some  good results.  But  perhaps it's  only because  they use  a very
little number of bits as genes.

So,  what if  we  have a  better  search method  than enumerating  all
possible programs?

-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0411031532510001@k-137-79-50-101.jpl.nasa.gov>
In article <··············@thalassa.informatimago.com>, Pascal Bourguignon
<····@thalassa.informatimago.com> wrote:

> But I  would like to point  to genetic algorithms.  They  seem to give
> some  good results.  But  perhaps it's  only because  they use  a very
> little number of bits as genes.

Also because no one (as far as I know) claims that genetic algorithms are
"correct" in anything but a statistical sense.  There is a whole class of
PAC algorithms (probably approximately correct) of which (some) genetic
algorithms are an example that are not subject to this argument.

E.
From: Jens Axel Søgaard
Subject: Re: Computability in principle
Date: 
Message-ID: <3fa846f7$0$69958$edfadb0f@dread12.news.tele.dk>
Erann Gat wrote:
> Pascal Bourguignon wrote:

>>But I  would like to point  to genetic algorithms.  They  seem to give
>>some  good results.  But  perhaps it's  only because  they use  a very
>>little number of bits as genes.

> Also because no one (as far as I know) claims that genetic algorithms are
> "correct" in anything but a statistical sense.  There is a whole class of
> PAC algorithms (probably approximately correct) of which (some) genetic
> algorithms are an example that are not subject to this argument.

This reminds me of the "No Free Lunch" theorems. They more or less say that
all search algorithms perform the same on average. A somewhat depressing
result. For a better explanation than I can give, see the first page of:

<http://citeseer.nj.nec.com/context/1937361/0>

-- 
Jens Axel Søgaard
From: Markus Mottl
Subject: Re: Computability in principle
Date: 
Message-ID: <boaoeu$vci$2@bird.wu-wien.ac.at>
In comp.lang.functional Jens Axel Søgaard <······@jasoegaard.dk> wrote:
> This reminds me of the "No Free Lunch"-theorems. They more or less say that
> all search algorithms perform the same on the average. A somewaht depressing
> result. For a better explanation than I can give, see the first page of:

Yes, if you consider all possible problems. No, if our universe is guided
by certain laws that pop up in various contexts.  In this case some
search algorithms with an appropriate bias will perform better. Though,
I sometimes have the feeling that our universe behaves completely at
random... ;-)

It is funny that there are actually optimal universal search
algorithms! One can prove that, if there is a common bias one can exploit
for a class of problems, these optimal universal algorithms will
at worst require only a constant amount of time more than the best
non-universal (i.e. specialized) one.

You may want to look at the work of Juergen Schmidhuber et al. (e.g. the
Optimally Ordered Problem Solver):

  http://www.idsia.ch/~juergen

Best regards,
Markus Mottl

-- 
Markus Mottl          http://www.oefai.at/~markus          ······@oefai.at
From: Sampo Smolander
Subject: genetic algorithms
Date: 
Message-ID: <boignv$5i2$1@oravannahka.helsinki.fi>
In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> In article <··············@thalassa.informatimago.com>, Pascal Bourguignon
> <····@thalassa.informatimago.com> wrote:
>> But I  would like to point  to genetic algorithms.  They  seem to give
>> some  good results.  But  perhaps it's  only because  they use  a very
>> little number of bits as genes.

> Also because no one (as far as I know) claims that genetic algorithms are
> "correct" in anything but a statistical sense.

Genetic algorithms work because many real-world problems are approximately
separable. By separability I mean that, for example, if we are searching for a
maximum of a function of 10 arguments, f(x_1,...,x_10), then f is
approximately separable as f = g(x_1,...,x_n)*h(x_(n+1),...,x_10), at least
for some n: 1 < n < 10.

Now, if we have a good candidate solution v = (v_1,...,v_10) and another
good candidate w = (w_1,...,w_10), then *because of separability* there is
a good chance that a certain crossover of them, say
(v_1,...,v_4,w_5,...,w_10), is as good a solution or even a better one.
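
As a tiny illustration, a single-point crossover in Lisp (the vector
representation and the cut point are arbitrary):

  ;; Take the first POINT genes from V and the rest from W.
  (defun crossover (v w point)
    (concatenate 'vector (subseq v 0 point) (subseq w point)))

  ;; If f is approximately separable around POINT, the child has a good
  ;; chance of being at least as good as its parents.
  (crossover #(1 2 3 4 5 6 7 8 9 10)
             #(10 9 8 7 6 5 4 3 2 1)
             4)
  ;; => #(1 2 3 4 6 5 4 3 2 1)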

One can probably think of problems that are not even remotely
separable, and for such problems I guess a genetic algorithm
would not perform any better than a simple random search.
From: PKY
Subject: Re: genetic algorithms
Date: 
Message-ID: <9eb48ce9.0311080645.256e99e8@posting.google.com>
Sampo Smolander <····················@helsinki.fi> wrote in message news:<············@oravannahka.helsinki.fi>...
> In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> > In article <··············@thalassa.informatimago.com>, Pascal Bourguignon
> > <····@thalassa.informatimago.com> wrote:
> >> But I  would like to point  to genetic algorithms.  They  seem to give
> >> some  good results.  But  perhaps it's  only because  they use  a very
> >> little number of bits as genes.
>  
> > Also because no one (as far as I know) claims that genetic algorithms are
> > "correct" in anything but a statistical sense.
> 
> Genetic algorithms work because many real-world problems are approximately
> separable. By separability I mean that, for example, if we are searching a
> maximum of a funtion of 10 arguments: f(x_1,...,x_10), then f is
> approximately separable as f = g(x_1,...x_n)*h(x_n+1,...,x_10), at least
> for some n: 1 < n < 10.

Genetic algorithms are quite good in many cases:
Molecule Docking, Molecule superpositioning, space allocation in
cargos, Stock forecasting (see www.hybrid.fi/sp500 for last year's
results if you don't believe it) and many, many other applications...

Also, one special brand of genetic algorithms, namely genetic
programming, is a very good playground for Lisp hackers.

Currently I am trying to develop (in my free time) a Lisp-based
system for trying to find out genetic diseases from gene alleles
using genetic programming.  Basically the program tries to work out
function set(s) that give a good forecast of ill/healthy from a large
number of candidates & a large number of gene alleles.

A very nice way of using GA & Lisp.  I would like to see someone code the
same functionality in C++ ;-)

	PKY
From: Scott McIntire
Subject: Re: genetic algorithms
Date: 
Message-ID: <VgFrb.155930$HS4.1291790@attbi_s01>
"PKY" <················@ailabsolutions.com> wrote in message
·································@posting.google.com...
> Sampo Smolander <····················@helsinki.fi> wrote in message
news:<············@oravannahka.helsinki.fi>...
> > In comp.lang.functional Erann Gat <···@jpl.nasa.gov> wrote:
> > > In article <··············@thalassa.informatimago.com>, Pascal
Bourguignon
> > > <····@thalassa.informatimago.com> wrote:
> > >> But I  would like to point  to genetic algorithms.  They  seem to give
> > >> some  good results.  But  perhaps it's  only because  they use  a very
> > >> little number of bits as genes.
> >
> > > Also because no one (as far as I know) claims that genetic algorithms are
> > > "correct" in anything but a statistical sense.
> >
> > Genetic algorithms work because many real-world problems are approximately
> > separable. By separability I mean that, for example, if we are searching a
> > maximum of a funtion of 10 arguments: f(x_1,...,x_10), then f is
> > approximately separable as f = g(x_1,...x_n)*h(x_n+1,...,x_10), at least
> > for some n: 1 < n < 10.
>
> Genetic algorithms are quite good in many cases:
> Molecule Docking, Molecule superpositioning, space allocation in
> cargos, Stock forecasting (www.hybrid.fi/sp500 for someone not to
> believe for last year) and many many other ways...
>
> Also one special brand of genetic algoritms, namely genetic
> programming is very good playground for lisp hackers.
>
> Currently I am trying to develop (on my free time) a lisp - based
> system for trying to find out genetic diseases from gene - alleles
> using genetic programming. Basically the programs tries to out
> function set(s) for good forecast of ill/healthy from large number of
> candidates & large number for gene alleles..
>

I have some code at my sourceforge project, com-lisp-utils (
https://sourceforge.net/projects/com-lisp-utils/ ), that has a genetic
algorithm (directory rsm/ga) as well as a genetic programming package
(directory rsm/gp) that you might find useful.

-- R. Scott McIntire


> Very nice way of using GA & lisp. I would like to see someone code the
> same functionality in c++ ;-)
>
> PKY
From: Sampo Smolander
Subject: Re: genetic algorithms
Date: 
Message-ID: <boo65i$gcv$1@oravannahka.helsinki.fi>
In comp.lang.functional PKY <················@ailabsolutions.com> wrote:
> Sampo Smolander <····················@helsinki.fi> wrote in message news:<············@oravannahka.helsinki.fi>...
>> Genetic algorithms work because many real-world problems are
>> approximately separable.

> Molecule Docking,

The location coordinates are approximately separable from the
rotation coordinates.

> Molecule superpositioning,

The correlation usually decreases as we move along the molecule
backbone; very separable.

> space allocation in cargos,

How we pack one corner of the box is _usually_ separable
from how we pack the other corners.

> Stock forecasting

I'd guess short and long time trend are approximately separable
from each other.
From: PKY
Subject: Re: genetic algorithms
Date: 
Message-ID: <9eb48ce9.0311102200.152a4e3a@posting.google.com>
Sampo Smolander <····················@helsinki.fi> wrote in message news:<············@oravannahka.helsinki.fi>...
> In comp.lang.functional PKY <················@ailabsolutions.com> wrote:
> > Sampo Smolander <····················@helsinki.fi> wrote in message news:<············@oravannahka.helsinki.fi>...
> >> Genetic algorithms work because many real-world problems are
> >> approximately separable.
>  
> > Molecule Docking,
> 
> The location coordinates are approximately separable from the
> rotation coordinates.

Yep, not a good idea to crossover rotation angles with translation
coordinates.
Also crossover between different search spaces is not so bright :-)

> 
> > Molecule superpositioning,
> 
> The correlation usually decreases as we move along the molecule
> backbone; very separable.

Well, the most important factor is in any case the
rotation&translation matrix +  shape recognition of the objects in
question.

> 
> > space allocation in cargos,
> 
> How we pack one corner of the box is _usually_ separable
> from how we pack the other corners.

Well, as a matter of fact, this one is maybe the easiest to do of
these examples.

Only a few parameters have to be taken into account (e.g. fragility,
shape, volume, priority (aka money))

> 
> > Stock forecasting
> 
> I'd guess short and long time trend are approximately separable
> from each other.

Well, in the implementation that I know of, the learning period on a
cluster of 5 Pentium machines is 3 months.
So it does a lot of learning on the way ;-)


One thing I have found in common in all of the Lisp
implementations of genetic programming so far (Koza's, & the other one
that was posted in this thread) is that they don't take into account
return types from functions :-)
A very typical way to go for a Lisp-oriented mind, maybe...


   PKY
From: Ray Blaak
Subject: Re: Computability in principle
Date: 
Message-ID: <ufzh3paof.fsf@STRIPCAPStelus.net>
···@jpl.nasa.gov (Erann Gat) writes:
> There are about 10^80 particles in the known Universe.
> The universe is about 10^18 seconds old.
> The Planck time, the time that it takes light to travel the diameter of a
> proton, believed to be the smallest possible time scale on which physical
> processes can occur, is 10^-43 seconds.
> 
> So if you turned the entire universe into a computer and had it do nothing
> but count it could enumerate about 10^141 = 2^468 states.

Well, the universe is not dead yet. If you are *really* willing to wait you
could count as much as you need to.

Of course, it would be much quicker to simply do what we actually do:
make good guesses about our programs, and by hook or by crook and a surprising
lack of lawsuits get them limping along.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: Jerzy Karczmarczuk
Subject: Re: Computability in principle
Date: 
Message-ID: <3FA8AD88.2050008@info.unicaen.fr>
Ray Blaak wrote:
> ···@jpl.nasa.gov (Erann Gat) writes:
> 
>>There are about 10^80 particles in the known Universe.
>>The universe is about 10^18 seconds old.
>>The Planck time, the time that it takes light to travel the diameter of a
>>proton, believed to be the smallest possible time scale on which physical
>>processes can occur, is 10^-43 seconds.
>>
>>So if you turned the entire universe into a computer and had it do nothing
>>but count it could enumerate about 10^141 = 2^468 states.
> 
> 
> Well, the universe is not dead yet. If you are *really* willing to wait you
> could count as much as you need to.

And, anyway, all this is *rubbish*. Perhaps Erann Gat should read something on
cosmology. The "known universe" is not the full universe; the universe is -- in
the belief of almost all astrophysicists and "fundamental" physicists -- OPEN.
Infinite, if you wish. Our "Hubble sector" might be finite *for local
observations*, but it doesn't matter.

The time to get through a proton has *nothing* to do with the Planck length (and
the related Planck time = length/c). The Planck length is constructed with the
aid of the gravitational constant, which has nothing to do with protons.

Then, "smallest possible scale on which physical processes can occur" being the
Planck length is another untruth. Certainly awfully speculative at the level
of high-school and science popularizers. NOT for string theorists.

===

Fergus noted that the world is quantum, and Erann Gat summarized that he will
wait until quantum computers appear.  This is methodologically a position
impossible to defend.  The "computability in principle" deals with the number of
quantum states independently of any "realistic" model of quantum devices; the
Turing machine "in general" does not exist either...  And perhaps you should
be aware that we don't really know how to count quantum states in the universe.
In a curved, potentially infinite space, the spectra may be whatever, so all
this counting has a very, very weak sense.

EG:
> It is not yet clear whether quantum computation can really be made to
> work.  It may be that the practical difficulty of keeping N qbits mutually
> entangled grows exponentially with N, in which case the same argument may
> well apply.  We just don't know yet.  

First, the problem is *NOT* with "keeping the qubits entangled". They entangle
themselves quite fast, and they get entangled with the environment, which is
called "decoherence". It is the other way round: it is the keeping of coherence
that is delicate. If Erann Gat really wants to use savant words, perhaps he
should do it after having verified their meaning. Of course, there are
several known bounds on the efficiency of quantum information transfer etc.,
but the story is much more complicated and ambiguous than sketched above;
moreover, all these theorems, bounds, etc., are weakly related (read: not
related at all...) to computability in principle.

(But, besides, it is KNOWN that quantum computation works in toy cases.)


I suggest moving this thread to some .alt. limbo independently of its ambitions,
since it is not based on any qualified knowledge.


Jerzy Karczmarczuk
Caen, France
From: thomas
Subject: Re: Computability in principle
Date: 
Message-ID: <3c5586ca.0311050305.1a1597a7@posting.google.com>
Jerzy Karczmarczuk <·······@info.unicaen.fr> wrote in message news:<················@info.unicaen.fr>...
> And, anyway all this is *rubbish*. Perhaps Erann Gatt should read something on
> cosmology. The "known universe" is not the full universe, the universe is -- in
> belief of almost all astrophysicists and "fundamental" physicists -- OPEN.
> Infinite, if you wish. Our "Hubble sector" might be finite *for local obser-
> vations*, but it doesn't matter.
> 

so how do you intend to collect the results of computations performed
outside the observable universe?

thomas
From: Jerzy Karczmarczuk
Subject: Re: Computability in principle
Date: 
Message-ID: <3FA8E33D.1080409@info.unicaen.fr>
thomas wrote:
> Jerzy Karczmarczuk  wrote ...
> 
>>...   the universe is -- in
>>belief of almost all astrophysicists and "fundamental" physicists -- OPEN.
>>Infinite, if you wish. Our "Hubble sector" might be finite *for local obser-
>>vations*, but it doesn't matter.
>>
> 

> so how do you intend to collect the results of computations performed
> outside the observable universe?


Please, don't ask ME.
For me, personally, even the idea of collecting the results of computations
performed inside, is rather dubious. First, you may collect anyway things
from your past light cone, not from the Universe as such. So, the suggested
combinatorics of states by EG. was - apparently - faulty, independently of
other numeric inadequacies.

But in general I think that all such speculations, together with some others,
such as, say, "is the human brain a computer", have no operational meaning.  So,
please forgive me: I can pinpoint some mistakes in such reasoning, but
I won't engage myself actively. Much wiser people than myself gave up, but
too many maniacs remain in this business.


Jerzy Karczmarczuk


PS. This "dodecahedral" (or football) shaped Universe story is quite funny;
     recently there was a discussion of it on sci.phys.research. What one should
     understand, though, is that this crazy hypothesis concerns the compactness
     of the topology, not the "3D bounds". Within this vision, some flatbugs
     living on the surface of a torus would - possibly - claim: "our universe
     has the shape of a SQUARE" (bi-periodically closed, of course).

     Everybody realizes how terribly important this issue is for both functional
     programming and Lisp...
From: Marco Antoniotti
Subject: Re: Computability in principle
Date: 
Message-ID: <mrbqb.133$KR3.46985@typhoon.nyu.edu>
thomas wrote:
> Jerzy Karczmarczuk <·······@info.unicaen.fr> wrote in message news:<················@info.unicaen.fr>...
> 
>>And, anyway all this is *rubbish*. Perhaps Erann Gatt should read something on
>>cosmology. The "known universe" is not the full universe, the universe is -- in
>>belief of almost all astrophysicists and "fundamental" physicists -- OPEN.
>>Infinite, if you wish. Our "Hubble sector" might be finite *for local obser-
>>vations*, but it doesn't matter.
>>
> 
> 
> so how do you intend to collect the results of computations performed
> outside the observable universe?

Oh, come on....

	(loop for x = (external-universe-computation)
	   collect x)

Cheers
--
marco
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0511030817210001@192.168.1.51>
In article <················@info.unicaen.fr>, Jerzy Karczmarczuk
<·······@info.unicaen.fr> wrote:

> And, anyway all this is *rubbish*.

Well, I got some of the details wrong, but none of them impact the
soundness of the argument.

> The "known universe" is not the full universe

That may be, but anything outside of our light cone is irrelevant.

> The time to get through a proton has *nothing* to do with the Planck
> length (and related Planck time = length/c).

That's true.  I misspoke.  But the number and its physical significance are
still correct, which is all that matters to the argument.

> Then, "smallest possible scale on which physical processes can occur"
> being the Planck length is another untruth.

I refer you to http://www.iscid.org/encyclopedia/Planck_time

"Planck time, 10^-43 seconds, is the fundamental unit of time which cannot
be broken down into smaller, meaningful units.  The number is based on the
time it would take a photon travelling at the speed of light to cross a
distance equal to the Planck length."

I maintain that "smallest possible time at which physical processes can
occur" is a logical consequence of "fundamental unit of time which cannot
be broken down into smaller, meaningful units."


> > It is not yet clear whether quantum computation can really be made to
> > work.  It may be that the practical difficulty of keeping N qbits mutually
> > entangled grows exponentially with N, in which case the same argument may
> > well apply.  We just don't know yet.  
> 
> First, the problem is *NOT* with "keeping the qubits entangled". They entangle
> themselves quite fast, and they got entangled with the environment, which is
> called the "decoherence". It is the other way round, the keeping of coherence
> which is delicate.

Right.  I should have said, "It may be that the practical difficulty of
keeping N mutually entangled qbits from decohering grows exponentially
with N."


> (But, besides, it is KNOWN that quantum computation works in toy cases.)

Of course.  But so what?  It is also known not to work in non-toy cases
given current technology.  It is not yet known whether the limiting factor
is technological or physical.


> I suggest moving this thread to some .alt. limbo independently of
> its ambitions, since it is not based on any qualified knowledge.

Since when is qualified knowledge a prerequisite for posting to usenet?

E.
From: Jerzy Karczmarczuk
Subject: Re: Computability in principle
Date: 
Message-ID: <3FAA150C.5070407@info.unicaen.fr>
Erann Gat wrote:
> Jerzy Karczmarczuk wrote:
> 
>>And, anyway all this is *rubbish*.
> 
> Well, I got some of the details wrong, but none of the impact the
> soundness of the argument.

I am sorry, but I disagree.


>>The "known universe" is not the full universe
> 
> That may be, but anything outside of our light cone is irrelevant.

1. The global structure of the universe, its topology etc. extends far
    beyond our observations, and when you play with the state counting, you
    have to include all this. BTW, in your "sound" arguments you never
    mentioned relativity.
    The causally disconnected parts of the Universe ARE observed though, and
    the general homogeneity is a strong methodological filter of cosmological
    models. You can't really count the number of states without such a model.
    Even worse, you can't say anything about the information flow from one
    space sector to another.

2. Quantum processes live outside the light cone. And influence it. The
    horizon of a black hole radiates. What is the density of particles, and
    thus the entropy, knowing that the properties of Hawking radiation
    depend on the observer?
    And don't forget that even without black holes, an accelerating observer
    can "manufacture" a horizon and observe the Unruh radiation...

I continue to claim that you don't have any serious way of computing the number
of states in the universe.


>>Then, "smallest possible scale on which physical processes can occur"
>>being the Planck length is another untruth.
> 
> 
> I refer you to http://www.iscid.org/encyclopedia/Planck_time
> 
> "Planck time, 10^-43 seconds, is the fundamental unit of time which cannot
> be broken down into smaller, meaningful units.  The number is based on the
> time it would take a photon travelling at the speed of light to across a
> distance equal to the Planck length."
> 
> I maintain that "smallest possible time at which physical processes can
> occur" is a logical consequence of "fundamental unit of time which cannot
> be broken down into smaller, meaningful units."

Dear Sir, you will refer me to a popular Internet encyclopaedia as to a Final
Truth? These are religious statements. Surely, NOBODY knows what happens within
the Planck scale, since the very notion of particle is based on asymptotic
(far away) reasoning, and because of the - possible - quantum metric fluctuation
you don't have this luxury at that scale. So what? There is also the Planck mass
equal to about 10^(-5) g. What's so fundamental about it? NOTHING, unless you
issue such empty statements as "this is the maximum mass of an elementary
particle, otherwise it would collapse into a black hole". Saying that no
"physical processes" can go beyond blah blah is presumptuous. The ISCID society
which owns that encyclopaedia committed a sin of pride. What does it mean that
you cannot split the Planck time into smaller units? This is silly; both quantum
physics and gravitation operate with *continuous* time. Strings as well.
The very phrase "meaningful unit" is totalitarian. Meaningful for whom? For
70 years at least people have speculated about the discretization of spacetime,
but nobody has ever got anything serious.

Do you (and these ISCID people) really believe that we know the ultimate truth
about all possible physical processes?


>>I suggest moving this thread to some .alt. limbo independently of
>>its ambitions, since it is not based on any qualified knowledge.
> 
> Since when is qualified knowledge a prerequisite for posting to usenet?

Right. Sorry. Go ahead.
With age I just become more and more naive: I thought that if some people
know a scientific domain very badly, then first they would ask questions, and
only then propose some answers.



Jerzy Karczmarczuk
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611030912290001@192.168.1.51>
In article <················@info.unicaen.fr>, Jerzy Karczmarczuk
<·······@info.unicaen.fr> wrote:

> Erann Gat wrote:
> > Jerzy Karczmarczuk wrote:
> > 
> >>And, anyway all this is *rubbish*.
> > 
> > Well, I got some of the details wrong, but none of the impact the
> > soundness of the argument.
> 
> I am sorry, but I disagree.

I am sorry, but your counterarguments are rubbish.  You are arguing
against a straw man.

> You can't really count the number of states without such a model.
...
> you never mentioned relativity.

That's right, because my claim does not rely on counting the actual number
of states that could be enumerated.  All I am trying to do is establish an
upper bound on that number.  The upper bound is already so absurdly small
that the actual number of enumerable states doesn't matter.  The
conclusion stands: any formalist argument that relies on exhaustive
enumeration of programs or proofs is not worthy of serious consideration.

It doesn't much matter how you do the calculation.  Fill the entire volume
of a light cone 10 billion years wide with elementary particles on the
Planck scale, have them all performing computations at the Planck
frequency, and you can still only enumerate about 100 bytes.  (Verifying this
is left as an exercise for the reader.)  Our ability to conceive of large
numbers (and of NP-hard problems to generate them) vastly outstrips the
capacity of the universe to reify them.
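
For what it's worth, here is a rough Lisp version of that exercise (the
radius is taken as ten billion light years, and all the constants are
ballpark figures, so treat the result as order-of-magnitude only):

  (let* ((light-year    9.46d15)              ; metres
         (radius        (* 1d10 light-year))  ; ten billion light years
         (volume        (* 4/3 pi (expt radius 3)))
         (planck-length 1.6d-35)              ; metres
         (planck-time   5.4d-44)              ; seconds
         (seconds       (* 1d10 3.156d7))     ; ten billion years
         (states (* (/ volume (expt planck-length 3))
                    (/ seconds planck-time)))
         (bits   (/ (log states) (log 2d0))))
    (round bits 8))
  ;; => roughly 100 (bytes)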

Moreover:

> 1. The global structure of the universe, its topology etc. extends far
>     beyond our observations,

I'll respond to this with your own words:

> These are religious statements.

E.
From: Ray Blaak
Subject: Re: Computability in principle
Date: 
Message-ID: <uwuadnw28.fsf@STRIPCAPStelus.net>
···@jpl.nasa.gov (Erann Gat) writes:
> That's right, because my claim does not rely on counting the actual number
> of states that could be enumerated.  All I am trying to do is establish an
> upper bound on that number.  The upper bound is already so absurdly small
> that the actual number of enumerable states doesn't matter.  The
> conclusion stands: any formalist argument that relies on exhaustive
> enumeration of programs or proofs is not worthy of serious consideration.

You do not need to wonder about the number of particles, etc. in the universe
to show this. It is enough to show that you get exponential growth in the
number of states as you add bits.
-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faac27e$1@news.unimelb.edu.au>
···@jpl.nasa.gov (Erann Gat) writes:

>Fill the entire volume
>of a light cone 10 billion years wide with elementary particles on the
>Planck scale and have them all performing computations at the Planck
>frequency, you can still only enumerate 100 bytes.

You know, you can do quite a bit in 100 bytes of code.
Particularly if you're allowed to choose the instruction set.

(Kids these days think you need 100 *megabytes* to do anything reasonable.
But back when I was young...
I had a PC with 6k of RAM, and 4k of that was video RAM! :)

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611031514020001@k-137-79-50-101.jpl.nasa.gov>
In article <··········@news.unimelb.edu.au>, Fergus Henderson
<···@cs.mu.oz.au> wrote:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> >Fill the entire volume
> >of a light cone 10 billion years wide with elementary particles on the
> >Planck scale and have them all performing computations at the Planck
> >frequency, you can still only enumerate 100 bytes.
> 
> You know, you can do quite a bit in 100 bytes of code.

That may well be, but it will take ten billion years (at least) to find
out what it is.  I don't want to wait that long.

> Particularly if you're allowed to choose the instruction set.

If you're allowed to choose the instruction set you don't need any code at
all.  All you need is one "instruction" that does whatever it is you want
to do.

> (Kids these days think you need 100 *megabytes* to do anything reasonable.
> But back when I was young...
> I had a PC with 6k of RAM, and 4k of that was video RAM! :)

Yeah, but how many times can you play Hunt the Wumpus before it starts to
get old?

E.
From: Will Hartung
Subject: Re: Computability in principle
Date: 
Message-ID: <boervj$184i5h$1@ID-197644.news.uni-berlin.de>
"Erann Gat" <···@jpl.nasa.gov> wrote in message
·························@k-137-79-50-101.jpl.nasa.gov...
> If you're allowed to choose the instruction set you don't need any code at
> all.  All you need is one "instruction" that does whatever it is you want
> to do.

This reminds me of someone (don't remember who, Asimov perhaps) talking
about storing knowledge using, I think, "Godelization".

Essentially, for example, you take all human knowledge, digitize it, and
turn it into a huge integer.

Then, you simply place a decimal point in front of the number, draw a line
on a stick to represent that ratio, and voila, all of humanity's knowledge
on a stick.

Remember to mark a "this end up" on it. Decoding is left as an exercise for
the reader.
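
A toy Lisp rendition of the encoding half (the stick, the mark, and the
infinitely precise measurement are left out), treating the text as one
big base-256 number:

  ;; ASCII-only toy: pack a string into a single integer and back.
  (defun string->integer (s)
    (reduce (lambda (acc ch) (+ (* acc 256) (char-code ch)))
            s :initial-value 0))

  (defun integer->string (n)
    (loop with chars = '()
          while (plusp n)
          do (push (code-char (mod n 256)) chars)
             (setf n (floor n 256))
          finally (return (coerce chars 'string))))

  (integer->string (string->integer "all human knowledge"))
  ;; => "all human knowledge"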

Regards,

Will Hartung
(·····@msoft.com)
From: Sashank Varma
Subject: Re: Computability in principle
Date: 
Message-ID: <none-61A0AE.18521606112003@news.vanderbilt.edu>
> > Particularly if you're allowed to choose the instruction set.
> 
> If you're allowed to choose the instruction set you don't need any code at
> all.  All you need is one "instruction" that does whatever it is you want
> to do.

00000000 DWIM     ; done!  time to read comp.lang.lisp!
From: Christopher C. Stacy
Subject: Re: Computability in principle
Date: 
Message-ID: <uznf83hnj.fsf@dtpq.com>
>>>>> On Thu, 06 Nov 2003 15:14:02 -0800, Erann Gat ("Erann") writes:
 >> (Kids these days think you need 100 *megabytes* to do anything reasonable.
 >> But back when I was young...
 >> I had a PC with 6k of RAM, and 4k of that was video RAM! :)

 Erann> Yeah, but how many times can you play Hunt the Wumpus
 Erann> before it starts to get old?

MACLISP, the precursor of Lisp Machine Lisp (of which
Common Lisp is essentially a subset), ran in less than 60 K.

(The PDP-10 had 36-bit fixed-size memory words,
rather than 8-bit variable length instructions,
so that equals about 295 KB).
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fab3668$1@news.unimelb.edu.au>
···@jpl.nasa.gov (Erann Gat) writes:

>Fergus Henderson <···@cs.mu.oz.au> wrote:
>
>> (Kids these days think you need 100 *megabytes* to do anything reasonable.
>> But back when I was young...
>> I had a PC with 6k of RAM, and 4k of that was video RAM! :)
>
>Yeah, but how many times can you play Hunt the Wumpus before it starts to
>get old?

It had much better games than Hunt the Wumpus -- for example,
there was a quite reasonable version of space invaders.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Matthias
Subject: Re: Computability in principle
Date: 
Message-ID: <36w7k2dn292.fsf@chagall.ti.uni-mannheim.de>
···@jpl.nasa.gov (Erann Gat) writes:

> In article <················@info.unicaen.fr>, Jerzy Karczmarczuk
> <·······@info.unicaen.fr> wrote:
> 
> > And, anyway all this is *rubbish*.
> 
> Well, I got some of the details wrong, but none of the impact the
> soundness of the argument.

I didn't understand your initial argument: In the conclusion to your
original post you say:

>...> I submit therefore that any formalist program (I use the word here
>...> in the sense of "Hilbert's program", not a computer program) based
>...> on exhaustive enumeration of programs (in the computer-program sense
>...> this time) has no hope of having any practical utility, and is
>...> therefore not worthy of serious consideration.

Is your argument that we shall not use enumeration in any thought
experiments or proofs because it is impractical?  Then we probably
should also refrain from elementary geometry (because of quantum
effects there is no such thing as a /circle/ or a /straight line/ in
the real world) and from math in general.

And while we are at it: If "practical utility" is the only thing
"worthy of serious consideration" we might also close down NASA and
spend the money on more-boring-but-more-practical research.  Not to
speak of all those other things that make life interesting, beautiful,
and worthwhile.
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611030933470001@192.168.1.51>
In article <···············@chagall.ti.uni-mannheim.de>, Matthias
<··@spam.pls> wrote:

> ···@jpl.nasa.gov (Erann Gat) writes:

> >...> I submit therefore that any formalist program (I use the word here
> >...> in the sense of "Hilbert's program", not a computer program) based
> >...> on exhaustive enumeration of programs (in the computer-program sense
> >...> this time) has no hope of having any practical utility, and is
> >...> therefore not worthy of serious consideration.
> 
> Is your argument that we shall not use enumeration in any thought
> experiments or proofs because it is impractical?

Hm, good point.  My conclusion was worded too broadly.  Perhaps, "Any
formalist claim to practical applicability whose supporting argument
relies on exhaustive enumeration..."  Prunesquallor's argument for which
he invoked the "universal algorithm" is what I particularly had in mind.

Exhaustive enumeration can be useful to establish negative results, e.g.
"Even if we could exhaustively enumerate we still couldn't compute X."  So
perhaps one could even strengthen the conclusion to something like "Any
claim of the form, 'If we could enumerate then we could do X...'..."

> And while we are at it: If "practical utility" is the only thing
> "worthy of serious consideration" we might also close down NASA and
> spend the money on more-boring-but-more-practical research.

Since I'm posting from a JPL account I can't comment on the desirability
of shutting down NASA.  However, I will say that NASA pretty much bends
over backwards to ensure that the research that it does is practically
applicable, at least in principle.

> Not to
> speak of all those other things that make life interesting, beautiful,
> and worthwhile.

Speaking for myself only here, the things that make life interesting,
beautiful and worthwhile very rarely involve theoretical arguments of any
sort.  But I am not a theorist.

E.
From: Matthias Blume
Subject: Re: Computability in principle
Date: 
Message-ID: <m1oevp4bkv.fsf@tti5.uchicago.edu>
···@jpl.nasa.gov (Erann Gat) writes:

> Perhaps, "Any formalist claim to practical applicability whose
> supporting argument relies on exhaustive enumeration..."
> Prunesquallor's argument for which he invoked the "universal
> algorithm" is what I particularly had in mind.

That "algorithm" does not work anyway, even if enumeration would be
practical.

In any case, I thought that Prunesquallor (or is that Joe?) brought
forward this algorithm precisely with the goal of debunking those
formalists' claims, basically arguing that once you have a formal
correctness claim you trivially have a correct algorithm.  But he
falsely equated being formal with being computable, so the line of
reasoning breaks down right there.

Matthias
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611031028190001@k-137-79-50-101.jpl.nasa.gov>
In article <··············@tti5.uchicago.edu>, Matthias Blume
<····@my.address.elsewhere> wrote:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> > Perhaps, "Any formalist claim to practical applicability whose
> > supporting argument relies on exhaustive enumeration..."
> > Prunesquallor's argument for which he invoked the "universal
> > algorithm" is what I particularly had in mind.
> 
> That "algorithm" does not work anyway, even if enumeration were
> practical.

Right, but we were going around in circles on that line of argumentation
so I thought I'd try a different approach.

> In any case, I thought that Prunesquallor (or is that Joe?) brought
> forward this algorithm precisely with the goal of debunking those
> formalists' claims, basically arguing that once you have a formal
> correctness claim you trivially have a correct algorithm.  But he
> falsely equated being formal with being computable, so the line of
> reasoning breaks down right there.

Prunesquallor/Joe/whoever-you-are, you want to weigh in on this?

E.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <oevoyn79.fsf@ccs.neu.edu>
···@jpl.nasa.gov (Erann Gat) writes:

>
> Prunesquallor/Joe/whoever-you-are, you want to weigh in on this?
>

I guess.

Matthias is correct that where I was going was to argue that a formal
enough specification *is* an algorithm in all but name.  (And let me
qualify that:  If the formal spec is wrong, then algorithms derived
from it are wrong.  If the formal spec is undecidable, then algorithms
derived from it are undecidable.  If the formal spec is not
computable, then no algorithm can be derived from it.  And I am
talking only about formal constructive proofs (thanks to Eli Barzilay
for pointing that out), but I have heard that one can simulate
general proofs with constructive formalisms.)

Matthias is wrong in saying that I wished to equate formalism with
computability, though.  Obviously a rich enough formal language would
allow you to write non-computable (or even inconsistent)
specifications.  I was trying to say that the raison d'etre of
formalism was the ability to perform mechanical reasoning upon them.

It may be that your formalism is non-computable, but the reason you
wrote it down formally is so that you would have a set of symbols you
could push around with a computer and maybe find out something.  A
non-computable specification in this sense would mean that you keep
pushing symbols around 'til doomsday and not get anywhere.
Presumably, though, that isn't your intent.  You hope that pushing
symbols around will lead to a contradiction or tautology or something.
I imagine that most formal specifications are written down *with the
intent* of mechanically proving them correct, even though there are no
guarantees that this can be done.

I was attempting to point out that non-computable specifications have
rather limited utility because `all divergent functions compute the
same value'.  I can see little utility in an undecidable specification
because there would be no way to determine if something met it.  I
only wanted to argue about the subset of specifications that were
decidable, because presumably that is the `interesting' subset.

I certainly wasn't denying the existence of undecidable or
non-computable specifications, only the utility of them.

I hope this clarifies things.
From: Matthias Blume
Subject: Re: Computability in principle
Date: 
Message-ID: <m2znf65hgm.fsf@hanabi-air.shimizu.blume>
Joe Marshall <···@ccs.neu.edu> writes:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> >
> > Prunesquallor/Joe/whoever-you-are, you want to weigh in on this?
> >
> 
> I guess.
> 
> Matthias is correct that where I was going was to argue that a formal
> enough specification *is* an algorithm in all but name.

I have already provided you with a concrete example where this is not
true.  I would even go so far as to say that non-trivial formal
correctness statements will /rarely/ be computable (in the sense that
they give you a decision procedure for the result of a computation
satisfying the correctness criterion).

> I was trying to say that the raison d'etre of formalism was the
> ability to perform mechanical reasoning upon them.

But that does not require the formalism to consist of computable
things only.

> It may be that your formalism is non-computable, but the reason you
> wrote it down formally is so that you would have a set of symbols you
> could push around with a computer and maybe find out something.

No, the reason that I write it down is that I might be able to find a
proof for the statement that I am after.  Yes, one can use computers
to help in finding such a proof. In fact, if a proof exists then the
computer can always find it -- in principle.  Of course, the trouble is
that we usually do not know whether a proof exists until we find one.

> A non-computable specification in this sense would mean that you
> keep pushing symbols around 'til doomsday and not get anywhere.
> Presumably, though, that isn't your intent.  You hope that pushing
> symbols around will lead to a contradiction or tautology or
> something.

I am not sure where you picked up the idea that I want to "push around
symbols".

> I imagine that most formal specifications are written down *with the
> intent* of mechanically proving them correct, even though there are no
> guarantees that this can be done.

You are imagining incorrectly.  Many people are happy if they can
*verify* a *given* proof mechanically.  For that you need a fully
formalized logic, too.

I am not at all saying that the computer should prove program
correctness all on its own. It would be nice if it could, but it
can't.  The other extreme is to have the programmer provide the whole
proof and have the machine check it.  For example, PCC can operate in
this mode.  But, indeed, much research is done on generating proofs
for PCC automatically or at least semi-automatically.  I expect that
if the technology ever gets mature enough, it will be of the
"semi-automatic" nature where the computer does much of the routine
work and where the programmer must give a few crucial hints here and
there.

> I can see little utility in an undecidable specification
> because there would be no way to determine if something met it.

Yes, with what you call an "undecidable specification"(*) there is no way
of providing a procedure for deciding whether an /arbitrary/ program
meets the specification.  But that's not what we need.  All we need is
being able to show that a /particular/ program is correct.

> I only wanted to argue about the subset of specifications that were
> decidable, because presumably that is the `interesting' subset.

Well, hopefully I have convinced you that the interesting subset is
far larger.

Matthias

PS: Let's be clear here: The set of programs that satisfies a
non-trivial specification is *always* undecidable.  (That's what
Rice's theorem says.)  What you refer to when you say "decidable
specification" is a predicate P(x,y) which is true iff y is the
correct output of the program that corresponds to input x.  For many
useful specifications, this P(x,y) is decidable, for many other useful
specifications it is not.
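
To make that concrete, here is a small Python sketch of a decidable
P(x,y) for a sorting specification; the names are invented for this
illustration, and the analogous P for full program equivalence would of
course not be computable.

  from collections import Counter

  def is_sorted(seq):
      # nondecreasing order
      return all(a <= b for a, b in zip(seq, seq[1:]))

  def P(x, y):
      # True iff y is a correct result of sorting x: same multiset of
      # (hashable) elements, arranged in nondecreasing order.  Both
      # checks terminate, so this particular P is decidable.
      return Counter(x) == Counter(y) and is_sorted(list(y))

  assert P([3, 1, 2], [1, 2, 3])
  assert not P([3, 1, 2], [1, 2])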
From: ·············@comcast.net
Subject: Re: Computability in principle
Date: 
Message-ID: <r80h190c.fsf@comcast.net>
Matthias Blume <····@my.address.elsewhere> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>
>> ···@jpl.nasa.gov (Erann Gat) writes:
>> 
>> >
>> > Prunesquallor/Joe/whoever-you-are, you want to weigh in on this?
>> >
>> 
>> I guess.
>> 
>> Matthias is correct that where I was going was to argue that a formal
>> enough specification *is* an algorithm in all but name.
>
> I have already provided you with a concrete example where this is not
> true.  

Yes, but not with enough information for me to evaluate it.

> I would even go so far as to say that non-trivial formal correctness
> statements will /rarely/ be computable (in the sense that they give
> you a decision procedure for the result of a computation satisfying
> the correctness criterion).

Is this the equivalent of saying `If the formal correctness statement
is non-trivial, then there is no algorithm to determine whether some
particular computation satisfies it'?

This seems absurd.  For instance, your specification for a compiler
says that the object code computes the same value as the source code.
So I put `(+ 2 3)' in as the source code and the compiler emits a
`load 5' instruction and you're telling me that I cannot determine
whether *that particular* compilation is correct?
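
For what it's worth, checking one particular case is easy to write
down.  A toy Python sketch, with a one-instruction `machine' invented
purely for illustration:

  def run_object_code(instructions):
      # a toy accumulator machine, just enough for this example
      acc = None
      for op, arg in instructions:
          if op == "load":
              acc = arg
          elif op == "add":
              acc = acc + arg
          else:
              raise ValueError("unknown instruction: " + op)
      return acc

  source_value = 2 + 3                            # the meaning of (+ 2 3)
  object_value = run_object_code([("load", 5)])   # the emitted code
  assert source_value == object_value             # this one case is checkable

Which, of course, checks one compilation, not the compiler.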

Or going the other way around:  if a formal correctness statement
generally doesn't give you a meaningful decision process for
determining whether the result of a particular computation is correct
or not, what the hell is it good for?  Why bother writing `the result
of the program will have property X' if there is no way to determine
if any particular result has that property?

>> I was trying to say that the raison d'etre of formalism was the
>> ability to perform mechanical reasoning upon them.
>
> But that does not require the formalism to consist of computable
> things only.

Of course not.

It does `require' in some sense that one can `compute' (i.e.,
syntactically manipulate) using the formalism, though --- at least in
some interesting case.  If all your rules for manipulating the
formalism are uncomputable (`you can beta-reduce form (X Z) iff there
exists a Y such that (X Y) is not a theorem'), there would be little
point in bothering to formalize.

>> It may be that your formalism is non-computable, but the reason you
>> wrote it down formally is so that you would have a set of symbols you
>> could push around with a computer and maybe find out something.
>
> No, the reason that I write it down is that I might be able to find a
> proof for the statement that I am after.  Yes, one can use computers
> to help in finding such a proof.  In fact, if a proof exists then the
> computer can always find it -- in principle.  Of course, the trouble is
> that we usually do not know whether a proof exists until we find one.

This seems to directly contradict your above statement:  `non-trivial
formal correctness statements will /rarely/ be computable'

At the basic level I'm saying `formalism is either useful or it isn't'
and you seem to be arguing against both viewpoints.

>> A non-computable specification in this sense would mean that you
>> keep pushing symbols around 'til doomsday and not get anywhere.
>> Presumably, though, that isn't your intent.  You hope that pushing
>> symbols around will lead to a contradiction or tautology or
>> something.
>
> I am not sure where you picked up the idea that I want to "push around
> symbols".

If you write something down formally, you are writing it down with
enough precision that you can manipulate it symbolically.  Are you
doing this for sheer masochism?  I don't think it is unreasonable to
assume that you invented a formal symbolic notation in order to use
it.

>> I imagine that most formal specifications are written down *with the
>> intent* of mechanically proving them correct, even though there are no
>> guarantees that this can be done.
>
> You are imagining incorrectly.  Many people are happy if they can
> *verify* a *given* proof mechanically.  For that you need a fully
> formalized logic, too.

I can see no useful distinction between your second sentence and mine.
Do people write things down with an intent?  One hopes.  Do they want
to prove something?  Apparently.  Can they?  I say it isn't
guaranteed, you say they are happy if they can do it.  It really
sounds like it is an iffy proposition, but sometimes achievable.

> I am not at all saying that the computer should prove program
> correctness all on its own. It would be nice if it could, but it
> can't.  

Nor am I.

> The other extreme is to have the programmer provide the whole
> proof and have the machine check it.  For example, PCC can operate in
> this mode.  But, indeed, much research is done on generating proofs
> for PCC automatically or at least semi-automatically.  I expect that
> if the technology ever gets mature enough, it will be of the
> "semi-automatic" nature where the computer does much of the routine
> work and where the programmer must give a few crucial hints here and
> there.

This *directly* contradicts your statement above:  non-trivial formal
correctness statements will /rarely/ be computable (in the sense that
they give you a decision procedure for the result of a computation
satisfying the correctness criterion).

Can you, or can you not have the computer check a non-trivial proof?
If you can, please stop telling me that you cannot, and if you cannot,
why is everyone doing it?

>> I can see little utility in an undecidable specification
>> because there would be no way to determine if something met it.
>
> Yes, with what you call an "undecidable specification"(*) there is no way
> of providing a procedure for deciding whether an /arbitrary/ program
> meets the specification.  But that's not what we need.  All we need is
> being able to show that a /particular/ program is correct.

Look, either there exists a program for which one can determine if it
meets the spec, or there is no program for which one can determine if
it meets the spec.  If there is *no program* for which one can
determine whether or not it meets the spec, then exactly what use is
the spec?  (If the spec cannot say `yea' or `nay' about anything, why
do we have it?)

If the spec, however, *can* say `yea' or `nay' to at least *one*
program, then we can enumerate programs until the spec says `yea'
or `nay'.

>> I only wanted to argue about the subset of specifications that were
>> decidable, because presumably that is the `interesting' subset.
>
> Well, hopefully I have convinced you that the interesting subset is
> far larger.

No, you haven't.  I can see no use for a specification for which one
cannot extract information.

> Matthias
>
> PS: Let's be clear here: The set of programs that satisfies a
> non-trivial specification is *always* undecidable.  (That's what
> Rice's theorem says.)  

I know that.  I don't want to count or enumerate that set, nor do I
care if I cannot determine membership *in general*.  I just want to 
determine membership for *some* subset.  

If I propose a specification for which it is impossible to determine
if anything can satisfy it, that's rather pointless, no?

The set of all programs that would have been aesthetically pleasing to
Charles Babbage is an example of one such specification.  It is an
absolutely useless specification.

> What you refer to when you say "decidable
> specification" is a predicate P(x,y) which is true iff y is the
> correct output of the program that corresponds to input x.  For many
> useful specifications, this P(x,y) is decidable, for many other useful
> specifications it is not.

Agreed.  But if there exists *no* x,y pair for which P(x,y) can be
computed, then P is not a very useful construct.
From: Stephen J. Bevan
Subject: Re: Computability in principle
Date: 
Message-ID: <m3brrlvtvf.fsf@dino.dnsalias.com>
·············@comcast.net writes:
> Matthias Blume <····@my.address.elsewhere> writes:
> > Joe Marshall <···@ccs.neu.edu> writes:
> >> Matthias is correct that where I was going was to argue that a formal
> >> enough specification *is* an algorithm in all but name.
> >
> > I have already provided you with a concrete example where this is not
> > true.  
> 
> Yes, but not with enough information for me to evaluate it.

Here's a simple example :-

  sqrt x = y where x = y*y

This definition can be used to check that a proposed implementation of
square root calculates the right answer (with respect to this
definition) but it doesn't provide any obvious way to calculate the
square root of a value other than by an exhaustive search which isn't
a very effective algorithm.
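
To see both halves of this in code, here is a throwaway Python sketch,
restricted to exact integer square roots so the equality test is
trivial (the function names are mine):

  def meets_spec(x, y):
      # the specification itself: x = y*y
      return y * y == x

  def sqrt_by_search(x):
      # the only algorithm the spec directly suggests: exhaustive search
      # over candidates (loops forever if x is not a perfect square)
      y = 0
      while not meets_spec(x, y):
          y += 1
      return y

  assert meets_spec(144, 12)          # checking a proposed answer is easy
  assert sqrt_by_search(144) == 12    # deriving one this way does not scale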
From: Jens Axel Søgaard
Subject: Re: Computability in principle
Date: 
Message-ID: <3faef8d9$0$69936$edfadb0f@dread12.news.tele.dk>
Stephen J. Bevan wrote:
> Here's a simple example :-
> 
>   sqrt x = y where x = y*y
> 
> This definition can be used to check that a proposed implementation of
> square root calculates the right answer (with respect to this
> definition) but it doesn't provide any obvious way to calculate the
> square root of a value other than by an exhaustive search which isn't
> a very effective algorithm.

What is the sign of y?

-- 
Jens Axel Søgaard
From: Stephen J. Bevan
Subject: Re: Computability in principle
Date: 
Message-ID: <m365hswuet.fsf@dino.dnsalias.com>
Jens Axel Søgaard <······@jasoegaard.dk> writes:
> Stephen J. Bevan wrote:
> > Here's a simple example :-
> >   sqrt x = y where x = y*y
> > This definition can be used to check that a proposed implementation
> > of square root calculates the right answer (with respect to this
> > definition) but it doesn't provide any obvious way to calculate the
> > square root of a value other than by an exhaustive search which isn't
> > a very effective algorithm.
> 
> What is the sign of y?

the?
From: Pascal Bourguignon
Subject: Re: Computability in principle
Date: 
Message-ID: <87u15clh13.fsf@thalassa.informatimago.com>
·······@dino.dnsalias.com (Stephen J. Bevan) writes:

> Jens Axel Søgaard <······@jasoegaard.dk> writes:
> > Stephen J. Bevan wrote:
> > > Here's a simple example :-
> > >   sqrt x = y where x = y*y
> > > This definition can be used to check that a proposed implementation
> > > of square root calculates the right answer (with respect to this
> > > definition) but it doesn't provide any obvious way to calculate the
> > > square root of a value other than by an exhaustive search which isn't
> > > a very effective algorithm.
> > 
> > What is the sign of y?
> 
> the?

A (mathematical) function is not defined by an expression.

A function is defined by a source domain, a target domain, and a rule
relating each element of the source to its image; that rule may be
given by an expression, but need not be.


So, you may have for example:

    sqrt-1 : R+ ---> R+
             x |---> y  such that y*y = x

or:

    sqrt-2 : R+ ---> R-
             x |---> y  such that y*y = x

or:

    sqrt-3 : R- ---> C
             x |---> y  such that y*y = x
    (and then, yes, y has no sign, what would be the sign of a complex?)

or:

    sqrt-4 : R+ ---> RxR
             x |---> (y,y') such that y<=0 and y*y=x and y'>=0 and y'*y'=x
which you can write as Y*Y=x assuming Y=(y,y') with y<=0, y'>=0 and:
         * : RxR x RxR ---> R
      ((x,x'),(y,y')) |----> z if z = x*y and z = x'*y'.

or:

    sqrt-5 : {0,1} ----> {0,1}
                x |----> 0 if x=0
                         1 if x=1  (which you can also write as y*y=x).

or:

    sqrt-6 : R^(n*n) ----> R^(n*n)
                  x |----> y such that y*y = x
    (and then, yes, y has no sign, what would be the sign of a matrix?)

and you can keep inventing other functions f such that f(x)*f(x)=x.
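
A throwaway Python rendering of the first two of these (math.sqrt
standing in for an exact square root):

  import math

  def sqrt_1(x):          # sqrt-1 : R+ ---> R+
      return math.sqrt(x)

  def sqrt_2(x):          # sqrt-2 : R+ ---> R-
      return -math.sqrt(x)

  # Both satisfy f(x)*f(x) = x (up to rounding); only the declared
  # target domain tells them apart.
  assert sqrt_1(4.0) == 2.0 and sqrt_2(4.0) == -2.0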


-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faf80a6$1@news.unimelb.edu.au>
Pascal Bourguignon <····@thalassa.informatimago.com> writes:

>·······@dino.dnsalias.com (Stephen J. Bevan) writes:
>
>> Jens Axel Søgaard <······@jasoegaard.dk> writes:
>> > Stephen J. Bevan wrote:
>> > > Here's a simple example :-
>> > >   sqrt x = y where x = y*y
>> > > This definition can be used to check that a proposed implementation
>> > > of square root calculates the right answer (with respect to this
>> > > definition) but it doesn't provide any obvious way to calculate the
>> > > square root of a value other than by an exhaustive search which isn't
>> > > a very effective algorithm.
>> > 
>> > What is the sign of y?
>> 
>> the?
>
>A (mathematical) function is not defined by an expression.

Yes, but a specification need not be complete.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Pascal Bourguignon
Subject: Re: Computability in principle
Date: 
Message-ID: <87llqnlily.fsf@thalassa.informatimago.com>
Fergus Henderson <···@cs.mu.oz.au> writes:

> Pascal Bourguignon <····@thalassa.informatimago.com> writes:
> 
> >·······@dino.dnsalias.com (Stephen J. Bevan) writes:
> >
> >> Jens Axel Søgaard <······@jasoegaard.dk> writes:
> >> > Stephen J. Bevan wrote:
> >> > > Here's a simple example :-
> >> > >   sqrt x = y where x = y*y
> >> > > This definition can be used to check that a proposed implementation
> >> > > of square root calculates the right answer (with respect to this
> >> > > definition) but it doesn't provide any obvious way to calculate the
> >> > > square root of a value other than by an exhaustive search which isn't
> >> > > a very effective algorithm.
> >> > 
> >> > What is the sign of y?
> >> 
> >> the?
> >
> >A (mathematical) function is not defined by an expression.
> 
> Yes, but a specification need not be complete.

Then I would hesitate to call it a specification.

What could be the use of an incomplete "specification"?

Just give a wish list and establish either a genie-victim or a
developer-user collaboration, and forget formalism.


-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fb06fb0$1@news.unimelb.edu.au>
Pascal Bourguignon <····@thalassa.informatimago.com> writes:

>Fergus Henderson <···@cs.mu.oz.au> writes:
>
>> Pascal Bourguignon <····@thalassa.informatimago.com> writes:
>> 
>> >·······@dino.dnsalias.com (Stephen J. Bevan) writes:
>> >
>> >> Jens Axel Søgaard <······@jasoegaard.dk> writes:
>> >> > Stephen J. Bevan wrote:
>> >> > > Here's a simple example :-
>> >> > >   sqrt x = y where x = y*y
>> >> > > This definition can be used to check that a proposed implementation
>> >> > > of square root calculates the right answer (with respect to this
>> >> > > definition) but it doesn't provide any obvious way to calculate the
>> >> > > square root of a value other than by an exhaustive search which isn't
>> >> > > a very effective algorithm.
>> >> > 
>> >> > What is the sign of y?
>> >> 
>> >> the?
>> >
>> >A (mathematical) function is not defined by an expression.
>> 
>> Yes, but a specification need not be complete.
>
>Then I would hesitate to call it a specification.
>
>What could be the use of an incomplete "specification"?

Proving that the program meets the requirements.

Generally, the requirements do _not_ constitute a complete specification.

For example, the specification for a sorting program may require that the
output be sorted on the keys, but not require that the sort be stable.
A formal specification can be used to prove that the output is indeed sorted.
This is useful, even though the specification is not complete.
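
A small Python sketch of such an incomplete spec (the names are
invented): the check accepts both a stable and an unstable sort,
because stability is simply not something it talks about.

  def sorted_on_keys(records, key):
      ks = [key(r) for r in records]
      return all(a <= b for a, b in zip(ks, ks[1:]))

  data = [("b", 1), ("a", 2), ("a", 1)]
  stable   = sorted(data, key=lambda r: r[0])          # keeps order of equal keys
  shuffled = sorted(data, key=lambda r: (r[0], r[1]))  # reorders equal keys

  assert sorted_on_keys(stable, key=lambda r: r[0])
  assert sorted_on_keys(shuffled, key=lambda r: r[0])
  assert stable != shuffled    # different outputs, both meet the incomplete spec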

For another example, the specification for a compiler may require that
the compiler report an error for any invalid program, but does not need
to specify exactly what the error message will be.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Pascal Bourguignon
Subject: Re: Computability in principle
Date: 
Message-ID: <87wua6kdna.fsf@thalassa.informatimago.com>
Fergus Henderson <···@cs.mu.oz.au> writes:
> For another example, the specification for a compiler may require that
> the compiler report an error for any invalid program, but does not need
> to specify exactly what the error message will be.

I could agree with your definition, but the users won't:

% valid-lisp -c test.lisp
ERROR.
% 


Or, let's try to see it the other way: how would you formally specify
that "the compiler reports an error for any invalid program"?


-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fb13718$1@news.unimelb.edu.au>
Pascal Bourguignon <····@thalassa.informatimago.com> writes:
>Or, let's try to see it  the other way: how would you formally specify
>that "the compiler reports an error for any invalid program"?

That would depend on the specification language chosen
and the source language which the compiler is supposed to compile
(and perhaps other things).

But it could be something as simple as the `:- assert' declaration
in the following Mercury specification:

	:- type target_program = string.
	:- type source_program = string.
	:- type error_messages = string.

	:- func compile(source_program) = compile_result.

	:- type compile_result ---> ok(target_program) ; error(error_messages).

	:- assert((not valid_program(Input)) => compile(Input) = error).

	:- pred valid_program(source_program).
	valid_program("dwim").

Obviously you might want to adjust the definition of valid_program/1
to suit your source language.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fb139d3$1@news.unimelb.edu.au>
Fergus Henderson <···@cs.mu.oz.au> writes:

>But it could be something as simple as the `:- assert' declaration
>in the following Mercury specification:
>
>	:- type target_program = string.
>	:- type source_program = string.
>	:- type error_messages = string.
>
>	:- func compile(source_program) = compile_result.
>
>	:- type compile_result ---> ok(target_program) ; error(error_messages).
>
>	:- assert((not valid_program(Input)) => compile(Input) = error).

Sorry, I should run the type checker on my specifications before posting
them.  That line should be

	:- assert((not valid_program(Input)) => compile(Input) = error(_)).

	                                                              ^^^
-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Stephen J. Bevan
Subject: Re: Computability in principle
Date: 
Message-ID: <m3vfpnsfc3.fsf@dino.dnsalias.com>
Pascal Bourguignon <····@thalassa.informatimago.com> writes:
> ·······@dino.dnsalias.com (Stephen J. Bevan) writes:
> 
> > Jens Axel Søgaard <······@jasoegaard.dk> writes:
> > > Stephen J. Bevan wrote:
> > > > Here's a simple example :-
> > > >   sqrt x = y where x = y*y
> > > > This definition can be used to check that a proposed implementation
> > > > of square root calculates the right answer (with respect to this
> > > > definition) but it doesn't provide any obvious way to calculate the
> > > > square root of a value other than by an exhaustive search which isn't
> > > > a very effective algorithm.
> > > 
> > > What is the sign of y?
> > 
> > the?
> 
> A (mathematical) function is not defined by an expression.
> 
> A function is defined by a  source domain, a target domain and a range
> of means, one of which may be an expression relating the source to the
> image.

[big snip]

The expression produces the correct result with various specific
sources and targets; I didn't expect an advocate of dynamic typing to
want them to be stated explicitly.
From: Matthias Blume
Subject: Re: Computability in principle
Date: 
Message-ID: <m2u15c4z8c.fsf@hanabi-air.shimizu.blume>
·············@comcast.net writes:

[ I snipped a whole lot including some of the things that I was going
to answer because what follows below seems to clarify the source of
the misunderstanding. ]

> Can you, or can you not have the computer check a non-trivial proof?
> If you can, please stop telling me that you cannot, and if you cannot,
> why is everyone doing it?

I never said that the computer cannot check a proof, trivial or not.
You can have the computer check *every* proof if it is written down
formally.

> >> I can see little utility in an undecidable specification
> >> because there would be no way to determine if something met it.
> >
> > Yes, with what you call an "undecidable specification"(*) there is no way
> > of providing a procedure for deciding whether an /arbitrary/ program
> > meets the specification.  But that's not what we need.  All we need is
> > being able to show that a /particular/ program is correct.
> 
> Look, either there exists a program for which one can determine if it
> meets the spec, or there is no program for which one can determine if
> it meets the spec.  If there is *no program* for which one can
> determine whether or not it meets the spec, then exactly what use is
> the spec?  (If the spec cannot say `yea' or `nay' about anything, why
> do we have it?)

It has just become clear to me: you are using the word "undecidable"
in a non-standard way.  A predicate P is usually called decidable if
there is a computable total function which returns "yes" where P holds
and "no" where P does not hold.  Other P we call undecidable.  Your
usage suggests that you don't call P undecidable if there is a subset of
P's domain where P is decidable (in the standard sense).

The spec can usually say "nay" to many things.  (That's, btw, what
testing is all about.)  Many specs will be able to say "nay" to *all*
things that don't satisfy the spec (but it will not be known in
advance how long it will take to say "nay").  If that is so, then the
negation of the spec is recursively enumerable.  On the other hand,
many specs will not be able to say "yea" to all things that satisfy
the spec (although there usually will be a way of deriving "yea" at
least for some correct things -- if they exist).
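
A minimal Python sketch of that "nay" side, assuming a decidable
per-case predicate P(x, y) and a candidate program that halts on every
input; both assumptions, and all the names, are mine:

  from itertools import count

  def refute(program, P):
      # Runs the candidate on inputs 0, 1, 2, ... and reports the first
      # counterexample.  It can say "nay" with a witness, but it can
      # never say "yea": passing n tests proves nothing about test n+1.
      for x in count():
          if not P(x, program(x)):
              return ("nay", x)

  buggy_square = lambda n: n * n if n < 100 else 0
  print(refute(buggy_square, lambda x, y: y == x * x))    # ('nay', 100)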

> If the spec, however, *can* say `yea' or `nay' to at least *one*
> program, then we can enumerate programs until the spec says `yea'
> or `nay'.

... unless there is no thing that satisfies the spec.  Indeed, if you
know that there are always satisfying elements, and that for some of
them there is a proof of correctness, then your technique would work.
In the compiler example, unless the target language is somehow
crippled, there always is an object program that is equivalent to the
given source program.  In this case you need to code up a search for a
program that is provably equivalent to the source program.  Just
"testing" the correctness statement, though, does not cut it.  You
have to code up the process of proving program equivalence.  Since
proving program equivalence is not possible in general, you have to
focus on specific programs -- most likely those that are obtained /by
construction from the source program/.  The process of constructing an
object program from the source program is what compilation is about in
the first place, though.  In effect, you have to prove your compiler
correct.  Again, one cannot prove /arbitrary/ compilers correct, so
you have to write one with provability of correctness in mind.

> No, you haven't.  I can see no use for a specification for which one
> cannot extract information.

I never said that one cannot extract information from a specification,
far from it.  All I said is that if the specification is some sort of
logical predicate P(x,y) on input values x and corresponding correct
output values y, then P does not have to be computable (in the
standard sense).

> If I propose a specification for which it is impossible to determine
> if anything can satisfy it, that's rather pointless, no?
> 
> The set of all programs that would have been aesthetically pleasing to
> Charles Babbage is an example of one such specification.  It is an
> absolutely useless specification.

That is not a specification at all as it cannot be formalized.

> > What you refer to when you say "decidable
> > specification" is a predicate P(x,y) which is true iff y is the
> > correct output of the program that corresponds to input x.  For many
> > useful specifications, this P(x,y) is decidable, for many other useful
> > specifications it is not.
> 
> Agreed.  But if there exists *no* x,y pair for which P(x,y) can be
> computed, then P is not a very useful construct.

That is true.

Matthias
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <islshx8t.fsf@ccs.neu.edu>
Matthias Blume <····@my.address.elsewhere> writes:

> It has just become clear to me: you are using the word "undecidable"
> in a non-standard way.  A predicate P is usually called decidable if
> there is a computable total function which returns "yes" where P holds
> and "no" where P does not hold.  Other P we call undecidable.  Your
> usage suggests that you don't call P undecidable if there is a subset of
> P's domain where P is decidable (in the standard sense).

I apologize for being non-standard and confusing.

> Indeed, if you know that there are always satisfying elements, and
> that for some of them there is a proof of correctness, then your
> technique would work.

Yes.  I think that this is one of the restrictions on the
`M-algorithm' that we were discussing so long ago.  It can generate a
program within a factor of 5 of optimal from a constructive proof, but
it relies on one already having the proof.  So using this M-algorithm
to determine if P=NP presupposes that you have already proven P=NP.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faff36b$1@news.unimelb.edu.au>
Joe Marshall <···@ccs.neu.edu> writes:

>Matthias Blume <····@my.address.elsewhere> writes:
>
>> Indeed, if you know that there are always satisfying elements, and
>> that for some of them there is a proof of correctness, then your
>> technique would work.
>
>Yes.  I think that this is one of the restrictions on the
>`M-algorithm' that we were discussing so long ago.  It can generate a
>program within a factor of 5 of optimal from a constructive proof,

Two points.  First, don't forget the constant factor -- it's important!
Second, what do you mean by "from a constructive proof"?
A constructive proof of what??

I think you are misremembering.  The M-algorithm takes as input an
algorithm, not a constructive proof.

>but
>it relies on one already having the proof.  So using this M-algorithm
>to determine if P=NP presupposes that you have already proven P=NP.

No, this is wrong/meaningless.  As we discussed already,
there is no easy way to use the M-algorithm to determine if P=NP.
What you can do is to use the M-algorithm to solve NP problems in
polynomial time, if P=NP is both true and provable in the formal system.
This does not require P=NP to have been already proven, just that a
proof exists.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <vfprorfa.fsf@ccs.neu.edu>
Fergus Henderson <···@cs.mu.oz.au> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>
>>Matthias Blume <····@my.address.elsewhere> writes:
>>
>>> Indeed, if you know that there are always satisfying elements, and
>>> that for some of them there is a proof of correctness, then your
>>> technique would work.
>>
>>Yes.  I think that this is one of the restrictions on the
>>`M-algorithm' that we were discussing so long ago.  It can generate a
>>program within a factor of 5 of optimal from a constructive proof,
>
> Two points.  First, don't forget the constant factor -- it's important!

Yeah, yeah.  My point is *not* to defend this!  This M-algorithm is
cute for the problems that it solves, but it only solves
`well-defined' problems.

> I think you are misremembering.  The M-algorithm takes as input an
> algorithm, not a constructive proof.

Here's what the paper says:
  ``Given a formal specification of a problem depending on some
    parameter x (element of X), we are interested in a fast algorithm
    computing solution y (element of Y).  This means that we are
    interested in a fast algorithm computing f: X->Y, where f is a
    formal (logical, mathematical, not necessarily algorithmic),
    specification of the problem.''

So f is some formal specification, not necessarily an algorithm, but
necessarily something you could manipulate with a computer.

> No, this is wrong/meaningless.  As we discussed already,
> there is no easy way to use the M-algorithm to determine if P=NP.

I *know* that M-algorithm cannot solve P=NP (probably, it could be
that P *does* = NP and that this is provable, but assuming that it
isn't).  Given that it *can* solve well-defined problems for which it
can find a proof, then `P=NP' must not be a `well-defined' `formal
specification' of the problem.
 
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fb061e5$1@news.unimelb.edu.au>
Joe Marshall <···@ccs.neu.edu> writes:

>Fergus Henderson <···@cs.mu.oz.au> writes:
>
>> I think you are misremembering.  The M-algorithm takes as input an
>> algorithm, not a constructive proof.
>
>Here's what the paper says:
>  ``Given a formal specification of a problem depending on some
>    parameter x (element of X), we are interested in a fast algorithm
>    computing solution y (element of Y).  This means that we are
>    interested in a fast algorithm computing f: X->Y, where f is a
>    formal (logical, mathematical, not necessarily algorithmic),
>    specification of the problem.''
>
>So f is some formal specification, not necessarily an algorithm, but
>necessarily something you could manipulate with a computer.

I stand corrected.  It takes as input either "a given algorithm ...
or, more generally a specification of a function".

However, a specification is not the same thing as a proof!

>> No, this is wrong/meaningless.  As we discussed already,
>> there is no easy way to use the M-algorithm to determine if P=NP.
>
>I *know* that M-algorithm cannot solve P=NP

Convince the rest of us, and earn your Turing award!

>(probably,

Ah, there's the rub.

>it could be that P *does* = NP and that this is provable, but assuming
>that it isn't).

You don't have any truly convincing grounds for that assumption.

>Given that it *can* solve well-defined problems for which it
>can find a proof, then `P=NP' must not be a `well-defined' `formal
>specification' of the problem.

The M-algorithm can only solve _solvable_ well-defined problems.
The claim in the paper is not that it solves all well-defined problems,
but rather that it "solves all well-defined problems *as quickly as
the fastest algorithm computing a solution* ... save for a factor 5
and ... additive terms".  If there is no algorithm computing a solution,
then this claim is empty.

Also, your statement here is confusing, because you blur the distinction
between (a) solving an NP-hard problem in P time and (b) "solving the
P=NP problem", i.e. either proving P=NP or proving P!=NP. 
If (a) is provably unsolvable, then (b) is solvable, with answer "P!=NP".

Even if you _assume_ that P=NP is not provable, that does not show
that the P=NP problem is "not well defined".  The other alternatives
are that P!=NP is provable, or that the "P=NP" problem is well-defined,
but unsolvable.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faf803f$1@news.unimelb.edu.au>
·············@comcast.net writes:

>Matthias Blume <····@my.address.elsewhere> writes:
>
>> I would even go so far as to say that non-trivial formal correctness
>> statements will /rarely/ be computable (in the sense that they give
>> you a decision procedure for the result of a computation satisfying
>> the correctness criterion).
>
>Is this the equivalent of saying `If the formal correctness statement
>is non-trivial, then there is no algorithm to determine whether some
>particular computation satisfies it'?
>
>This seems absurd.  For instance, your specification for a compiler
>says that the object code computes the same value as the source code.
>So I put `(+ 2 3)' in as the source code and the compiler emits a
>`load 5' instruction and you're telling me that I cannot determine
>whether *that particular* compilation is correct?

Sure you can tell particular cases.  But that's not a decision procedure.
A decision procedure has to be able to decide _all_ cases.
If you put `(+ 2 3)' in the source code, and the compiler emits code
of the form

	call find_counter_example_to_goldbach_conjecture
	load 5

plus an appropriate definition of find_counter_example_to_goldbach_conjecture,
is that correct?  Obviously it's pretty silly for a compiler to generate
such code, but if your specification doesn't say anything about performance
of the generated code, it will still meet the specification, *if* Goldbach's
Conjecture is false.  If Goldbach's conjecture is true, however, the generated
code will continue searching for a counterexample forever (or at least
until it runs out of memory), and so the generated program is not equivalent
to the source code.

A little more realistically, suppose the source program contained the
call to check Goldbach's conjecture, and the compiler generated an object
program in which that call had been optimized away.  Again proving correctness
would require determining whether Goldbach's conjecture holds.
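
Transcribed into brute-force Python purely to make the point (is_prime
and object_program are my names; find_counter_example_to_goldbach_conjecture
is the one used above), deciding whether this object program is
equivalent to a bare `load 5' is literally deciding the conjecture:

  def is_prime(n):
      return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

  def find_counter_example_to_goldbach_conjecture():
      n = 4
      # returns only if some even n >= 4 is not a sum of two primes
      while any(is_prime(p) and is_prime(n - p) for p in range(2, n)):
          n += 2
      return n

  def object_program():
      find_counter_example_to_goldbach_conjecture()   # the `call' part
      return 5                                        # the `load 5' part

  # object_program() == 2 + 3 holds iff Goldbach's conjecture is false,
  # so don't actually run it expecting an answer.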

>Or going the other way around:  if a formal correctness statement
>generally doesn't give you a meaningful decision process for
>determining whether the result of a particular computation is correct
>or not, what the hell is it good for?

It doesn't give you a decision process for determining whether the
result of _any_ computation is correct.  But it is useful because for
_some_ computations, you can indeed determine whether or not the result
is correct.

>Look, either there exists a program for which one can determine if it
>meets the spec, or there is no program for which one can determine if
>it meets the spec.

Sure; the former is the interesting case.

>If the spec, however, *can* say `yea' or `nay' to at least *one*
>program, then we can enumerate programs until the spec says `yea'
>or `nay'.

OK.  It's actually a lot more complicated than that, because it's really
the spec _plus a proof_ which says yea or nay, and the search for proofs
is infinite.  You can't just enumerate programs, otherwise as soon
as you get to the first program for which the spec says neither yea nor
nay, the search for a proof will loop indefinitely.  To make it actually
work, you have to interleave the search for programs with the search for
proofs.

But nevertheless, with that modification, such a search is possible.
So suppose you enumerate until the spec (+ proof) says "nay".   What then?

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-1011030857460001@192.168.1.51>
In article <··········@news.unimelb.edu.au>, Fergus Henderson
<···@cs.mu.oz.au> wrote:

> ... such a search is possible.

No, it isn't.  That's the whole point of the posting that began this thread.

E.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fafecdb$1@news.unimelb.edu.au>
···@jpl.nasa.gov (Erann Gat) writes:

>In article <··········@news.unimelb.edu.au>, Fergus Henderson
><···@cs.mu.oz.au> wrote:
>
>> ... such a search is possible.
>
>No, it isn't.  That's the whole point of the posting that began this thread.

It is possible to start such a search.  That's all my line of argument
required.  Whether it will produce anything useful in a reasonable amount
of time and space is another question.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-1011031234300001@k-137-79-50-101.jpl.nasa.gov>
In article <··········@news.unimelb.edu.au>, Fergus Henderson
<···@cs.mu.oz.au> wrote:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> >In article <··········@news.unimelb.edu.au>, Fergus Henderson
> ><···@cs.mu.oz.au> wrote:
> >
> >> ... such a search is possible.
> >
> >No, it isn't.  That's the whole point of the posting that began this thread.
> 
> It is possible to start such a search.  That's all my line of argument
> required.  Whether it will produce anything useful in a reasonable amount
> of time and space is another question.

Well, going back over the discussion I can't quite figure out what point
you were trying to make.  I will, however, observe that 1) conducting such
a search and simply running (loop) will produce the same result with
probability indistinguishable from 1, 2) merely "starting" a search could
mean running the first machine instruction in the search algorithm and
then suspending the process, and so "starting a search" is
indistinguishable from starting a whole slew of other computations that
just happen to begin with the same machine instruction, to say nothing of
3) the claim that such a search can be started (but not finished) hardly
seems like the sort of thing anyone would dispute, and so 4) whatever
point you were trying to make that relied on (starting) such a search
being "possible" seems likely to be either wrong or vacuous.

Er, what point were you trying to make?

E.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fb06a35$1@news.unimelb.edu.au>
···@jpl.nasa.gov (Erann Gat) writes:

>Er, what point were you trying to make?

I was arguing against Joe Marshall's claim that "a specification formal
enough to be worthy of the name *is* a program in and of itself and
thus neither more nor less amenable to analysis than the program that
attempts to implement it."

(Joe, are you still defending this claim?)

Joe Marshall's claim was based on the false idea that a specification
was either going to be "non-computable" [sic], and hence useless,
or equivalent to an algorithm, in the sense that one could derive
an algorithm from it by brute force generate-and-test.

Among many other problems, Joe Marshall seems to have overlooked the
possibility that this generate-and-test approach might fail because it
might never be possible to prove that a program is correct with respect
to the specification, but that the specification might nevertheless
still be useful because it is possible to prove some programs incorrect
w.r.t. the spec.

Of course there are more obvious problems with Joe Marshall's claim,
e.g. the fact that it ignores the possibility of specifications which
come with resource limits.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <brripu2w.fsf@comcast.net>
Fergus Henderson <···@cs.mu.oz.au> writes:

> ···@jpl.nasa.gov (Erann Gat) writes:
>
>>Er, what point were you trying to make?
>
> I was arguing against Joe Marshall's claim that "a specification formal
> enough to be worthy of the name *is* a program in and of itself and
> thus neither more nor less amenable to analysis than the program that
> attempts to implement it."
>
> (Joe, are you still defending this claim?)

It depends on the specification.  I hate to waffle on this because
universal statements are so much more fun on usenet, but I think that
the spec must have these properties:

  1.  The spec gives a mapping from programs to one of three
      values:  true, false, or _|_

  1a. There exists a program for which the spec says `true'.

  2.  Programs satisfying the spec have a non-empty problem domain.

  3.  If the spec says that A passes and the spec says that B passes,
      then there is no element in the problem domain for which program
      A produces a different value than program B.

> Joe Marshall's claim was based on the false idea that a specification
> was either going to be "non-computable" [sic], and hence useless,

This is 1a above.  A spec that rejects all programs, or is equivalent
to _|_ for all programs, isn't of much use.  (Although there may be
minor utility in showing that the spec has no realizable solution.)

By `non-computable', I meant that for all programs there is no
algorithm by which one can determine if a program meets the spec.
For example, `The set of all programs that compute the square-root of
two but that cannot be proven to do so.'

> or equivalent to an algorithm, in the sense that one could derive
> an algorithm from it by brute force generate-and-test.

I think that this is true for those specs that have the above
properties.

> Among many other problems, Joe Marshall seems to have overlooked the
> possibility that this generate-and-test approach might fail because it
> might never be possible to prove that a program is correct with respect
> to the specification, but that the specification might nevertheless
> still be useful because it is possible to prove some programs incorrect
> w.r.t. the spec.

Sure, a spec could be used to reject programs even if it cannot accept
them, but that's why I put in 1a.  Obviously the brute force approach
won't work if there can be no examples that satisfy the spec.  But
because programs are enumerable, the brute force approach will work if
there is at least one program that satisfies the spec.
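
In other words, something like this sketch in Python, which assumes an
enumeration of program texts, a spec that never diverges on any
candidate, and property 1a; all three are assumptions of the sketch:

  def first_satisfying(enumerate_programs, spec):
      # Plain generate-and-test.  It terminates exactly when some program
      # in the enumeration is accepted by the spec (property 1a); if the
      # spec can return _|_ on an earlier candidate, this naive loop
      # needs the dovetailing discussed elsewhere in the thread.
      for program in enumerate_programs():
          if spec(program):
              return program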

> Of course there are more obvious problems with Joe Marshall's claim,
> e.g. the fact that it ignores the possibility of specifications which
> come with resource limits.

I don't think that is an issue with the above limitations.  A program
that violates the spec with regard to resources is simply `false'.

If you put resource restrictions on the program, however, you should
be prepared for the possibility that no program satisfies it.  For
instance, you could specify that program X sorts N elements using
fewer than N/2 comparisons, but you shouldn't be surprised when you
cannot find any programs that satisfy it. 

-- 
~jrm
From: Ray Blaak
Subject: Re: Computability in principle
Date: 
Message-ID: <usmkt79ro.fsf@STRIPCAPStelus.net>
Joe Marshall <·············@comcast.net> writes:
>   3.  If the spec says that A passes and the spec says that B passes,
>       then there is no element in the problem domain for which program
>       A produces a different value than program B.

Different sorting algorithms could sort a little differently, be non-stable,
etc., but can still all be correct with regards to a general enough idea of
sorting.

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <k765gz89.fsf@ccs.neu.edu>
Ray Blaak <········@STRIPCAPStelus.net> writes:

> Joe Marshall <·············@comcast.net> writes:
>>   3.  If the spec says that A passes and the spec says that B passes,
>>       then there is no element in the problem domain for which program
>>       A produces a different value than program B.
>
> Different sorting algorithms could sort a little differently, be non-stable,
> etc., but can still all be correct with regards to a general enough idea of
> sorting.

Good point.  I think I can safely withdraw case 3, then.
From: Matthias Blume
Subject: Re: Computability in principle
Date: 
Message-ID: <m1llqoyo1m.fsf@tti5.uchicago.edu>
Fergus Henderson <···@cs.mu.oz.au> writes:

> ···@jpl.nasa.gov (Erann Gat) writes:
> 
> >In article <··········@news.unimelb.edu.au>, Fergus Henderson
> ><···@cs.mu.oz.au> wrote:
> >
> >> ... such a search is possible.
> >
> >No, it isn't.  That's the whole point of the posting that began this thread.
> 
> It is possible to start such a search.  That's all my line of argument
> required.  Whether it will produce anything useful in a reasonable amount
> of time and space is another question.

Not only can you start the search, you know that if there is at least
one solution with an associated proof for being a solution, then the
search will eventually turn up a solution and an associated proof.

Of course, doing so will rarely be "practical".
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <ekwghwsg.fsf@ccs.neu.edu>
Fergus Henderson <···@cs.mu.oz.au> writes:

>>Or going the other way around:  if a formal correctness statement
>>generally doesn't give you a meaningful decision process for
>>determining whether the result of a particular computation is correct
>>or not, what the hell is it good for?
>
> It doesn't give you a decision process for determining whether the
> result of _any_ computation is correct.  But it is useful because for
> _some_ computations, you can indeed determine whether or not the result
> is correct.

My point exactly.  I apologize for making it sound as if the formal
spec was able to validate (or reject) *every* possible program given
to it.  I was attempting to point out that useful formal specs
generally validate (or reject) *some* programs.

> OK.  It's actually a lot more complicated than that, because it's really
> the spec _plus a proof_ which says yea or nay, and the search for proofs
> is infinite.  You can't just enumerate programs, otherwise as soon
> as you get to the first program for which the spec says neither yea nor
> nay, the search for a proof will loop indefinitely.  To make it actually
> work, you have to interleave the search for programs with the search for
> proofs.
>
> But nevertheless, with that modification, such a search is possible.

Yes.  This is what I was trying to get at with my crude attempts at
exposition.  (Although I wasn't enumerating *programs*, but rather
program *results*.)

> So suppose you enumerate until the spec (+ proof) says "nay".   What then?

Depends on what you mean.  Either you have found a random program that
provably does *not* implement the spec, or you have found a proof that
your spec is unrealizable.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faff427$1@news.unimelb.edu.au>
Joe Marshall <···@ccs.neu.edu> writes:

>Fergus Henderson <···@cs.mu.oz.au> writes:
>
>> So suppose you enumerate until the spec (+ proof) says "nay".   What then?
>
>Depends on what you mean.  Either you have found a random program that
>provably does *not* implement the spec, or you have found a proof that
>your spec is unrealizable.

I mean the former.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3faf87ef$1@news.unimelb.edu.au>
Joe Marshall <···@ccs.neu.edu> writes:

>Matthias is correct that where I was going was to argue that a formal
>enough specification *is* an algorithm in all but name.  (And let me
>qualify that:  If the formal spec is wrong, then algorithms derived
>from it are wrong.

Not necessarily.  For example, the bug in the formal spec could be that
it is not sufficiently complete (it does not include all of the real
requirements).  But such a bug would not preclude the existence of
programs which meet both the incomplete formal spec and the real requirements.

>If the formal spec is undecidable, then algorithms derived from it are
>undecidable.

Do you mean "If the formal spec is undecidable, then ALL algorithms
conforming to it are undecidable"?  I don't think that follows.

But you'd better precisely define what you mean by "undecidable specification"
and "undecidable algorithm", because the standard terminology only defines
the meaning of "undecidable problem".

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <7k26ptii.fsf@comcast.net>
Fergus Henderson <···@cs.mu.oz.au> writes:

> Joe Marshall <···@ccs.neu.edu> writes:
>
>>If the formal spec is undecidable, then algorithms derived from it are
>>undecidable.
>
> Do you mean "If the formal spec is undecidable, then ALL algorithms
> conforming to it are undecidable"?  I don't think that follows.
>
> But you'd better precisely define what you mean by "undecidable specification"
> and "undecidable algorithm", because the standard terminology only defines
> the meaning of "undecidable problem".

By `undecidable specification' I mean that there is no algorithm by
which one can determine, for an arbitrary program, whether that program
meets the specification.

By `undecidable algorithm' I meant that the algorithm is equivalent to
_|_ for all inputs.


> Do you mean "If the formal spec is undecidable, then ALL algorithms
> conforming to it are undecidable"?  I don't think that follows.

I think I phrased that poorly.  You are right that what I said doesn't
make sense.  How about this:

If there is no algorithm by which one can determine whether a program
meets the specification, then there is no algorithm by which one could
synthesize a valid program from the specification.


-- 
~jrm
From: Ray Blaak
Subject: Re: Computability in principle
Date: 
Message-ID: <u1xslpem0.fsf@STRIPCAPStelus.net>
Matthias <··@spam.pls> writes:
> Is your argument that we shall not use enumeration in any thought
> experiments or proofs because it is impractical?  Then we probably
> should also refrain from elementary geometry (because of quantum
> effects there is no such thing as a /circle/ or a /straight line/ in
> the real world) and from math in general.

There is a difference. Enumeration is *in principle* impractical because it is
computationally inefficient. Adding a single bit of description doubles the
number of states you need to enumerate. That is, the number of states grows
exponentially with the number of bits.
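A trivial illustration of that doubling, in Python:

  from itertools import product

  # Every description of exactly n bits; the count, 2**n, doubles each
  # time n grows by one bit, which is what kills exhaustive enumeration.
  def descriptions(n):
      return (''.join(bits) for bits in product('01', repeat=n))

  # sum(1 for _ in descriptions(10))  => 1024
  # sum(1 for _ in descriptions(11))  => 2048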

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: Fergus Henderson
Subject: Re: Computability in principle
Date: 
Message-ID: <3fa8c2c2$1@news.unimelb.edu.au>
Jerzy Karczmarczuk <·······@info.unicaen.fr> writes:

>Fergus noted that the world is quantum, and Erann Gatt summarized that he will
>wait until quantum computers appear.

Actually, I was going to point that out, but to give credit where credit
is due, Markus Mottl got in before me.

-- 
Fergus Henderson <···@cs.mu.oz.au>  |  "I have always known that the pursuit
The University of Melbourne         |  of excellence is a lethal habit"
WWW: <http://www.cs.mu.oz.au/~fjh>  |     -- the last words of T. S. Garp.
From: Pascal Bourguignon
Subject: Re: Computability in principle
Date: 
Message-ID: <873cd3p28r.fsf@thalassa.informatimago.com>
Jerzy Karczmarczuk <·······@info.unicaen.fr> writes:

> Ray Blaak wrote:
> > ···@jpl.nasa.gov (Erann Gat) writes:
> >
> >>There are about 10^80 particles in the known Universe.
> >>The universe is about 10^18 seconds old.
> >>The Planck time, the time that it takes light to travel the diameter of a
> >>proton, believed to be the smallest possible time scale on which physical
> >>processes can occur, is 10^-43 seconds.
> >>
> >>So if you turned the entire universe into a computer and had it do nothing
> >>but count it could enumerate about 10^141 = 2^468 states.
> >
> > Well, the universe is not dead yet. If you are *really* willing to
> > wait you could count as much as you need to.
> 
> And, anyway all this is *rubbish*. Perhaps Erann Gatt should read
> something on cosmology. The "known universe" is not the full
> universe, the universe is -- in belief of almost all astrophysicists
> and "fundamental" physicists -- OPEN.  Infinite, if you wish. Our
> "Hubble sector" might be finite *for local obser- vations*, but it
> doesn't matter.

Not on the latest "news". It looks like it's a big dodecahedron.

http://www.spacedaily.com/2003/031008190026.x1vw15hj.html


> [...]
> Fergus noted that the world is quantum, and Erann Gatt summarized
> that he will wait until quantum computers appear. This is
> methodologically a position impossible to defend. The
> "computability in principle" deals with the number of quantum states
> independently of any "realistic" model of quantum devices, the
> Turing machine "in general" does not exist either... And, perhaps
> you should be aware that we don't really know how to count quantum
> states in the universe.  In a curved, potentially infinite space,
> the spectra may be whatever, so all this counting has a very, very
> weak sense.

-- 
__Pascal_Bourguignon__
http://www.informatimago.com/
From: Markus Trippelsdorf
Subject: Re: Computability in principle
Date: 
Message-ID: <1qti754jpwp6b.dlg@trippelsdorf.net>
On 05 Nov 2003 10:56:52 +0100, Pascal Bourguignon wrote:

> Jerzy Karczmarczuk <·······@info.unicaen.fr> writes:
> 
>> Ray Blaak wrote:
>>> ···@jpl.nasa.gov (Erann Gat) writes:
>>>
>>>>There are about 10^80 particles in the known Universe.
>>>>The universe is about 10^18 seconds old.
>>>>The Planck time, the time that it takes light to travel the diameter of a
>>>>proton, believed to be the smallest possible time scale on which physical
>>>>processes can occur, is 10^-43 seconds.
>>>>
>>>>So if you turned the entire universe into a computer and had it do nothing
>>>>but count it could enumerate about 10^141 = 2^468 states.
>>>
>>> Well, the universe is not dead yet. If you are *really* willing to
>>> wait you could count as much as you need to.
>> 
>> And, anyway all this is *rubbish*. Perhaps Erann Gatt should read
>> something on cosmology. The "known universe" is not the full
>> universe, the universe is -- in belief of almost all astrophysicists
>> and "fundamental" physicists -- OPEN.  Infinite, if you wish. Our
>> "Hubble sector" might be finite *for local obser- vations*, but it
>> doesn't matter.
> 
> Not on the latest "news". It looks like it's a big dodecahedron.
> 
> http://www.spacedaily.com/2003/031008190026.x1vw15hj.html
> 

That is already ruled out, see:

http://lanl.arxiv.org/abs/astro-ph/0310233

Markus
From: Lord Isildur
Subject: Re: Computability in principle
Date: 
Message-ID: <Pine.LNX.4.58-035.0311051509230.16998@unix44.andrew.cmu.edu>
um, if you turned the entire universe into a computer, you would have an
extremely fast and special-purpose machine whose sole capability would be
to simulate the entire universe in real time.

isildur


> >>>>So if you turned the entire universe into a computer and had it do nothing
> >>>>but count it could enumerate about 10^141 = 2^468 states.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <k76er0xe.fsf@ccs.neu.edu>
Lord Isildur <·······@andrew.cmu.edu> writes:

> um, if you turned the entire universe into a computer, you would have an
> extremely fast and special-purpose machine whose sole capability would be
> to simulate the entire universe in real time.

It's been done.
From: Matthias Blume
Subject: Re: Computability in principle
Date: 
Message-ID: <m1wuae4fsj.fsf@tti5.uchicago.edu>
Joe Marshall <···@ccs.neu.edu> writes:

> Lord Isildur <·······@andrew.cmu.edu> writes:
> 
> > um, if you turned the entire universe into a computer, you would have an
> > extremely fast and special-purpose machine whose sole capability would be
> > to simulate the entire universe in real time.
> 
> It's been done.

How can you be sure?  Maybe what we know as our universe is just the
idle loop of some much, much bigger computer?
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0511031503050001@k-137-79-50-101.jpl.nasa.gov>
In article <··············@tti5.uchicago.edu>, Matthias Blume
<····@my.address.elsewhere> wrote:

> Joe Marshall <···@ccs.neu.edu> writes:
> 
> > Lord Isildur <·······@andrew.cmu.edu> writes:
> > 
> > > um, if you turned the entire universe into a computer, you would have an
> > > extremely fast and special-purpose machine whose sole capability would be
> > > to simulate the entire universe in real time.
> > 
> > It's been done.
> 
> How can you be sure?  Maybe what we know as our universe is just the
> idle loop of some much, much bigger computer?

There's actually some evidence that this is in fact the case.  See
http://www.flownet.com/gat/papers/QM.pdf, section 5 and the associated
reference.

E.
From: Joachim Durchholz
Subject: Re: Computability in principle
Date: 
Message-ID: <bodup5$ukp$1@news.oberberg.net>
Erann Gat wrote:
> There's actually some evidence that this is in fact the case.  See
> http://www.flownet.com/gat/papers/QM.pdf, section 5 and the associated
> reference.

The real URL is http://www.flownet.com/gat/QM.pdf .
(I can't judge the contents of the paper, I don't know enough about 
quantum mechanics for that.)

Regards,
Jo
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611031031480001@k-137-79-50-101.jpl.nasa.gov>
In article <············@news.oberberg.net>, Joachim Durchholz
<·················@web.de> wrote:

> Erann Gat wrote:
> > There's actually some evidence that this is in fact the case.  See
> > http://www.flownet.com/gat/papers/QM.pdf, section 5 and the associated
> > reference.
> 
> The real URL is http://www.flownet.com/gat/QM.pdf .

Doh!

> (I can't judge the contents of the paper, I don't know enough about 
> quantum mechanics for that.)

Actually, the paper was written for an educated lay audience, so you might
want to give it a shot even if you don't think you know much about QM. 
(Don't dive right in to section 5 though -- start at the beginning.)

E.
From: Lord Isildur
Subject: Re: Computability in principle
Date: 
Message-ID: <Pine.LNX.4.58-035.0311061332270.6829@unix44.andrew.cmu.edu>
the basic gist of this seems to be (i just started looking at it) that
the weird causality debates which people keep introducing into the scene
really have no place in QM or any other physics... which is a very
reasonable thing to claim. it is a remnant of a very western, egocentric
idea that has some very widely spread roots in our culture... things do
not happen because we observe them, nor is our measurement relevant.
'measure' is a poor term for 'interact with'.
in, for example, a scenario of correlated particles, collapsing one of
them will not instantaneously force the other to behave accordingly.
instead, it tells you something about the other one which existed anyway,
but until you know one half, you can't know the other half. the only
thing which was instantaneous was the inference, nothing physical.
moreover, diddling with one of them will allow you to tell what would
happen if you diddled identically with the other.. it does not make the
other somehow spontaneously diddled with. since these experiments almost
as a rule tend to ignore this, a large portion of the community and
society at large keeps on imagining some bizarre metaphysics.. partly, i
think, because they like to believe it. it's enigmatic and nifty, if
altogether unrealistic, and seriously, people have already shown that
they are capable of believing the most ridiculous stuff imaginable
sometimes.

(i hope that paper doesn't turn out to be a total ridiculosity, now
that i've gone and effectively sided with it :)

Isildur


On Thu, 6 Nov 2003, Joachim Durchholz wrote:

> Erann Gat wrote:
> > There's actually some evidence that this is in fact the case.  See
> > http://www.flownet.com/gat/papers/QM.pdf, section 5 and the associated
> > reference.
>
> The real URL is http://www.flownet.com/gat/QM.pdf .
> (I can't judge the contents of the paper, I don't know enough about
> quantum mechanics for that.)
>
> Regards,
> Jo
>
>
From: Garry Hodgson
Subject: Re: Re: Computability in principle
Date: 
Message-ID: <2003110517551068072906@k2.sage.att.com>
Matthias Blume <····@my.address.elsewhere> wrote:

> Joe Marshall <···@ccs.neu.edu> writes:
> 
> > Lord Isildur <·······@andrew.cmu.edu> writes:
> > 
> > > um, if you turned the entire universe into a computer, you would have an
> > > extremely fast and special-purpose machine whose sole capability would be
> > > to simulate the entire universe in real time.
> > 
> > It's been done.
> 
> How can you be sure?  Maybe what we know as our universe is just the
> idle loop of some much, much bigger computer?

or maybe we're the screensaver.

----
Garry Hodgson, Technology Consultant, AT&T Labs

Be happy for this moment.
This moment is your life.
From: Espen Vestre
Subject: Re: Computability in principle
Date: 
Message-ID: <kwy8uv3s89.fsf@merced.netfonds.no>
Jerzy Karczmarczuk <·······@info.unicaen.fr> writes:

> Ray Blaak wrote:
> > ···@jpl.nasa.gov (Erann Gat) writes:
> >
> >>There are about 10^80 particles in the known Universe.
> >>The universe is about 10^18 seconds old.
> >>The Planck time, the time that it takes light to travel the diameter of a
> >>proton, believed to be the smallest possible time scale on which physical
> >>processes can occur, is 10^-43 seconds.
> >>
> >>So if you turned the entire universe into a computer and had it do nothing
> >>but count it could enumerate about 10^141 = 2^468 states.
> > Well, the universe is not dead yet. If you are *really* willing to
> > wait you could count as much as you need to.
> 
> And, anyway all this is *rubbish*. 

Isn't that a bit too harsh? Or are you suggesting that all the
attempts to find a bit-measure of the entropy in the (known part of)
the universe are also a piece of crap?

I was very fascinated when I read the stuff on the "Holographic
Universe" in Scientific American

(http://www.sciam.com/article.cfm?articleID=000AF072-4891-1F0A-97AE80A84189EEDF)

but for someone with only a couple of beginner's astronomy courses from
university, it's of course very hard to keep track of all the weird
(but fascinating!) stuff that goes on in contemporary cosmology.

Anyway, when reading "the brain is a computer" or "everything is a
computer" or "everything is bits and bytes" type of claims, it's good
advice to remember that such claims are more often "a sign of the
times" than anything else. E.g. when statistical methods were getting
a foothold and variance analysis was established, psychologists were
so excited they started to claim that the brain was doing nothing but
variance analysis :-)
-- 
  (espen)
From: Tak To
Subject: Re: Computability in principle
Date: 
Message-ID: <7aydnbokD6CQ1DSiRVn-ig@comcast.com>
Espen Vestre wrote:
> Anyway, when reading "the brain is a computer" or "everything
> is a computer" or "everything is bits and bytes" type of
> claims, it's good advice to remember that such claims are more
> often "a sign of the times" than anything else.

It could be.  However, a fair number of people believe in the
equivalence of the brain and a computer based on the observation
(somewhat simplified here) that all known finitely defined
formalisms have equivalent computing power -- a.k.a. Church's
Thesis.  Note that _is_finitely_defined_ is not the same as
_has_finite_number_of_states_; thus even if the brain can hold
infinitely many states, it still can be finitely defined.

Tak
--
----------------------------------------------------------------------
Tak To                                            ·····@alum.mit.eduxx
--------------------------------------------------------------------^^
  [taode takto ~{LU5B~}]      NB: trim the xx to get my real email addr
From: Joachim Durchholz
Subject: Re: Computability in principle
Date: 
Message-ID: <bob8l3$pfh$1@news.oberberg.net>
Espen Vestre wrote:

> I was very fascinated when I read the stuff on the "Holographic
> Universe" in Scientific American

The German version of SA carried a German translation.
I wasn't impressed - some hard results, but too much speculative 
content. I'm pretty sure that holograms will be the next New Age wave 
after that article... and I agree with Jerzy that there's already too 
much pseudophilosophical rubbish being aired; articles that highlight 
the speculative instead of the established just further such tendencies.

Just my 2c.

Jo
From: John Thingstad
Subject: Re: Computability in principle
Date: 
Message-ID: <opryhofrwhxfnb1n@news.chello.no>
On 05 Nov 2003 13:38:46 +0100, Espen Vestre 
<·····@*do-not-spam-me*.vestre.net> wrote:

> Jerzy Karczmarczuk <·······@info.unicaen.fr> writes:
>
>> Ray Blaak wrote:
>> > ···@jpl.nasa.gov (Erann Gat) writes:
>> >
>> >>There are about 10^80 particles in the known Universe.
>> >>The universe is about 10^18 seconds old.
>> >>The Planck time, the time that it takes light to travel the diameter of a
>> >>proton, believed to be the smallest possible time scale on which physical
>> >>processes can occur, is 10^-43 seconds.
>> >>
>> >>So if you turned the entire universe into a computer and had it do nothing
>> >>but count it could enumerate about 10^141 = 2^468 states.
>> > Well, the universe is not dead yet. If you are *really* willing to
>> > wait you could count as much as you need to.
>>
>> And, anyway all this is *rubbish*.
>
> Isn't that a bit too harsh? Or are you suggesting that all the
> attempts to find a bit-measure of the entropy in the (known part of)
> the universe is also a piece of crap?
>
> I was very fascinated when I read the stuff on the "Holographic
> Universe" in Scientific American
>
> (http://www.sciam.com/article.cfm?articleID=000AF072-4891-1F0A-97AE80A84189EEDF)
>
> but as someone with only a couple of beginner's astronomy courses from
> university, it's of course very hard to keep track of all the weird
> (but fascinating!) stuff that goes on in contemporary Cosmology.
>
> Anyway, when reading "the brain is a computer" or "everything is a
> computer" or "everything is bits and bytes" type of claims, it's good
> advice to remember that such claims is more often "a sign of the
> times" than anything else. E.g. when statistical methods were getting
> a foothold and variance analysis was established, psychologists were
> so excited they started to claim that the brain was doing nothing but
> variance analysis :-)

the brain is nothing but a fractal tree of integrate-and-fire 
pulse-coupled oscillators!

-- 
Using M2, Opera's revolutionary e-mail client: http://www.opera.com/m2/
From: Nick Name
Subject: Re: Computability in principle
Date: 
Message-ID: <4K1qb.415545$R32.13789137@news2.tin.it>
Erann Gat wrote:

> 
> Many formalist arguments rely on exhaustive enumeration of programs.
> Prunesquallors "universal algorithm" is one such example.��This
> algorithm exhaustively enumerates all programs and mechanically checks
> to see if they satisfy a formal specification.��The�argument�has
> centered�on�the quesiton of whether checking against a specification
> is really feasable on theoretical grounds.��My�position�has�been�that
> it�is�not

I have not read the other thread but how would such an algorithm deal
with nonterminating programs? I mean: how do you mechanically check a
program to see if it satisfies a formal specification, given the
undecidability of this problem? Sorry if I misunderstood this position.

V.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <oevqr5g0.fsf@ccs.neu.edu>
···@jpl.nasa.gov (Erann Gat) writes:

> The "Python from Wise Guy's Viewpoint" thread has had an extraordinarily
> good run, but the sub-thread on formalisms has gotten far enough removed
> from the subject line that I've decided to start a new thread and broach a
> new but related topic.
>
> Many formalist arguments rely on exhaustive enumeration of programs. 
> Prunesquallors "universal algorithm" is one such example.  This algorithm
> exhaustively enumerates all programs and mechanically checks to see if
> they satisfy a formal specification.  The argument has centered on the
> quesiton of whether checking against a specification is really feasable on
> theoretical grounds.  My position has been that it is not, that a
> consequence of the Goedel/Turing theorem(s) there are formal
> specifications (like halting) that are not computable, and that therefore
> any language provided for writing specifications must be either
> uncomputable or incomplete.  I would have thought this to be a completely
> uncontroversial point, but on usenet you never know.

I hesitate to continue because I'm getting very frustrated in my
attempts to even find a common definition of `formal'.  As it is, I
agree completely that any language for writing specifications must
necessarily be uncomputable, incomplete, or weak.  In fact, I'm
probably in the minority in arguing that it is *so* uncomputable,
incomplete, or weak as to be practically useless.

(The basis for the argument is that a specification formal enough
to be worthy of the name *is* a program in and of itself and thus
neither more nor less amenable to analysis than the program that
attempts to implement it.)

[argument based on physical constraints snipped]

Practicality?  On Usenet? 
From: Pascal Costanza
Subject: Re: Computability in principle
Date: 
Message-ID: <boe7jo$7ln$1@newsreader2.netcologne.de>
Joe Marshall wrote:

> I hesitate to continue because I'm getting very frustrated in my
> attempts to even find a common definition of `formal'. 

http://www.calculemus.org/lect/L-I-MNS/12/ekon-i-modele/smith-foundtns.pdf 
is an interesting read in this regard.


Pascal
From: William D Clinger
Subject: Re: Computability in principle
Date: 
Message-ID: <fb74251e.0311061745.4c5fc2a7@posting.google.com>
···@jpl.nasa.gov (Erann Gat) wrote:
> So it is impossible even in principle given the physical constraints of
> our universe to enumerate programs larger than about sixty bytes, or
> shorter than a single line of text on a standard 80-character wide
> display.

Ah, but this presumes that programs are represented as character strings,
which we know to be a space-inefficient and very un-Lispy encoding.  Even
a naive encoding of programs as abstract syntax trees is about 10 times
as space-efficient as a string of ISO 8859-1 characters, and would allow
the universal computer to enumerate a correspondingly larger set of
programs before its resources are exhausted [1,2].
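One crude way to get a feel for that kind of comparison (a sketch only,
using Python's own parser rather than anything from Clinger's Quirk20
measurement, and with an arbitrary per-node bit budget):

  import ast

  # Compare 8 bits per character of source text against a flat budget of
  # bits_per_node for every node of the parsed tree.  Whether the tree
  # side wins depends entirely on how node kinds and identifiers are
  # actually coded; this just produces the two raw counts.
  def naive_size_comparison(source, bits_per_node=16):
      string_bits = 8 * len(source)
      tree_bits = bits_per_node * sum(1 for _ in ast.walk(ast.parse(source)))
      return string_bits, tree_bits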

If the more space-efficient encoding is not enough to enumerate all the
programs you care about, then you could always put out an RFP to acquire
additional universes.

I confess I do not know the bureaucratic details of that process myself.
Indeed I do not know how to obtain permission to reprogram the immediate
universe, which (as Lord Isildur and Joe Marshall have observed) appears
to be fully engaged in a different computation altogether, that of
simulating the universe.  With truly remarkable accuracy, if you ask me.

Will

[1]  You know, the ISO 8859-1 standard.

[2]  William D Clinger, private correspondence.  Clinger obtained this
measurement by modifying a small (9000 SLOC) compiler for Quirk20 [3]
and using the modified compiler to compute an upper bound for the number
of bits required to encode the abstract syntax trees for a set of 20
small test programs.

[3]  Quirk20 [4] is a subset of Java extended with block structure and
first class procedures.

[4]  http://www.ccs.neu.edu/course/com1355/quirk20.html
From: Ray Blaak
Subject: Re: Computability in principle
Date: 
Message-ID: <ur80kpv35.fsf@STRIPCAPStelus.net>
··········@verizon.net (William D Clinger) writes:
> [3]  Quirk20 [4] is a subset of Java extended with block structure and
> first class procedures.
> 
> [4]  http://www.ccs.neu.edu/course/com1355/quirk20.html

Comments:

01. You can have constant and global primitives, but not objects? Seems like a
strange restriction. Why not allow global object fields at least?

02. Functions are: proc <rtype> <id> ( <formals> ) <block> 
But "proc" reads like "procedure", and yet there is a return type. I suggest
"func" instead.

03. There doesn't seem to be a way to name function types. This causes a
    maintenance problem. Consider:

  proc void foo(function (from (int) to int) f) {...}
  proc void bar(function (from (int) to int) f) {...}
  ...etc...

And now you realize you forgot a parameter and have to update all
usage locations. I suggest allowing a named function type declaration
notation that furthermore corresponds better to the usual method notation:

  functype int F (int x, int y); // Declares F as a function type

  func void foo(F f) {...}
  func void bar(F f) {...}
  func List apply(F f, List inputs) {...}

-- 
Cheers,                                        The Rhythm is around me,
                                               The Rhythm has control.
Ray Blaak                                      The Rhythm is inside me,
········@STRIPCAPStelus.net                    The Rhythm has my soul.
From: William D Clinger
Subject: Re: Computability in principle
Date: 
Message-ID: <fb74251e.0311071644.6ad17d99@posting.google.com>
Ray Blaak's comments were quite reasonable, but Quirk20 isn't.

It is the 20th language I have designed to serve as the source
language for a term project in compiler construction.  As such,
I want it to have enough weird quirks to be representative of
real languages, but simple enough for students to complete a
compiler for it in one term---ten weeks in this case.  Several
of the suggestions that Ray made were adopted in QuirkN, for
certain values of N less than 20.  To discourage students from
trying to reuse a previous year's compiler, the language changes
from year to year in completely arbitrary ways.

Quirk20 is suitable for measuring the compression achieved by
naive encodings of abstract syntax trees, but I wouldn't want
to program in it.

Will
From: Marcin 'Qrczak' Kowalczyk
Subject: Re: Computability in principle
Date: 
Message-ID: <pan.2003.11.07.10.12.41.647305@knm.org.pl>
On Thu, 06 Nov 2003 17:45:19 -0800, William D Clinger wrote:

> Ah, but this presumes that programs are represented as character strings,
> which we know to be a space-inefficient and very un-Lispy encoding.  Even
> a naive encoding of programs as abstract syntax trees is about 10 times
> as space-efficient as a string of ISO 8859-1 characters,

It's not true, except perhaps with lots of repeated very long symbols.
Otherwise why would Chicken have a threshold for compressing large quoted
values into strings to be read at runtime?

-- 
   __("<         Marcin Kowalczyk
   \__/       ······@knm.org.pl
    ^^     http://qrnik.knm.org.pl/~qrczak/
From: William D Clinger
Subject: Re: Computability in principle
Date: 
Message-ID: <fb74251e.0311071701.7ae84609@posting.google.com>
Marcin 'Qrczak' Kowalczyk <······@knm.org.pl> wrote in message news:<······························@knm.org.pl>...
> > a naive encoding of programs as abstract syntax trees is about 10 times
> > as space-efficient as a string of ISO 8859-1 characters,
> 
> It's not true, except perhaps with lots of repeated very long symbols.
> Otherwise why would Chicken have a threshold for compressing large quoted
> values into strings to be read at runtime?

Let's examine the structure of this fallacy:  Implementation A
uses algorithm B to solve problem C, so algorithm B must be the
best algorithm for solving problem D.

It is reasonable for an implementation to use string compression
algorithms when compressing strings.  Quoted structures can be
sort of stringlike, so I can understand why string compression
algorithms might be useful when compressing quoted values.  That
does not imply that string compression is the best way to compress
entire programs.

As cited in my post, the factor of 10 came from actual measurements
using a rather naive encoding.

Will
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <k76cymww.fsf@ccs.neu.edu>
··········@verizon.net (William D Clinger) writes:

> If the more space-efficient encoding is not enough to enumerate all the
> programs you care about, then you could always put out an RFP to acquire
> additional universes.

RFP = Request For Physics?

> I confess I do not know the bureaucratic details of that process myself.
> Indeed I do not know how to obtain permission to reprogram the immediate
> universe, which (as Lord Isildur and Joe Marshall have observed) appears
> to be fully engaged in a different computation altogether, that of
> simulating the universe.  With truly remarkable accuracy, if you ask me.

It isn't too bad, but it does seem to have a few noticeable resource
issues:  if you are in the vicinity of a large enough number of
particles, the clock rate seems to slow down.
From: Lupo LeBoucher
Subject: Re: Computability in principle
Date: 
Message-ID: <ZIudnZjnT7cuZzeiRVn-hg@io.com>
Posted and CC:ed

In article <····················@k-137-79-50-101.jpl.nasa.gov>,
Erann Gat <···@jpl.nasa.gov> wrote:
>
>There are about 10^80 particles in the known Universe.
>The universe is about 10^18 seconds old.
>The Planck time, the time that it takes light to travel the diameter of a
>proton, believed to be the smallest possible time scale on which physical
>processes can occur, is 10^-43 seconds.
>
>So if you turned the entire universe into a computer and had it do nothing
>but count it could enumerate about 10^141 = 2^468 states.

In fact, it is a little smaller than this, at least according to Seth 
Lloyd. I thought he lost his marbles or was making a joke when I first 
read this in Physical Review Letters, but it is sort of important for the 
reasons you mention:
http://focus.aps.org/story/v9/st27

>All programs of practical interest are vastly larger than 60 bytes.  I
>submit therefore that any formalist program (I use the word here in the
>sense of "Hilbert's program", not a computer program) based on exhaustive
>enumeration of programs (in the computer-program sense this time) has no
>hope of having any practical utility, and is therefore not worthy of
>serious consideration.

IMO, the entire field of computing is about to get a serious humbling. 
Really, this should have happened 20 years ago, when computer scientists 
failed to produce anything resembling what the previous generation were 
promising the world (strong AI, good character/voice/image recognition, 
reliable software, self-programming computers, machine language 
translation blah blah blah), but there is a more fundamental issue.

For the last decade or so, people have begun to notice that analog 
computers (like the one between your ears, unless you believe in whatever 
Penrose does) have absurdly larger computational capabilities than 
digital/Turing ones do.

For references, do a websearch on 
"Bournez, Cosnard, Super-Turing"
"Christopher Moore, Dynamical Systems" 
"Hava Siegelmann, neural" and
"Smale, Shub, Blum, NP-completeness"

This sort of thing is far from done yet, but it does appear that there is 
something new under the sun. And it is tied in with the properties of real 
numbers versus integers.

Quantum computing is potentially powerful stuff too (assuming you can 
build one), but in a way, one can look at it as being "analog" in the 
sense that the 1's and 0's are actually complex numbers in the Hilbert 
space. If this were not true, all the stuff about entanglement would fade 
in importance.

ObLisp: I am actually learning Lisp to give me a tool to think about this 
shit more effectively.

-Lupo
"You see, wire telegraph is a kind of a very, very long cat. You pull his
tail in New York and his head is meowing in Los Angeles. Do you understand
this? And radio operates exactly the same way: you send signals here, they
receive them there. The only difference is that there is no cat." -Albert
Einstein explains the radio                         <··@fnord.io.com>
From: Erann Gat
Subject: Re: Computability in principle
Date: 
Message-ID: <gat-0611031819440001@192.168.1.51>
In article <······················@io.com>, ··@io.com (Lupo LeBoucher) wrote:

> Posted and CC:ed
> 
> In article <····················@k-137-79-50-101.jpl.nasa.gov>,
> Erann Gat <···@jpl.nasa.gov> wrote:
> >
> >There are about 10^80 particles in the known Universe.
> >The universe is about 10^18 seconds old.
> >The Planck time, the time that it takes light to travel the diameter of a
> >proton, believed to be the smallest possible time scale on which physical
> >processes can occur, is 10^-43 seconds.
> >
> >So if you turned the entire universe into a computer and had it do nothing
> >but count it could enumerate about 10^141 = 2^468 states.
> 
> In fact, it is a little smaller than this, at least according to Seth 
> Lloyd. I thought he lost his marbles or was making a joke when I first 
> read this in Physical Review Letters, but it is sort of important for the 
> reasons you mention:
> http://focus.aps.org/story/v9/st27

Interesting.  He uses a completely different methodology but comes up with
an answer that's within twenty orders of magnitude, which is not that
much of an error when dealing with numbers of this magnitude.  (If you
really want to blow your mind with big numbers see
http://home.earthlink.net/~mrob/pub/math/largenum.html)

E.
From: Joe Marshall
Subject: Re: Computability in principle
Date: 
Message-ID: <fzh0ymib.fsf@ccs.neu.edu>
··@io.com (Lupo LeBoucher) writes:

> For the last decade or so, people have begun to notice that analog 
> computers (like the one between your ears, unless you believe in whatever 
> Penrose does) have absurdly larger computational capabilities than 
> digital/Turing ones do.

Yes.  The digital abstraction tosses out an enormous amount of phase space
and uses it to increase the Hamming distance between what's left.