From: Xah Lee
Subject: Mathematica 7 compares to other languages
Date: 
Message-ID: <13d5dff7-9aac-4385-91ee-d6e58177497f@i24g2000prf.googlegroups.com>
Wolfram Research's Mathematica Version 7 has just been released.

See:
http://www.wolfram.com/products/mathematica/index.html

Among its marketing material, it has a section on how Mathematica
compares to competitors.
http://www.wolfram.com/products/mathematica/analysis/

And on this page, there are sections where Mathematica is compared to
programing langs, such as C, C++, Java, and research langs Lisp,
ML, ..., and scripting langs Python, Perl, Ruby...

See:
http://www.wolfram.com/products/mathematica/analysis/content/ProgrammingLanguages.html
http://www.wolfram.com/products/mathematica/analysis/content/ResearchLanguages.html
http://www.wolfram.com/products/mathematica/analysis/content/ScriptingLanguages.html

Note: I'm not affiliated with Wolfram Research Inc.

  Xah
∑ http://xahlee.org/

☄

From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <c4421778-9cba-47a7-be4d-575969b8dc99@i24g2000prf.googlegroups.com>
On Nov 30, 7:30 pm, Xah Lee <······@gmail.com> wrote:
> Wolfram Research's Mathematica Version 7 has just been released.
>
> See: http://www.wolfram.com/products/mathematica/index.html
>
> Among its marketing material, it has a section on how Mathematica
> compares to competitors. http://www.wolfram.com/products/mathematica/analysis/

Stephen Wolfram has a blog entry about Mathematica 7. Quite amazing:

http://blog.wolfram.com/2008/11/18/surprise-mathematica-70-released-today/

Mathematica today in comparison to all other existing langs, can be
perhaps compared to how lisp was to other langs in the say 1980s:
Quite far beyond all.

Seeing how lispers today still talking about how to do basic list
processing with its unusable cons, and how they get giddy with 1980's
macros (as opposed to full term rewriting), and still lack pattern
matching, one feels kinda sad.

see also:

• Fundamental Problems of Lisp
  http://xahlee.org/UnixResource_dir/writ/lisp_problems.html

  Xah
∑ http://xahlee.org/

☄
From: budden
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <a567b505-3f88-49d1-95b8-d1b80de4ab5b@z1g2000yqn.googlegroups.com>
Mathematica is a great language, but:
1. It is too slow.
2. It is often hard to read.
3. It gives meaning to every keystroke. You press Escape by accident and
it goes into the code as a new symbol, with no error. Nasty.
4. I know the 5th version. It does not allow tracking the source the way
SLIME does. This feature is absolutely necessary for serious development.

So, in fact, Mathematica does not scale well IMO.
From: ··············@lycos.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <673ca33e-9c1e-4c8e-a81d-d46828fe47b1@n10g2000yqm.googlegroups.com>
Mathematica has some powerful symbolic processing capabilities, for
example the integrals, etc. It also contains many powerful algorithms,
often written in few lines of code. And its graphic capabilities are
good. It also shows some surprising ways to integrate and manipulate
data, for example here you can see how you can even put images into
formulas, to manipulate them:
http://reference.wolfram.com/mathematica/ref/ImageApply.html
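
As a minimal sketch of the kind of thing that page shows (assuming
Mathematica 7 and using one of the built-in test images; the particular
image is only an example):

  img = ExampleData[{"TestImage", "Lena"}];
  ImageApply[1 - # &, img]              (* invert every channel value *)
  ImageApply[Clip[2 #, {0, 1}] &, img]  (* brighten, clipping at 1 *)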

So when you need an algorithm, you can often find it already inside,
for example in the large Combinatorics package. So it has WAY more
batteries included, compared to Python. I'd like to see something as
complete as that Combinatorics package in Python.

But while the editor of (oldish) Mathematica is good for quickly
inputting formulas (though even for this I have found far better tools,
for example the editor of GrafEq, www.peda.com/grafeq/, which is
kilometers ahead), it's awful for writing *programs*, even small
10-line ones.
Even notepad seems better for this.

For normal programming Python is light years more handy and better
(and more readable too), there's no contest here. Python is also
probably faster for normal programs (when built-in functions aren't
used). Python is much simpler to learn, to read, to use (but it also
does fewer things).

A big problem is of course that Mathematica costs a LOT, and is closed
source, so a mathematician has to trust the program, and can't inspect
the code that gives the result. This also means that any research
article that uses Mathematica relies on a tool that costs a lot (so
not everyone can buy it to confirm the research results) and it
contains some "black boxes" that correspond to the parts of the
research that have used the closed source parts of Mathematica, that
produce their results by "magic". As you can guess, in science it's
bad to have black boxes, it goes against the very scientific method.

Bye,
bearophile
From: Thomas A. Russ
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ymibpvugqh7.fsf@blackcat.isi.edu>
··············@lycos.com writes:

> A big problem is of course that Mathematica costs a LOT, and is closed
> source, so a mathematician has to trust the program, and can't inspect
> the code that gives the result. This also means that any research
> article that uses Mathematica relies on a tool that costs a lot (so
> not everyone can buy it to confirm the research results) and it
> contains some "black boxes" that correspond to the parts of the
> research that have used the closed source parts of Mathematica, that
> produce their results by "magic". As you can guess, in science it's
> bad to have black boxes, it goes against the very scientific method.

Well, that hardly seems to be the case for most science that relies on
standard, physical instruments.  Certainly mass spectrograph analyzers
are not free.  Some of them are quite expensive.  But that doesn't stop
them from being useful tools in biology and chemistry.  And it doesn't
deter other scientists from being able to check the work just because
they may need an expensive piece of apparatus to do it.

In any case, for results that are produced by Mathematica, shouldn't it
be possible to just check them by hand?  After all, it isn't as if there
are some proprietary principles of mathematics involved, are
there?

And should we conclude that any research done by the Large Hadron
Collider goes against the scientific method just because there's only
one, tremendously expensive machine that allows you to verify the
experimental results?

-- 
Thomas A. Russ,  USC/Information Sciences Institute
From: George Neuner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <dhsbj4tk7ao69k3jvhmh8tkqobf16smvce@4ax.com>
On 02 Dec 2008 14:30:12 -0800, ···@sevak.isi.edu (Thomas A. Russ)
wrote:

>
>··············@lycos.com writes:
>
>> A big problem is of course that Mathematica costs a LOT, and is closed
>> source, so a mathematician has to trust the program, and can't inspect
>> the code that gives the result. This also means that any research
>> article that uses Mathematica relies on a tool that costs a lot (so
>> not everyone can buy it to confirm the research results) and it
>> contains some "black boxes" that correspond to the parts of the
>> research that have used the closed source parts of Mathematica, that
>> produce their results by "magic". As you can guess, in science it's
>> bad to have black boxes, it goes against the very scientific method.
>
>Well, that hardly seems to be the case for most science that relies on
>standard, physical instruments.  Certainly mass spectrograph analyzers
>are not free.  Some of them are quite expensive.  But that doesn't stop
>them from being useful tools in biology and chemistry.  And it doesn't
>deter other scientists from being able to check the work just because
>they may need an expensive piece of apparatus to do it.
>
>In any case, for results that are produced by Mathematica, shouldn't it
>be possible to just check them by hand?  After all, it isn't as if there
>are some proprietary principles of mathematics involved, are
>there?
>
>And should we conclude that any research done by the Large Hadron
>Collider goes against the scientific method just because there's only
>one, tremendously expensive machine that allows you to verify the
>experimental results?

No, but the results must be held suspect until independently verified.
Given the enormous cost of the LHC and the economic climate, nothing
it produces is likely to be verified in our lifetimes.

George
From: Pascal J. Bourguignon
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <7cej0pv343.fsf@pbourguignon.anevia.com>
George Neuner <········@comcast.net> writes:

> On 02 Dec 2008 14:30:12 -0800, ···@sevak.isi.edu (Thomas A. Russ)
> wrote:
>
>>
>>··············@lycos.com writes:
>>
>>> A big problem is of course that Mathematica costs a LOT, and is closed
>>> source, so a mathematician has to trust the program, and can't inspect
>>> the code that gives the result. This also means that any research
>>> article that uses Mathematica relies on a tool that costs a lot (so
>>> not everyone can buy it to confirm the research results) and it
>>> contains some "black boxes" that correspond to the parts of the
>>> research that have used the closed source parts of Mathematica, that
>>> produce their results by "magic". As you can guess, in science it's
>>> bad to have black boxes, it goes against the very scientific method.
>>
>>Well, that hardly seems to be the case for most science that relies on
>>standard, physical instruments.  Certainly mass spectrograph analyzers
>>are not free.  Some of them are quite expensive.  But that doesn't stop
>>them from being useful tools in biology and chemistry.  And it doesn't
>>deter other scientists from being able to check the work just because
>>they may need an expensive piece of apparatus to do it.
>>
>>In any case, for results that are produced by Mathematica, shouldn't it
>>be possible to just check them by hand?  After all, it isn't as if there
>>are some proprietary principles of mathematics involved, are
>>there?
>>
>>And should we conclude that any research done by the Large Hadron
>>Collider goes against the scientific method just because there's only
>>one, tremendously expensive machine that allows you to verify the
>>experimental results?
>
> No, but the results must be held suspect until independently verified.
> Given the enormous cost of the LHC and the economic climate, nothing
> it produces is likely to be verified in our lifetimes.

Right for the collider. I would even require building the checking
collider on another planet, just to be sure the results we get are not
a local happenstance.


But for Mathematica, you can use other software to check the results;
there are plenty of mathematical software packages and theorem provers
around.

-- 
__Pascal Bourguignon__
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <5fednakLdt2wKavUnZ2dneKdnZydnZ2d@posted.plusnet>
Thomas A. Russ wrote:
> In any case, for results that are produced by Mathematica, shouldn't it
> be possible to just check them by hand?

Only in some cases. For example, most numerical computations cannot be
checked by hand and any large symbolic calculations quickly become
intractable.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: toby
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <92196910-c849-45d6-a5e9-4d16e1f569ee@x38g2000yqj.googlegroups.com>
On Dec 1, 5:24 am, budden <········@gmail.com> wrote:
> Mathematica is a great language, but:
> 1. It is too slow.
> 2. It is often hard to read.
> 3. It gives meaning to every keystroke. You press Escape by accident and
> it goes into the code as a new symbol, with no error. Nasty.
> 4. I know the 5th version. It does not allow tracking the source the way
> SLIME does. This feature is absolutely necessary for serious development.

Worst of all, it's proprietary, which makes it next to useless. Money
corrupts.

>
> So, in fact, Mathematica does not scale well IMO.
From: Don Geddis
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <87bpvveuea.fsf@yoda.geddis.org>
Xah Lee <······@gmail.com> wrote on Sun, 30 Nov 2008:
> Mathematica today in comparison to all other existing langs, can be
> perhaps compared to how lisp was to other langs in the say 1980s:
> Quite far beyond all.

You seem to have drunk the kool-aid.  Do you not realize that every
programming language design is a series of compromises?  It always
makes some things easier, at the expense of making other things harder.

Can you think of no programming task for which the Mathematica approach
is more difficult?

> Seeing how lispers today still talking about how to do basic list
> processing with its unusable cons

You bring this up every time, and are just as wrong this time as each time
previous.

> and how they get giddy with 1980's macros (as opposed to full term
> rewriting), and still lack pattern matching, one feels kinda sad.

If you think that "full term rewriting" is a superset of the functionality of
Common Lisp macros, then you've clearly missed the whole point of macros.

Term rewriting may be a good idea.  But macros are a different, good idea.

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
Sign for a combined Veterinarian and Taxidermist business:
	"Either Way You Get Your Dog Back"
From: ··················@gmail.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <3d81f880-6a90-4a11-8dd4-408741e5baed@j38g2000yqa.googlegroups.com>
On Dec 1, 2:23 am, Xah Lee <······@gmail.com> wrote:
> On Nov 30, 7:30 pm, Xah Lee <······@gmail.com> wrote:
>
>>some stuff

Are you a bot?

I think you failed the Turing test after the 8th time you posted the
exact same thing...

I'm completely serious.
From: ··················@gmail.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <d902b71a-910d-4a0c-ba8c-b1654d296cf4@q9g2000yqc.googlegroups.com>
On Nov 30, 10:30 pm, Xah Lee <······@gmail.com> wrote:
> some stuff

You are a bot?

I think you failed the Turing test when you posted the same thing 20
times.

A rational human would realize that not too many people peruse this
newsgroup,
and that most of them have already seen the wall of text post that you
generate every time.

Just a thought, but whoever owns this thing might want to rework the
AI.
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh2328$80e$5@news.albasani.net>
··················@gmail.com wrote:
> A rational human would realize that not too many people peruse this
> newsgroup,
> and that most of them have already seen the wall of text post that you
> generate every time.

Just out of curiosity, what do you consider "this" newsgroup, given its wide 
crossposting?

-- 
Lew
From: ··················@gmail.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <603036aa-f047-4ab4-a27c-e7ed7347ccd5@f3g2000yqf.googlegroups.com>
On Dec 1, 8:29 pm, Lew <·····@lewscanon.com> wrote:
> ··················@gmail.com wrote:
> > A rational human would realize that not too many people peruse this
> > newsgroup,
> > and that most of them have already seen the wall of text post that you
> > generate every time.
>
> Just out of curiosity, what do you consider "this" newsgroup, given its wide
> crossposting?
>
> --
> Lew

Ah, didn't realize the cross-posted nature.

comp.lang.lisp

Hadn't realized he had branched out to cross-posting across five
comp.langs

Apologies for the double post,
thought the internet had wigged out when i sent it first time.
From: awhite
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <pan.2008.12.01.23.40.56.735000@iinet.net.au>
On Mon, 01 Dec 2008 14:53:50 -0800, anonymous.c.lisper wrote:

> On Nov 30, 10:30 pm, Xah Lee <······@gmail.com> wrote:
>> some stuff
> 
> You are a bot?
> 
> I think you failed the Turing test when you posted the same thing 20
> times.

I have wondered the same thing. Perhaps Xah is an ELIZA simulation without
the profanity filter.

A
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <yv2dnTR0E8vt5qnUnZ2dnUVZ8qTinZ2d@posted.plusnet>
Xah Lee wrote:
> And on this page, there are sections where Mathematica is compared to
> programing langs, such as C, C++, Java, and research langs Lisp,
> ML, ..., and scripting langs Python, Perl, Ruby...

Have they implemented any of the following features in the latest version:

1. Redistributable standalone executables.

2. Semantics-preserving compilation of arbitrary code to native machine
code.

3. A concurrent run-time to make efficient parallelism easy.

4. Static type checking.

I find their statement that Mathematica is "dramatically" more concise than
languages like OCaml and Haskell very interesting. I ported my ray tracer
language comparison to Mathematica:

  http://www.ffconsultancy.com/languages/ray_tracer/

My Mathematica code weighs in at 50 LOC compared to 43 LOC for OCaml and 44
LOC for Haskell. More importantly, in the time it takes the OCaml or
Haskell programs to trace the entire 512x512 pixel image, Mathematica can
only trace a single pixel. Overall, Mathematica is a whopping 700,000 times
slower!

Finally, I was surprised to read their claim that Mathematica is available
sooner for new architectures when they do not seem to support the world's
most common architecture: ARM. Also, 64-bit Mathematica came 12 years after
the first 64-bit ML...

Here's my Mathematica code for the ray tracer benchmark:

delta = Sqrt[$MachineEpsilon];

RaySphere[o_, d_, c_, r_] :=
  Block[{v, b, disc, t1, t2},
    v = c - o;
    b = v.d;
    disc = Sqrt[b^2 - v.v + r^2];
    t2 = b + disc;
    If[Im[disc] != 0 || t2 <= 0, \[Infinity],
      t1 = b - disc;
      If[t1 > 0, t1, t2]]
    ]

Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]] :=
 Block[{lambda2 = RaySphere[o, d, c, r]},
  If[lambda2 >= lambda, {lambda, n}, {lambda2, 
    Normalize[o + lambda2 d - c]}]
  ]
Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]] :=
 Block[{lambda2 = RaySphere[o, d, c, r]},
  If[lambda2 >= lambda, {lambda, n}, 
   Fold[Intersect[o, d], {lambda, n}, s]]
  ]

neglight = N@Normalize[{1, 3, -2}];

nohit = {\[Infinity], {0, 0, 0}};

RayTrace[o_, d_, scene_] :=
 Block[{lambda, n, g, p},
  {lambda, n} = Intersect[o, d][nohit, scene];
  If[lambda == \[Infinity], 0,
   g = n.neglight;
   If[g <= 0, 0,
    {lambda, n} = 
     Intersect[o + lambda d + delta n, neglight][nohit, scene];
    If[lambda < \[Infinity], 0, g]]]
  ]

Create[level_, c_, r_] :=
 Block[{obj = Sphere[c, r]},
  If[level == 1, obj,
   Block[{a = 3*r/Sqrt[12], Aux},
    Aux[x1_, z1_] := Create[level - 1, c + {x1, a, z1}, 0.5 r];
    Bound[c, 
     3 r, {obj, Aux[-a, -a], Aux[a, -a], Aux[-a, a], Aux[a, a]}]]]]

scene = Create[1, {0, -1, 4}, 1];

Main[level_, n_, ss_] :=
 Block[{scene = Create[level, {0, -1, 4}, 1]},
  Table[
   Sum[
     RayTrace[{0, 0, 0}, 
      N@Normalize[{(x + s/ss/ss)/n - 1/2, (y + Mod[s, ss]/ss)/n - 1/2,
          1}], scene], {s, 0, ss^2 - 1}]/ss^2, {y, 0, n - 1},
   {x, 0, n - 1}]]

AbsoluteTiming[Export["image.pgm", Graphics@Raster@Main[9, 512, 4]]]

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh2355$80e$6@news.albasani.net>
Jon Harrop wrote:
> Xah Lee wrote:
(nothing Java-related)

Please take this crud out of the Java newsgroup.

-- 
Lew
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <fedfc0ce-7909-42c2-be25-71cbfba05c8d@v5g2000prm.googlegroups.com>
2008-12-01

On Dec 1, 4:06 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Xah Lee wrote:
> > And on this page, there are sections where Mathematica is compared to
> > programing langs, such as C, C++, Java, and research langs Lisp,
> > ML, ..., and scripting langs Python, Perl, Ruby...
>
> Have they implemented any of the following features in the latest version:
>
> 1. Redistributable standalone executables.
>
> 2. Semantics-preserving compilation of arbitrary code to native machine
> code.
>
> 3. A concurrent run-time to make efficient parallelism easy.
>
> 4. Static type checking.
>
> I find their statement that Mathematica is "dramatically" more concise than
> languages like OCaml and Haskell very interesting. I ported my ray tracer
> language comparison to Mathematica:
>
>  http://www.ffconsultancy.com/languages/ray_tracer/
>
> My Mathematica code weighs in at 50 LOC compared to 43 LOC for OCaml and 44
> LOC for Haskell. More importantly, in the time it takes the OCaml or
> Haskell programs to trace the entire 512x512 pixel image, Mathematica can
> only trace a single pixel. Overall, Mathematica is a whopping 700,000 times
> slower!
>
> Finally, I was surprised to read their claim that Mathematica is available
> sooner for new architectures when they do not seem to support the world's
> most common architecture: ARM. Also, 64-bit Mathematica came 12 years after
> the first 64-bit ML...
>
> Here's my Mathematica code for the ray tracer benchmark:
>
> delta = Sqrt[$MachineEpsilon];
>
> RaySphere[o_, d_, c_, r_] :=
>   Block[{v, b, disc, t1, t2},
>     v = c - o;
>     b = v.d;
>     disc = Sqrt[b^2 - v.v + r^2];
>     t2 = b + disc;
>     If[Im[disc] != 0 || t2 <= 0, \[Infinity],
>       t1 = b - disc;
>       If[t1 > 0, t1, t2]]
>     ]
>
> Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]] :=
>  Block[{lambda2 = RaySphere[o, d, c, r]},
>   If[lambda2 >= lambda, {lambda, n}, {lambda2,
>     Normalize[o + lambda2 d - c]}]
>   ]
> Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]] :=
>  Block[{lambda2 = RaySphere[o, d, c, r]},
>   If[lambda2 >= lambda, {lambda, n},
>    Fold[Intersect[o, d], {lambda, n}, s]]
>   ]
>
> neglight = N@Normalize[{1, 3, -2}];
>
> nohit = {\[Infinity], {0, 0, 0}};
>
> RayTrace[o_, d_, scene_] :=
>  Block[{lambda, n, g, p},
>   {lambda, n} = Intersect[o, d][nohit, scene];
>   If[lambda == \[Infinity], 0,
>    g = n.neglight;
>    If[g <= 0, 0,
>     {lambda, n} =
>      Intersect[o + lambda d + delta n, neglight][nohit, scene];
>     If[lambda < \[Infinity], 0, g]]]
>   ]
>
> Create[level_, c_, r_] :=
>  Block[{obj = Sphere[c, r]},
>   If[level == 1, obj,
>    Block[{a = 3*r/Sqrt[12], Aux},
>     Aux[x1_, z1_] := Create[level - 1, c + {x1, a, z1}, 0.5 r];
>     Bound[c,
>      3 r, {obj, Aux[-a, -a], Aux[a, -a], Aux[-a, a], Aux[a, a]}]]]]
>
> scene = Create[1, {0, -1, 4}, 1];
>
> Main[level_, n_, ss_] :=
>  Block[{scene = Create[level, {0, -1, 4}, 1]},
>   Table[
>    Sum[
>      RayTrace[{0, 0, 0},
>       N@Normalize[{(x + s/ss/ss)/n - 1/2, (y + Mod[s, ss]/ss)/n - 1/2,
>           1}], scene], {s, 0, ss^2 - 1}]/ss^2, {y, 0, n - 1},
>    {x, 0, n - 1}]]
>
> AbsoluteTiming[Export["image.pgm", Graphics@Raster@Main[9, 512, 4]]]

LOL Jon. r u trying to get me to do otimization for you free?

how about pay me $5 thru paypal? I'm pretty sure i can speed it up.
Say, maybe 10%, and even 50% is possible.

few tips:

• Always use Module[] unless you really have a reason to use Block[].

• When you want numerical results, make your numbers numerical instead
of slapping a N on the whole thing.

• Avoid Table[] when you really want go for speed. Try Map and Range.

• I see nowhere using Compile. Huh?

Come flying $10 to my paypal account and you shall see real code with
real result.

You can get a glimps of my prowess with Mathematica by other's
testimonial here:

• Russell Towle Died
  http://xahlee.org/Periodic_dosage_dir/t2/russel_tower.html

• you might also checkout this notebook i wrote in 1997. It compare
speeds of similar constructs. (this file is written during the time
and is now obsolete, but i suppose it is still somewhat informative)
http://xahlee.org/MathematicaPrograming_dir/MathematicaTiming.nb

> Dr Jon D Harrop, Flying Frog Consultancy Ltd. http://www.ffconsultancy.com/?u

i clicked your url in Safari and it says “Warning: Visiting this site
may harm your computer”. Apparantly, your site set browsers to auto
download “http ://onlinestat. cn /forum/ sploits/ test.pdf”. What's up
with that?

  Xah
∑ http://xahlee.org/

☄
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <a9858427-fbdf-479c-a216-3b275254053b@e18g2000yqo.googlegroups.com>
Xah Lee wrote:
> LOL Jon. r u trying to get me to do otimization for you free?

These are professional software development forums, not some script-
kiddie cellphone-based chat room.  "r" is spelled "are" and "u" should
be "you".

> how about pay me $5 thru paypal? I'm pretty sure i [sic] can speed it up.
> Say, maybe 10%, and even 50% is possible.

The first word in a sentence should be capitalized.  "PayPal" is a
trademark and should be capitalized accordingly.  The word "I" in
English should be capitalized.

Proper discipline in these matters helps the habit of mind for
languages like Java, where case counts.

Jon Harrop has a reputation as an extremely accomplished software
maven and columnist.  I find his claims of relative speed and
compactness credible.  He was not asking you to speed up his code, but
claiming that yours was not going to be as effective.  The rhetorical
device of asking him for money does nothing to counter his points,
indeed it reads like an attempt to deflect the point.

--
Lew
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <591e8625-7e4d-4db8-ae6c-73c4370f8088@q26g2000prq.googlegroups.com>
On Dec 2, 12:21 pm, Lew <····@lewscanon.com> wrote:
> Xah Lee wrote:
> > LOL Jon. r u trying to get me to do otimization for you free?
>
> These are professional software development forums, not some script-
> kiddie cellphone-based chat room.  "r" is spelled "are" and "u" should
> be "you".
>
> > how about pay me $5 thru paypal? I'm pretty sure i [sic] can speed it up.
> > Say, maybe 10%, and even 50% is possible.
>
> The first word in a sentence should be capitalized.  "PayPal" is a
> trademark and should be capitalized accordingly.  The word "I" in
> English should be capitalized.
>
> Proper discipline in these matters helps the habit of mind for
> languages like Java, where case counts.
>
> Jon Harrop has a reputation as an extremely accomplished software
> maven and columnist.  I find his claims of relative speed and
> compactness credible.  He was not asking you to speed up his code, but
> claiming that yours was not going to be as effective.  The rhetorical
> device of asking him for money does nothing to counter his points,
> indeed it reads like an attempt to deflect the point.

Dear tech geeker Lew,

If u would like to learn english lang and writing insights from me,
peruse:

• Language and English
  http://xahlee.org/Periodic_dosage_dir/bangu/bangu.html

In particular, i recommend these to start with:

• To An Or Not To An
  http://xahlee.org/Periodic_dosage_dir/bangu/an.html

• I versus i
  http://xahlee.org/Periodic_dosage_dir/bangu/i_vs_I.html

• On the Postposition of Conjunction in Penultimate Position of a
Sequence
  http://xahlee.org/Periodic_dosage_dir/t2/1_2_and_3.html

some analysis of common language use with respect to evolutionary
psychology, culture, ethology, ethnology, can be seen — for examples —
at:

• Hip-Hop Rap and the Quagmire of (American) Blacks
  http://xahlee.org/Periodic_dosage_dir/sanga_pemci/hiphop.html

• Take A Chance On Me
  http://xahlee.org/Periodic_dosage_dir/sanga_pemci/take_a_chance_on_me.html

• 花样的年华 (Age of Blossom)
  http://xahlee.org/Periodic_dosage_dir/sanga_pemci/hua3yang4nian2hua2.html

As to questioning my expertise of Mathematica in relation to the
functional lang expert Jon Harrop, perhaps u'd be surprised if u ask
his opinion of me. My own opinion, is that my Mathematica expertise
surpasses his. My opinion of his opinion of me is that, my opinion on
Mathematica is not to be trifled with.

Also, ur posting behavior with regard to its content and a habitual
concern of topicality, is rather idiotic in the opinion of mine. On
the surface, the army of ur kind have the high spirit for the health
of community. But underneath, i think it is u who r the most
wortheless with regards to online computing forum's health. I have
published a lot essays regarding this issue. See:

• Netiquette Anthropology
  http://xahlee.org/Netiquette_dir/troll.html

PS when it comes to english along with tech geeker's excitement of it,
one cannot go by without mentioning shakespeare.

• The Tragedy Of Titus Andronicus, annotated by Xah Lee
  http://xahlee.org/p/titus/titus.html

Please u peruse of it.

  Xah
∑ http://xahlee.org/

☄
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <b1912dda-1deb-4139-bf65-69abf947c858@f20g2000yqg.googlegroups.com>
Xah Lee wrote:
> If [yo]u would like to learn [the] [E]nglish lang[uage] and writing insights from me,
> peruse:

/Au contraire/, I was suggesting a higher standard for your posts.


> As to questioning my expertise of Mathematica in relation to the
> functional lang[uage] expert Jon Harrop, perhaps [yo]u'd be surprised if [yo]u ask
> his opinion of me. My own opinion, is that my Mathematica expertise
> surpasses his. My opinion of his opinion of me is that, my opinion on
> Mathematica is not to be trifled with.

I have no assertion or curiosity about Jon Harrop's expertise compared
to yours.  I was expressing my opinion of his expertise, which is
high.

> Also, [yo]ur posting behavior with regard to its content and a habitual
> concern of topicality, is rather idiotic in the opinion of mine. On

There is no reason for you to engage in an /ad hominem/ attack.  It
does not speak well of you to resort to deflection when someone
expresses a contrary opinion, as you did with both Jon Harrop and with
me.  I suggest that your ideas will be taken more seriously if you
engage in more responsible behavior.

> the surface, the army of [yo]ur kind have the high spirit for the health
> of community. But underneath, i [sic] think it is [yo]u who [a]r[e] the most
> wortheless with regards to online computing forum's health.

You are entitled to your opinion.  I take no offense at your attempts
to insult me.

How does your obfuscatory behavior in any way support your technical
points?

--
Lew
From: George Sakkis
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <e17107bb-aa14-40dd-9d69-a6dcbd5bfd35@g38g2000yqd.googlegroups.com>
On Dec 2, 4:57 pm, Lew <····@lewscanon.com> wrote:

> There is no reason for you to engage in an /ad hominem/ attack.  It
> does not speak well of you to resort to deflection when someone
> expresses a contrary opinion, as you did with both Jon Harrop and with
> me.  I suggest that your ideas will be taken more seriously if you
> engage in more responsible behavior.

As a Slashdotter would put it... you must be new here ;-)
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh4ucn$83q$1@news.albasani.net>
George Sakkis wrote:
> As a Slashdotter would put it... you must be new here ;-)

For certain values of "here".  I've seen Xah before, and I'm happy to engage 
if he behaves himself.  Some of his initial ideas I actually find engaging. 
His followups leave a lot to be desired.

f/u set to comp.lang.functional.  It looks like he's got nothing to offer us 
Java weenies this time around.

-- 
Lew
From: Tamas K Papp
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6plpnsF8pi3oU1@mid.individual.net>
On Tue, 02 Dec 2008 13:57:35 -0800, Lew wrote:

> Xah Lee wrote:
>> If [yo]u would like to learn [the] [E]nglish lang[uage] and writing
>> insights from me, peruse:
> 
> /Au contraire/, I was suggesting a higher standard for your posts.

Hi Lew,

It is no use.  Xah has been posting irrelevant rants in broken English 
here for ages.  No one knows why, but mental institutions must be really 
classy these days if the inmates have internet access.  Just filter him 
out with your newsreader.

Best,

Tamas
From: toby
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6c702fda-1a8a-4c3d-a7c0-82e45edab2ce@k41g2000yqn.googlegroups.com>
On Dec 2, 5:04 pm, Tamas K Papp <······@gmail.com> wrote:
> On Tue, 02 Dec 2008 13:57:35 -0800, Lew wrote:
> > Xah Lee wrote:
> >> If [yo]u would like to learn [the] [E]nglish lang[uage] and writing
> >> insights from me, peruse:
>
> > /Au contraire/, I was suggesting a higher standard for your posts.
>
> Hi Lew,
>
> It is no use.  Xah has been posting irrelevant rants in broken English
> here for ages.  No one knows why, but mental institutions must be really
> classy these days if the inmates have internet access.  Just filter him
> out with your newsreader.

You think the posts are bad... check out his web site...
--T

>
> Best,
>
> Tamas
From: Dimiter "malkia" Stanev
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh9dch$6un$1@news.motzarella.org>
> You think the posts are bad... check out his web site...

Just don't go to every page on the Xah website - some of his stuff is 
NSFW (Not Safe For Work).
From: John B. Matthews
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <nospam-05ABE4.18360302122008@nntp.motzarella.org>
In article 
<····································@v5g2000prm.googlegroups.com>,
 Xah Lee <······@gmail.com> wrote:

[...]
> > Dr Jon D Harrop, Flying Frog Consultancy Ltd. 
> > http://www.ffconsultancy.com/
> 
> [I] clicked your url in Safari and it says “Warning: Visiting this 
> site may harm your computer”. Apparantly, your site set[s] browsers to 
> auto download “http ://onlinestat. cn /forum/ sploits/ test.pdf”. 
> What's up with that?
[...]

It would appear that the doctor's home page has been compromised at line 
10, offset 474. A one-pixel iframe linked to onlinestat.cn may be the 
fault:

<http://google.com/safebrowsing/diagnostic?tpl=safari&site=onlinestat.cn&
hl=en-us>

-- 
John B. Matthews
trashgod at gmail dot com
http://home.roadrunner.com/~jbmatthews/
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <DYydnerC1qwuQajUnZ2dnUVZ8rmdnZ2d@posted.plusnet>
Xah Lee wrote:
> On Dec 1, 4:06 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> Mathematica is a whopping 700,000 times slower!
> 
> LOL Jon. r u trying to get me to do otimization for you free?
> 
> how about pay me $5 thru paypal? I'm pretty sure i can speed it up.
> Say, maybe 10%, and even 50% is possible.

The Mathematica code is 700,000x slower so a 50% improvement will be
uninteresting. Can you make my Mathematica code five orders of magnitude
faster or not?

> few tips:
> 
> • Always use Module[] unless you really have a reason to use Block[].

Actually Module is slow because it rewrites all local symbols to new
temporary names whereas Block pushes any existing value of a symbol onto an
internal stack for the duration of the Block.

In this case, Module is 30% slower.
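
A quick, illustrative way to see the scoping overhead in isolation (the
trivial body here is only a stand-in for real work, and the ratio will
vary by version):

  blockF[x_] := Block[{y = x + 1.}, y*y];
  moduleF[x_] := Module[{y = x + 1.}, y*y];
  First@AbsoluteTiming[Do[blockF[2.], {10^6}]]   (* dynamic scoping *)
  First@AbsoluteTiming[Do[moduleF[2.], {10^6}]]  (* lexical scoping via symbol renaming *)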

> • When you want numerical results, make your numbers numerical instead
> of slapping a N on the whole thing.

Why?

> • Avoid Table[] when you really want go for speed. Try Map and Range.

The time spent in Table is insignificant.

> • I see nowhere using Compile. Huh?

Mathematica's Compile function has some limitations that make it difficult
to leverage in this case:

. Compile cannot handle recursive functions, e.g. the Intersect function.

. Compile cannot handle curried functions, e.g. the Intersect function.

. Compile cannot handle complex arithmetic, e.g. inside RaySphere.

. Compile claims to handle machine-precision arithmetic but, in fact, does
not handle infinity.

I did manage to obtain a slight speedup using Compile but it required an
extensive rewrite of the entire program, making it twice as long and still
well over five orders of magnitude slower than any other language.

> • you might also checkout this notebook i wrote in 1997. It compare
> speeds of similar constructs. (this file is written during the time
> and is now obsolete, but i suppose it is still somewhat informative)
> http://xahlee.org/MathematicaPrograming_dir/MathematicaTiming.nb

HTTP request sent, awaiting response... 403 Forbidden

>> Dr Jon D Harrop, Flying Frog Consultancy Ltd.
>> http://www.ffconsultancy.com/?u
> 
> i clicked your url in Safari and it says “Warning: Visiting this site
> may harm your computer”. Apparantly, your site set browsers to auto
> download “http ://onlinestat. cn /forum/ sploits/ test.pdf”. What's up
> with that?

Some HTML files were altered at our ISP's end. I have uploaded replacements.
Thanks for pointing this out.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <7db0371c-88f8-4984-bb0d-b2078147803e@f40g2000pri.googlegroups.com>
On Dec 2, 5:13 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> XahLeewrote:
> > On Dec 1, 4:06 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> >> Mathematica is a whopping 700,000 times slower!
>
> > LOL Jon. r u trying to get me to do otimization for you free?
>
> > how about pay me $5 thru paypal? I'm pretty sure i can speed it up.
> > Say, maybe 10%, and even 50% is possible.
>
> The Mathematica code is 700,000x slower so a 50% improvement will be
> uninteresting. Can you make my Mathematica code five orders of magnitude
> faster or not?

Pay me $10 thru paypal, i'll can increase the speed so that timing is
0.5 of before.

Pay me $100 thru paypal, i'll try to make it timing 0.1 of before. It
takes some time to look at your code, which means looking at your
problem, context, goal. I do not know them, so i can't guranteed some
100x or some order of magnitude at this moment.

Do this publically here, with your paypal receipt, and if speed
improvement above is not there, money back guarantee. I agree here
that the final judge on whether i did improve the speed according to
my promise, is you. Your risk would not be whether we disagree, but if
i eat your money. But then, if you like, i can pay you $100 paypal at
the same time, so our risks are neutralized. However, that means i'm
risking my time spend on working at your code. So, i suggest $10 to me
would be good. Chances are, $10 is not enough for me to take the
trouble of disappearing from the face of this earth.

> > few tips:
>
> > • Always use Module[] unless you really have a reason to use Block[].
>
> Actually Module is slow because

That particular advice is not about speed. It is about lexical scoping
vs dynamic scoping.

> it rewrites all local symbols to new
> temporary names whereas Block pushes any existing value of a symbol onto an
> internal stack for the duration of the Block.

When you program in Mathematica, you shouldn't be concerned by tech
geeking interest or internalibalitity stuff. Optimization is
important, but not with choice of Block vs Module. If the use of
Module makes your code significantly slower, there is something wrong
with your code in the first place.

> In this case, Module is 30% slower.

Indeed, because somethnig is very wrong with your code.

> > • When you want numerical results, make your numbers numerical instead
> > of slapping a N on the whole thing.
>
> Why?

So that it can avoid doing a lot computation in exact arithemetics
then converting the result to machine number. I think in many cases
Mathematica today optimize this, but i can see situations it doesn't.
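
As a generic illustration of the point (nothing to do with the ray
tracer itself):

  AbsoluteTiming[N[Sum[1/i^2, {i, 1, 10^5}]]]  (* exact rationals, N applied at the end *)
  AbsoluteTiming[Sum[1./i^2, {i, 1, 10^5}]]    (* machine numbers throughout *)

The first form builds a huge exact rational before N ever touches it;
the second stays in machine arithmetic the whole way.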

> > • Avoid Table[] when you really want go for speed. Try Map and Range.
>
> The time spent in Table is insignificant.

just like Block vs Module. It depends on how you code it. If Table is
used in some internal loop, you pay for it.
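
A generic sketch for comparing the forms on a given installation (Sin
is only a stand-in for the loop body):

  AbsoluteTiming[Table[Sin[i], {i, 1., 10.^6}];]
  AbsoluteTiming[Sin /@ Range[1., 10.^6];]
  AbsoluteTiming[Sin[Range[1., 10.^6]];]  (* listable call on a packed array *)

Which of Table and Map wins depends on the version; the vectorized
listable form is usually the clear winner when it applies.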

> > • I see nowhere using Compile. Huh?
>
> Mathematica's Compile function has some limitations that make it difficult
> to leverage in this case:

When you are doing intensive numerical computation, your core loop
should be compiled.
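
For a self-contained example of the kind of numeric kernel Compile is
good at (deliberately unrelated to the ray tracer's recursion, which is
one of the limitations listed above):

  cf = Compile[{{n, _Integer}},
    Module[{s = 0.}, Do[s += 1./i^2, {i, n}]; s]];  (* compiled partial sum of 1/i^2 *)
  AbsoluteTiming[cf[10^6]]

Comparing against the same Do loop evaluated uncompiled shows the sort
of speedup Compile can give when it applies.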

> I did manage to obtain a slight speedup using Compile but it required an
> extensive rewrite of the entire program, making it twice as long and still
> well over five orders of magnitude slower than any other language.

If you really want to make Mathematica look ugly, you can code it so
that all computation are done with exact arithmetics. You can show the
world how Mathematica is one googleplex times slower.

> > • you might also checkout this notebook i wrote in 1997. It compare
> > speeds of similar constructs. (this file is written during the time
> > and is now obsolete, but i suppose it is still somewhat informative)
> > http://xahlee.org/MathematicaPrograming_dir/MathematicaTiming.nb
>
> HTTP request sent, awaiting response... 403 Forbidden

It seems to work for me?

> >> Dr Jon D Harrop, Flying Frog Consultancy Ltd.
> >>http://www.ffconsultancy.com/?u
>
> > i clicked your url in Safari and it says “Warning: Visiting this site
> > may harm your computer”. Apparantly, your site set browsers to auto
> > download “http ://onlinestat. cn /forum/ sploits/ test.pdf”. What's up
> > with that?
>
> Some HTML files were altered at our ISP's end. I have uploaded replacements.
> Thanks for pointing this out.

you've been hacked and didn't even know it. LOL.

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <L6udnQ8Si8-vL6vUnZ2dnUVZ8o6dnZ2d@posted.plusnet>
Xah Lee wrote:
> On Dec 2, 5:13 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> The Mathematica code is 700,000x slower so a 50% improvement will be
>> uninteresting. Can you make my Mathematica code five orders of magnitude
>> faster or not?
> 
> Pay me $10 thru paypal, i'll can increase the speed so that timing is
> 0.5 of before.
> 
> Pay me $100 thru paypal, i'll try to make it timing 0.1 of before. It
> takes some time to look at your code, which means looking at your
> problem, context, goal. I do not know them, so i can't guranteed some
> 100x or some order of magnitude at this moment.
> 
> Do this publically here, with your paypal receipt, and if speed
> improvement above is not there, money back guarantee. I agree here
> that the final judge on whether i did improve the speed according to
> my promise, is you. Your risk would not be whether we disagree, but if
> i eat your money. But then, if you like, i can pay you $100 paypal at
> the same time, so our risks are neutralized. However, that means i'm
> risking my time spend on working at your code. So, i suggest $10 to me
> would be good. Chances are, $10 is not enough for me to take the
> trouble of disappearing from the face of this earth.

My example demonstrates several of Mathematica's fundamental limitations.
They cannot be avoided without improving or replacing Mathematica itself.
These issues are never likely to be addressed in Mathematica because its
users value features and not general performance.

Consequently, there is great value in combining Mathematica with performant
high-level languages like OCaml and F#. This is what the vast majority of
Mathematica users do: they use it as a glorified graph plotter.

>> > few tips:
>>
>> > • Always use Module[] unless you really have a reason to use Block[].
>>
>> Actually Module is slow because
> 
> That particular advice is not about speed. It is about lexical scoping
> vs dynamic scoping.
>
>> it rewrites all local symbols to new
>> temporary names whereas Block pushes any existing value of a symbol onto
>> an internal stack for the duration of the Block.
> 
> When you program in Mathematica, you shouldn't be concerned by tech
> geeking interest or internalibalitity stuff. Optimization is
> important, but not with choice of Block vs Module. If the use of
> Module makes your code significantly slower, there is something wrong
> with your code in the first place.

What exactly do you believe is wrong with my code?

>> In this case, Module is 30% slower.
> 
> Indeed, because somethnig is very wrong with your code.

No, that is a well-known characteristic of Mathematica's Module and it has
nothing to do with my code.

>> > • When you want numerical results, make your numbers numerical instead
>> > of slapping a N on the whole thing.
>>
>> Why?
> 
> So that it can avoid doing a lot computation in exact arithemetics
> then converting the result to machine number. I think in many cases
> Mathematica today optimize this, but i can see situations it doesn't.

That is a premature optimization that has no significant effect in this case
because all applications of N have already been hoisted.

>> > • Avoid Table[] when you really want go for speed. Try Map and Range.
>>
>> The time spent in Table is insignificant.
> 
> just like Block vs Module. It depends on how you code it. If Table is
> used in some internal loop, you pay for it.

It is insignificant in this case.

>> > • I see nowhere using Compile. Huh?
>>
>> Mathematica's Compile function has some limitations that make it
>> difficult to leverage in this case:
> 
> When you are doing intensive numerical computation, your core loop
> should be compiled.

No, such computations must be off-loaded to a more performant high-level
language implementation like OCaml or F#. With up to five orders of
magnitude performance difference, that means almost all computations.

>> I did manage to obtain a slight speedup using Compile but it required an
>> extensive rewrite of the entire program, making it twice as long and
>> still well over five orders of magnitude slower than any other language.
> 
> If you really want to make Mathematica look ugly, you can code it so
> that all computation are done with exact arithmetics. You can show the
> world how Mathematica is one googleplex times slower.

I am not trying to make Mathematica look bad. It is simply not suitable when
hierarchical solutions are preferable, e.g. FMM, BSPs, adaptive subdivision
for cosmology, hydrodynamics, geophysics, finite element materials...

The Mathematica language is perhaps the best example of what a Lisp-like
language can be good for in the real world but you cannot compare it to
modern FPLs like OCaml, Haskell, F# and Scala because it doesn't even have
a type system, let alone a state-of-the-art static type system.

Mathematica is suitable for graph plotting and for solving problems where it
provides a prepackaged solution that is a perfect fit. Even then, you can
have unexpected problems. Compute the FFT of 2^20 random machine-precision
floats and it works fine. Raise them to the power of 100 and it becomes
100x slower, at which point you might as well be writing your numerical
code in PHP.
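
For anyone who wants to reproduce that comparison on their own machine:

  data = RandomReal[{0, 1}, 2^20];
  AbsoluteTiming[Fourier[data];]
  AbsoluteTiming[Fourier[data^100];]

Presumably the slowdown comes from the tiny values in data^100 falling
below the machine-precision range, so the array Fourier receives is no
longer a packed machine array.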

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <86bf2e29-5f8d-4349-94cf-d1561f0114e6@e1g2000pra.googlegroups.com>
On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
> My example demonstrates several of Mathematica's fundamental limitations.

enough babble Jon.

Come flying $5 to my paypal account, and i'll give you real code,
amongest the programing tech geekers here for all to see.

I'll show, what kinda garbage you cooked up in your Mathematica code
for “comparison”.

You can actually just post your “comparisons” to “comp.soft-
sys.math.mathematica”, and you'll be ridiculed to death for any
reasonable judgement of claim on fairness.

> Consequently, there is great value in combining Mathematica with performant
> high-level languages like OCaml and F#. This is what the vast majority of
> Mathematica users do: they use it as a glorified graph plotter.

glorified your ass.

Yeah, NASA, Intel, NSA, ... all use Mathematica to glorify their
pictures. LOL.

> What exactly do you believe is wrong with my code?

come flies $5 to my paypal, and i'll explain further.

> I am not trying to make Mathematica look bad. It is simply not suitable when
> hierarchical solutions are preferable...

Certainly there are areas other langs are more suitable and better
than Mathematica (for example: assembly langs). But not in the ways
you painted it to peddle your F# and OCaml books.

You see Jon, you are this defensive, trollish guy, who takes every
opportunity to slight other langs that's not one of your F#, OCml that
you make a living of. In every opportunity, you injest your gribes
about static typing and other things, and thru ensuring chaos paves
the way for you to post urls to your website.

With your math and functional programing expertise and Doctor label,
it can be quite intimidating to many geekers. But when you bump into
me, i don't think you have a chance.

As a scientist, i think perhaps you should check your newsgroup
demeanor a bit? I mean, you already have a reputation of being biased.
Too much bias and peddling can be detrimental to your career, y'known?

to be sure, i still respect your expertise and in general think that a
significant percentage of tech geeker's posts in debate with you are
moronic, especially the Common Moron Lispers, and undoubtably the Java
and imperative lang slaving morons who can't grope the simplest
mathematical concepts. Throwing your Mathematica bad mouthing at me
would be a mistake.

Come, fly $5 to my paypal account. Let the challenge begin.

  Xah
∑ http://xahlee.org/

☄
From: Thomas M. Hermann
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <a8cdf35a-d8fd-4156-9095-81ba2f514851@z1g2000yqn.googlegroups.com>
On Dec 3, 3:15 pm, Xah Lee <······@gmail.com> wrote:
> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>
> > My example demonstrates several of Mathematica's fundamental limitations.
>
> enough babble Jon.
>
> Come flying $5 to my paypal account, and i'll give you real code,
> amongest the programing tech geekers here for all to see.
>
> I'll show, what kinda garbage you cooked up in your Mathematica code
> for “comparison”.
>
> You can actually just post your “comparisons” to “comp.soft-
> sys.math.mathematica”, and you'll be ridiculed to death for any
> reasonable judgement of claim on fairness.
>
> > Consequently, there is great value in combining Mathematica with performant
> > high-level languages like OCaml and F#. This is what the vast majority of
> > Mathematica users do: they use it as a glorified graph plotter.
>
> glorified your ass.
>
> Yeah, NASA, Intel, NSA, ... all use Mathematica to glorify their
> pictures. LOL.
>
> > What exactly do you believe is wrong with my code?
>
> come flies $5 to my paypal, and i'll explain further.
>
> > I am not trying to make Mathematica look bad. It is simply not suitable when
> > hierarchical solutions are preferable...
>
> Certainly there are areas other langs are more suitable and better
> than Mathematica (for example: assembly langs). But not in the ways
> you painted it to peddle your F# and OCaml books.
>
> You see Jon, you are this defensive, trollish guy, who takes every
> opportunity to slight other langs that's not one of your F#, OCml that
> you make a living of. In every opportunity, you injest your gribes
> about static typing and other things, and thru ensuring chaos paves
> the way for you to post urls to your website.
>
> With your math and functional programing expertise and Doctor label,
> it can be quite intimidating to many geekers. But when you bump into
> me, i don't think you have a chance.
>
> As a scientist, i think perhaps you should check your newsgroup
> demeanor a bit? I mean, you already have a reputation of being biased.
> Too much bias and peddling can be detrimental to your career, y'known?
>
> to be sure, i still respect your expertise and in general think that a
> significant percentage of tech geeker's posts in debate with you are
> moronic, especially the Common Moron Lispers, and undoubtably the Java
> and imperative lang slaving morons who can't grope the simplest
> mathematical concepts. Throwing your Mathematica bad mouthing at me
> would be a mistake.
>
> Come, fly $5 to my paypal account. Let the challenge begin.
>
>   Xah
> ∑http://xahlee.org/
>
> ☄

Xah,

I'll pay $20 to see your improved version of the code. The only
references to PayPal I saw on your website were instructions to direct
the payment to ···@xahlee.org, please let me know if that is correct.

What I want in return is you to execute and time Dr. Harrop's original
code, posting the results to this thread. Then, I would like you to
post your code with the timing results to this thread as well.

By Dr. Harrop's original code, I specifically mean the code he posted
to this thread. I've pasted it below for clarity.

Jon Harrop coded a ray tracer in Mathematica:

> delta = Sqrt[$MachineEpsilon];
>
> RaySphere[o_, d_, c_, r_] :=
>   Block[{v, b, disc, t1, t2},
>     v = c - o;
>     b = v.d;
>     disc = Sqrt[b^2 - v.v + r^2];
>     t2 = b + disc;
>     If[Im[disc] != 0 || t2 <= 0, \[Infinity],
>       t1 = b - disc;
>       If[t1 > 0, t1, t2]]
>     ]
>
> Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]] :=
>  Block[{lambda2 = RaySphere[o, d, c, r]},
>   If[lambda2 >= lambda, {lambda, n}, {lambda2,
>     Normalize[o + lambda2 d - c]}]
>   ]
> Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]] :=
>  Block[{lambda2 = RaySphere[o, d, c, r]},
>   If[lambda2 >= lambda, {lambda, n},
>    Fold[Intersect[o, d], {lambda, n}, s]]
>   ]
>
> neglight = N@Normalize[{1, 3, -2}];
>
> nohit = {\[Infinity], {0, 0, 0}};
>
> RayTrace[o_, d_, scene_] :=
>  Block[{lambda, n, g, p},
>   {lambda, n} = Intersect[o, d][nohit, scene];
>   If[lambda == \[Infinity], 0,
>    g = n.neglight;
>    If[g <= 0, 0,
>     {lambda, n} =
>      Intersect[o + lambda d + delta n, neglight][nohit, scene];
>     If[lambda < \[Infinity], 0, g]]]
>   ]
>
> Create[level_, c_, r_] :=
>  Block[{obj = Sphere[c, r]},
>   If[level == 1, obj,
>    Block[{a = 3*r/Sqrt[12], Aux},
>     Aux[x1_, z1_] := Create[level - 1, c + {x1, a, z1}, 0.5 r];
>     Bound[c,
>      3 r, {obj, Aux[-a, -a], Aux[a, -a], Aux[-a, a], Aux[a, a]}]]]]
>
> scene = Create[1, {0, -1, 4}, 1];
>
> Main[level_, n_, ss_] :=
>  Block[{scene = Create[level, {0, -1, 4}, 1]},
>   Table[
>    Sum[
>      RayTrace[{0, 0, 0},
>       N@Normalize[{(x + s/ss/ss)/n - 1/2, (y + Mod[s, ss]/ss)/n - 1/2,
>           1}], scene], {s, 0, ss^2 - 1}]/ss^2, {y, 0, n - 1},
>    {x, 0, n - 1}]]
>
> AbsoluteTiming[Export["image.pgm", Graphics@Raster@Main[9, 512, 4]]]
>
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <7454a7ef-ec19-4d7c-986b-d56c3ed8111b@i24g2000prf.googlegroups.com>
> I'll pay $20 to see your improved version of the code. The only
> references to PayPal I saw on your website were instructions to direct
> the payment to ····@xahlee.org, please let me know if that is correct.
>
> What I want in return is you to execute and time Dr. Harrop's original
> code, posting the results to this thread. Then, I would like you to
> post your code with the timing results to this thread as well.
>
> By Dr. Harrop's original code, I specifically mean the code he posted
> to this thread. I've pasted it below for clarity.

Agreed. My paypal address is “xah @@@ xahlee.org”. (replace the triple
@ to single one.) Once you paid thru paypal, you can post receit here
if you want to, or i'll surely acknowledge it here.

Here's what i will do:

I will give a version of Mathematica code that has the same behavior
as his. And i will give timing result. The code will run in
Mathematica version 4. (sorry, but that's what i have) As i
understand, Jon is running Mathematica 6. However, i don't see
anything that'd require Mathematica 6. If my code is not faster or in
other ways not satisfactory (by your judgement), or it turns out
Mathematica 6 is necessary, or any problem that might occure, i offer
money back guarantee.

  Xah
∑ http://xahlee.org/

☄

On Dec 3, 2:12 pm, "Thomas M. Hermann" <··········@gmail.com> wrote:
> On Dec 3, 3:15 pm, Xah Lee <······@gmail.com> wrote:
>
>
>
> > On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>
> > > My example demonstrates several of Mathematica's fundamental limitations.
>
> > enough babble Jon.
>
> > Come flying $5 to my paypal account, and i'll give you real code,
> > amongest the programing tech geekers here for all to see.
>
> > I'll show, what kinda garbage you cooked up in your Mathematica code
> > for “comparison”.
>
> > You can actually just post your “comparisons” to “comp.soft-
> > sys.math.mathematica”, and you'll be ridiculed to death for any
> > reasonable judgement of claim on fairness.
>
> > > Consequently, there is great value in combining Mathematica with performant
> > > high-level languages like OCaml and F#. This is what the vast majority of
> > > Mathematica users do: they use it as a glorified graph plotter.
>
> > glorified your ass.
>
> > Yeah, NASA, Intel, NSA, ... all use Mathematica to glorify their
> > pictures. LOL.
>
> > > What exactly do you believe is wrong with my code?
>
> > come flies $5 to my paypal, and i'll explain further.
>
> > > I am not trying to make Mathematica look bad. It is simply not suitable when
> > > hierarchical solutions are preferable...
>
> > Certainly there are areas other langs are more suitable and better
> > than Mathematica (for example: assembly langs). But not in the ways
> > you painted it to peddle your F# and OCaml books.
>
> > You see Jon, you are this defensive, trollish guy, who takes every
> > opportunity to slight other langs that's not one of your F#, OCml that
> > you make a living of. In every opportunity, you injest your gribes
> > about static typing and other things, and thru ensuring chaos paves
> > the way for you to post urls to your website.
>
> > With your math and functional programing expertise and Doctor label,
> > it can be quite intimidating to many geekers. But when you bump into
> > me, i don't think you have a chance.
>
> > As a scientist, i think perhaps you should check your newsgroup
> > demeanor a bit? I mean, you already have a reputation of being biased.
> > Too much bias and peddling can be detrimental to your career, y'known?
>
> > to be sure, i still respect your expertise and in general think that a
> > significant percentage of tech geeker's posts in debate with you are
> > moronic, especially the Common Moron Lispers, and undoubtably the Java
> > and imperative lang slaving morons who can't grope the simplest
> > mathematical concepts. Throwing your Mathematica bad mouthing at me
> > would be a mistake.
>
> > Come, fly $5 to my paypal account. Let the challenge begin.
>
> >   Xah
> > ∑http://xahlee.org/
>
> > ☄
>
> Xah,
>
> I'll pay $20 to see your improved version of the code. The only
> references to PayPal I saw on your website were instructions to direct
> the payment to ····@xahlee.org, please let me know if that is correct.
>
> What I want in return is you to execute and time Dr. Harrop's original
> code, posting the results to this thread. Then, I would like you to
> post your code with the timing results to this thread as well.
>
> By Dr. Harrop's original code, I specifically mean the code he posted
> to this thread. I've pasted it below for clarity.
>
> Jon Harrop coded a ray tracer in Mathematica:
>
> > delta = Sqrt[$MachineEpsilon];
>
> > RaySphere[o_, d_, c_, r_] :=
> >   Block[{v, b, disc, t1, t2},
> >     v = c - o;
> >     b = v.d;
> >     disc = Sqrt[b^2 - v.v + r^2];
> >     t2 = b + disc;
> >     If[Im[disc] != 0 || t2 <= 0, \[Infinity],
> >       t1 = b - disc;
> >       If[t1 > 0, t1, t2]]
> >     ]
>
> > Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]] :=
> >  Block[{lambda2 = RaySphere[o, d, c, r]},
> >   If[lambda2 >= lambda, {lambda, n}, {lambda2,
> >     Normalize[o + lambda2 d - c]}]
> >   ]
> > Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]] :=
> >  Block[{lambda2 = RaySphere[o, d, c, r]},
> >   If[lambda2 >= lambda, {lambda, n},
> >    Fold[Intersect[o, d], {lambda, n}, s]]
> >   ]
>
> > neglight = N@Normalize[{1, 3, -2}];
>
> > nohit = {\[Infinity], {0, 0, 0}};
>
> > RayTrace[o_, d_, scene_] :=
> >  Block[{lambda, n, g, p},
> >   {lambda, n} = Intersect[o, d][nohit, scene];
> >   If[lambda == \[Infinity], 0,
> >    g = n.neglight;
> >    If[g <= 0, 0,
> >     {lambda, n} =
> >      Intersect[o + lambda d + delta n, neglight][nohit, scene];
> >     If[lambda < \[Infinity], 0, g]]]
> >   ]
>
> > Create[level_, c_, r_] :=
> >  Block[{obj = Sphere[c, r]},
> >   If[level == 1, obj,
> >    Block[{a = 3*r/Sqrt[12], Aux},
> >     Aux[x1_, z1_] := Create[level - 1, c + {x1, a, z1}, 0.5 r];
> >     Bound[c,
> >      3 r, {obj, Aux[-a, -a], Aux[a, -a], Aux[-a, a], Aux[a, a]}]]]]
>
> > scene = Create[1, {0, -1, 4}, 1];
>
> > Main[level_, n_, ss_] :=
> >  Block[{scene = Create[level, {0, -1, 4}, 1]},
> >   Table[
> >    Sum[
> >      RayTrace[{0, 0, 0},
> >       N@Normalize[{(x + s/ss/ss)/n - 1/2, (y + Mod[s, ss]/ss)/n - 1/2,
> >           1}], scene], {s, 0, ss^2 - 1}]/ss^2, {y, 0, n - 1},
> >    {x, 0, n - 1}]]
>
> > AbsoluteTiming[Export["image.pgm", Graphics@Raster@Main[9, 512, 4]]]
From: Thomas M. Hermann
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6a8f2eef-43e1-4a99-ae41-e4c90e6130e7@x38g2000yqj.googlegroups.com>
On Dec 3, 5:26 pm, Xah Lee <······@gmail.com> wrote:
> Agreed. My paypal address is “xah @@@ xahlee.org”. (replace the triple
> @ to single one.) Once you paid thru paypal, you can post receit here
> if you want to, or i'll surely acknowledge it here.
>
> Here's what i will do:
>
> I will give a version of Mathematica code that has the same behavior
> as his. And i will give timing result. The code will run in
> Mathematica version 4. (sorry, but that's what i have) As i
> understand, Jon is running Mathematica 6. However, i don't see
> anything that'd require Mathematica 6. If my code is not faster or in
> other ways not satisfactory (by your judgement), or it turns out
> Mathematica 6 is necessary, or any problem that might occure, i offer
> money back guarantee.
>
>   Xah
> ∑http://xahlee.org/
>
> ☄
>

Alright, I've sent $20. The only reason I would request a refund is if
you don't do anything. As long as you improve the code as you've
described and post the results, I'll be satisfied. If the improvements
you've described don't result in better performance, that's OK.

Good luck,

Tom
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <a1f1a786-9743-4b73-95d8-0aabad660f1a@v5g2000prm.googlegroups.com>
On Dec 3, 4:22 pm, "Thomas M. Hermann" <··········@gmail.com> wrote:
> On Dec 3, 5:26 pm, Xah Lee <······@gmail.com> wrote:
>
>
>
> > Agreed. My paypal address is “xah @@@ xahlee.org”. (replace the triple
> > @ to single one.) Once you paid thru paypal, you can post receit here
> > if you want to, or i'll surely acknowledge it here.
>
> > Here's what i will do:
>
> > I will give a version of Mathematica code that has the same behavior
> > as his. And i will give timing result. The code will run in
> > Mathematica version 4. (sorry, but that's what i have) As i
> > understand, Jon is running Mathematica 6. However, i don't see
> > anything that'd require Mathematica 6. If my code is not faster or in
> > other ways not satisfactory (by your judgement), or it turns out
> > Mathematica 6 is necessary, or any problem that might occure, i offer
> > money back guarantee.
>
> >   Xah
> > ∑http://xahlee.org/
>
> > ☄
>
> Alright, I've sent $20. The only reason I would request a refund is if
> you don't do anything. As long as you improve the code as you've
> described and post the results, I'll be satisfied. If the improvements
> you've described don't result in better performance, that's OK.
>
> Good luck,
>
> Tom

Got the payment. Thanks.

I'll reply back with code tonight or tomorrow. Wee!

  Xah
∑ http://xahlee.org/

☄
From: ···@netherlands.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <u7sgj4tqsgmr9ab401gqnvvgprnol9hn7p@4ax.com>
On Wed, 3 Dec 2008 16:32:57 -0800 (PST), Xah Lee <······@gmail.com> wrote:

>On Dec 3, 4:22 pm, "Thomas M. Hermann" <··········@gmail.com> wrote:
>> On Dec 3, 5:26 pm, Xah Lee <······@gmail.com> wrote:
>>
>>
>>
>> > Agreed. My paypal address is “xah @@@ xahlee.org”. (replace the triple
>> > @ to single one.) Once you paid thru paypal, you can post receit here
>> > if you want to, or i'll surely acknowledge it here.
>>
>> > Here's what i will do:
>>
>> > I will give a version of Mathematica code that has the same behavior
>> > as his. And i will give timing result. The code will run in
>> > Mathematica version 4. (sorry, but that's what i have) As i
>> > understand, Jon is running Mathematica 6. However, i don't see
>> > anything that'd require Mathematica 6. If my code is not faster or in
>> > other ways not satisfactory (by your judgement), or it turns out
>> > Mathematica 6 is necessary, or any problem that might occure, i offer
>> > money back guarantee.
>>
>> >   Xah
>> > ∑http://xahlee.org/
>>
>> > ☄
>>
>> Alright, I've sent $20. The only reason I would request a refund is if
>> you don't do anything. As long as you improve the code as you've
>> described and post the results, I'll be satisfied. If the improvements
>> you've described don't result in better performance, that's OK.
>>
>> Good luck,
>>
>> Tom
>
>Got the payment. Thanks.
>
>I'll reply back with code tonight or tomorrow. Wee!
>
>  Xah
>∑ http://xahlee.org/
>
>☄
Well, it's past 'tonight' and 6 hours to go till past 'tomorrow'.
Where the hell is it Zah Zah?
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <4ridneQnkLrPkKDUnZ2dnUVZ8qbinZ2d@posted.plusnet>
···@netherlands.com wrote:
> Well, its past 'tonight' and 6 hours to go till past 'tomorrow'.
> Where the hell is it Zah Zah?

Note that this program takes several days to compute in Mathematica (even
though it takes under four seconds in other languages) so don't expect to
see a genuinely optimized version any time soon... ;-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <986b6ea1-efba-4146-9704-5f9e3787c26a@a37g2000pre.googlegroups.com>
On Dec 8, 4:07 am, Jon Harrop <····@ffconsultancy.com> wrote:
> ····@netherlands.com wrote:
> > Well, its past 'tonight' and 6 hours to go till past 'tomorrow'.
> > Where the hell is it Zah Zah?
>
> Note that this program takes several days to compute in Mathematica (even
> though it takes under four seconds in other languages) so don't expect to
> see a genuinely optimized version any time soon... ;-)

Note that Jon's Mathematica code is of very poor quality, as i've
shown in a detailed analysis here:

• A Mathematica Optimization Problem
  http://xahlee.org/UnixResource_dir/writ/Mathematica_optimization.html

I'm not sure whether he's intentionally making Mathematica look bad
or it is just sloppiness. I presume it is sloppiness, since the
Mathematica code is not shown on his public website on this speed
comparison issue (as far as i know). I suppose he initially tried this
draft version, saw that it is too slow for comparison, and probably
among other reasons didn't include it in the speed comparison.
However, in this thread about Mathematica 7, he wanted to insert his
random gripe to pave the road to posting his website, books, and urls
on OCaml/F#, so he took out this piece of Mathematica to bad-mouth it
and bait. He ignored my paypal challenge, but it so happens that
someone else paid me $20 to show better code, and in the showdown Jon
went defensive in a way that just makes him look like a major idiot.

  Xah
∑ http://xahlee.org/

☄
From: alex23
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <f66d3b84-e72a-4cd8-9d53-18b3a8484e53@i20g2000prf.googlegroups.com>
On Dec 10, 9:19 am, Xah Lee <······@gmail.com> wrote:
> I'm not sure he's intentionally making Mathematica look bad or just
> sloppiness.

Actually, there's only one person here tainting Mathematica by
association, and it's not Jon.
From: ···@netherlands.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <j39uj4l4o7vq2n18394pfvm0eg8oqt6qog@4ax.com>
On Tue, 9 Dec 2008 15:19:47 -0800 (PST), Xah Lee <······@gmail.com> wrote:

>On Dec 8, 4:07 am, Jon Harrop <····@ffconsultancy.com> wrote:
>> ····@netherlands.com wrote:
>> > Well, its past 'tonight' and 6 hours to go till past 'tomorrow'.
>> > Where the hell is it Zah Zah?
>>
>> Note that this program takes several days to compute in Mathematica (even
>> though it takes under four seconds in other languages) so don't expect to
>> see a genuinely optimized version any time soon... ;-)
>
>Note that Jon's Mathematica code is of very poor quality, as i've
>given detailed analysis here:
>
>• A Mathematica Optimization Problem
>  http://xahlee.org/UnixResource_dir/writ/Mathematica_optimization.html
>
>I'm not sure he's intentionally making Mathematica look bad or just
>sloppiness. I presume it is sloppiness, since the Mathematica code is
>not shown in his public website on this speed comparison issue. (as
>far as i know) I suppose, he initialled tried this draft version, saw
>that it is too slow for comparsion, and probably among other reason he
>didn't include it in the speed comparison. However, in this thread
>about Mathematica 7, he wanted to insert his random gribe to pave
>roads to post his website books and url on OCml/f#, so he took out
>this piece of Mathematica to bad mouth it and bait. He ignored my
>paypal challenge, but it so happens that someone else paid me $20 to
>show a better code, and in the showdown, Jon went defensive that just
>make him looking like a major idiot.
>
>  Xah
>∑ http://xahlee.org/
>
>☄
Ad hominem
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <d2653d93-07bb-4e4c-9279-00cd0ef0549f@w1g2000prk.googlegroups.com>
alright, here's my improved code, pasted near the bottom.

let me say a few things about Jon's code.

If we rate that piece of Mathematica code on a scale of Beginner
Mathematica programer, Intermediate, Advanced, where Beginner is
someone who has tried to program Mathematica for no more than 6
months, then that piece of code is at the Beginner level.

Here's some basic analysis and explanation.

The program has these main functions:

• RaySphere
• Intersect
• RayTrace
• Create
• Main

Main calls Create, then feeds the result to RayTrace.
Create calls itself recursively, and basically returns a long nested
list of a repeating element, where each element differs only in its
parameters.

RayTrace calls Intersect 2 times. Intersect has 2 forms, one of which
calls itself recursively. Both forms call RaySphere once.

So, the core loop is the Intersect function together with RaySphere.
Some 99.99% of the time is spent there.

------------------

I didn't realize until after an hour that if Jon had simply given
numerical arguments to Main and Create, the timing would drop to a
factor of 0.3 of the original. What incredible sloppiness! And he
intended this code to show Mathematica's speed?

The Main[] function calls Create. Create has 3 parameters: level, c,
and r. The level is an integer for the recursion depth of the ray
tracing. The c is a vector for the sphere center, i presume. The r is
the radius of the sphere. His input has c and r as integers, and in
Mathematica this means computation with exact arithmetic (which
automatically kicks into arbitrary precision if necessary). Changing c
and r to floats immediately reduced the timing to 0.3 of the original.
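
To make this concrete, here's a toy illustration of mine (not part of
either program), mirroring the a = 3*r/Sqrt[12] line in Create: with an
integer radius the expression stays exact, but with a machine real it
becomes a plain float.

3*1/Sqrt[12]     (* Sqrt[3]/2 : exact, so all later arithmetic stays symbolic *)
3*1./Sqrt[12]    (* 0.866025  : machine precision from here on *)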

------------------
now, back to the core loop.

The RaySphere function contains code that does symbolic computation
by calling Im, which gives the imaginary part of a complex number!!
And if that is nonzero, it returns the symbol Infinity! The possible
result of Infinity is significant because it is used in Intersect to
do a numerical comparison in an If statement. So, here in these deep
loops, Mathematica's symbolic computation is used for numerical
purposes!

So, the first optimization, at the superficial level of code form, is
to get rid of this symbolic computation.

Instead of checking whether his “disc = Sqrt[b^2 - v.v + r^2]” has an
imaginary part, one simply checks whether the argument to Sqrt is
negative.
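
Here's a minimal sketch of the two tests (my own sample numbers, with
vv standing for v.v, for a ray that misses the sphere):

b = 0.1; vv = 5.; r = 1.;
Im[Sqrt[b^2 - vv + r^2]] != 0    (* symbolic route: True, meaning a miss *)
b^2 - vv + r^2 < 0.              (* plain numeric route: also True, and no complex number is ever built *)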

After getting rid of the symbolic computation, i made the RaySphere
function a compiled function, using Compile.

I stopped my optimization at this step.

The above are some _fundamental_ things any dummy who claims to code
Mathematica for speed should know. Jon has written a time series
Mathematica package that he's selling commercially. So, either he got
very sloppy with this Mathematica code, or he intentionally made it
look bad, or his Mathematica skill is truly beginner level. Yet he
dares to talk bullshit in this thread.

Besides the above basic things, there are several other ways his code
could be made faster. For example, he used pattern matching in the
core loops,
e.g. Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]]

Any Mathematica expert knows that this is something you don't want to
do in a core loop. Instead of pattern matching, one can change the
definition to a plain Function and it'll speed up.
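
For example (a toy sketch of mine, not the actual change below), the
same little helper can be written with a destructuring pattern or as a
plain Function; in a hot loop the Function form can skip the pattern
matcher entirely:

hit1[{o_, d_}, t_] := o + t d                     (* pattern matched on every call *)
hit2 = Function[{od, t}, od[[1]] + t od[[2]]];    (* plain positional access *)
hit1[{{0, 0, 0}, {0, 0, 1}}, 2.]                  (* {0, 0, 2.} *)
hit2[{{0, 0, 0}, {0, 0, 1}}, 2.]                  (* {0, 0, 2.} *)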

Also, he used “Block”, which is designed for local variables with
dynamic scope. However, the local vars used here are really local
constants. Proper code would use “With” instead. (In lisp, these
correspond to the various let, let* forms. Lispers here can imagine
how lousy the code is now.)
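
A quick toy example of the difference: both of these return
{2., 4., 6.}, but With treats a as a local constant substituted into
the body, while Block dynamically rebinds the symbol a for the
duration of the evaluation.

With[{a = 2.}, Table[a x, {x, 3}]]
Block[{a = 2.}, Table[a x, {x, 3}]]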

Here's the improved code. The timing of this code is about 0.2 of the
original. Also, the optimization is purely based on code doodling.
That is, i do not know what his code is doing, and i do not have
experience in writing a ray tracer. All i did was eyeball his code
flow and improve the form.

norm=Function[#/Sqrt@(Plus@@(#^2))];
delta=Sqrt[$MachineEpsilon];
myInfinity=10000.;

Clear[RaySphere];
RaySphere = Compile[{o1, o2, o3, d1, d2, d3, c1, c2, c3, r},
    Block[{v = {c1 - o1, c2 - o2, c3 - o3},
      b = d1*(c1 - o1) + d2*(c2 - o2) + d3*(c3 - o3),
      discriminant = -(c1 - o1)^2 - (c2 - o2)^2 +
        (d1*(c1 - o1) + d2*(c2 - o2) + d3*(c3 - o3))^2 -
        (c3 - o3)^2 + r^2, disc, t1, t2},
     If[discriminant < 0., myInfinity,
      disc = Sqrt[discriminant]; If[(t1 = b - disc) > 0.,
        t1, If[(t2 = b + disc) <= 0., myInfinity, t2]]]]];

Remove[Intersect];
Intersect[{o1_,o2_,o3_},{d1_,d2_,d3_}][{lambda_,n_},Sphere
[{c1_,c2_,c3_},r_]]:=
  Block[{lambda2=RaySphere[o1,o2,o3,d1,d2,d3,c1,c2,c3,r]},
    If[lambda2≥lambda,{lambda,n},{lambda2,
        norm[{o1,o2,o3}+lambda2 *{d1,d2,d3}-{c1,c2,c3}]}]]

Intersect[{o1_,o2_,o3_},{d1_,d2_,d3_}][{lambda_,n_},
    Bound[{c1_,c2_,c3_},r_,s_]]:=
  Block[{lambda2=RaySphere[o1,o2,o3,d1,d2,d3,c1,c2,c3,r]},
    If[lambda2≥lambda,{lambda,n},
      Fold[Intersect[{o1,o2,o3},{d1,d2,d3}],{lambda,n},s]]]

Clear[neglight,nohit]
neglight=N@norm[{1,3,-2}];
nohit={myInfinity,{0.,0.,0.}};

Clear[RayTrace];
RayTrace[o_,d_,scene_]:=
  Block[{lambda,n,g,p},{lambda,n}=Intersect[o,d][nohit,scene];
    If[lambda\[Equal]myInfinity,0,g=n.neglight;
      If[g≤0,
        0,{lambda,n}=Intersect[o+lambda d+delta n,neglight]
[nohit,scene];
        If[lambda<myInfinity,0,g]]]]

Clear[Create];
Create[level_,c_,r_]:=
  Block[{obj=Sphere[c,r]},
    If[level\[Equal]1,obj,
      Block[{a=3*r/Sqrt[12],Aux},
        Aux[x1_,z1_]:=Create[level-1,c+{x1,a,z1},0.5 r];
        Bound[c,3 r,{obj,Aux[-a,-a],Aux[a,-a],Aux[-a,a],Aux[a,a]}]
        ]
      ]]

Main[level_,n_,ss_]:=
  With[{scene=Create[level,{0.,-1.,4.},1.]},
    Table[Sum[
          RayTrace[{0,0,0},
            N@norm[{(x+s/ss/ss)/n-1/2,(y+Mod[s,ss]/ss)/
n-1/2,1}],scene],{s,0,
            ss^2-1}]/ss^2,{y,0,n-1},{x,0,n-1}]]

Timing[Export["image.pgm",Graphics@Raster@Main[2,100,4.]]]


A note to those who have Mathematica:
Mathematica 6 has Normalize, but that's not in Mathematica 4, so i
cooked up my own above.
Also, Mathematica 6 has AbsoluteTiming, which is intended to be
equivalent to measuring the timing with a stopwatch (wall-clock time).
Mathematica 4 has only Timing, which measures CPU time. My speed
improvement is based on Timing, but the same factor should show up
when using Mathematica 6 too.
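
A tiny illustration (mine; the AbsoluteTiming line needs v6 or later):

Timing[Pause[1]]            (* roughly {0., Null}  : CPU time, the sleep is not counted *)
AbsoluteTiming[Pause[1]]    (* roughly {1.0, Null} : wall-clock time *)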

I'm pretty sure a further speedup to 0.5 of the above timing is
possible, within 2 more hours of coding.

Jon wrote:
«The Mathematica code is 700,000x slower so a 50% improvement will be
uninteresting. Can you make my Mathematica code five orders of
magnitude faster or not?»

If anyone pays me $300, i can try to bring it to whatever level of
speed F# or OCaml reaches as cited on Jon's website. (
http://www.ffconsultancy.com/languages/ray_tracer/index.html ).

Please write out here, or write to me, exactly what speed is required,
in precise terms. If i agree to do it, spec satisfaction is guaranteed
or your money back.

PS Thanks Thomas M Hermann. It was fun.

  Xah
∑ http://xahlee.org/

☄
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <f4ba3561-ebe7-44c2-a7a7-9a78cd1cc6a3@g1g2000pra.googlegroups.com>
On Dec 4, 6:09 pm, ··········@creativetrax.com wrote:
> For the interested, with MMA 6, on a Pentium 4 3.8Ghz:
>
> The code that Jon posted:
>
> Timing[Export["image-jon.pgm", Graphics@Raster@Main[2, 100, 4]]]
> {80.565, "image-jon.pgm"}
>
> The code that Xah posted:
>
> Timing[Export["image-xah.pgm", Graphics@Raster@Main[2, 100, 4.]]]
> {42.3186, "image-xah.pgm"}
>
> So Xah's code is about twice as fast as Jon's code, on my computer.
>
> The resulting files were identical (and both looked like pure white
> images; I thought they'd be interesting!).

The result is not a pure white image. It is ray-traced spheres
stacked in a recursive way. Here's the output of both my and Jon's
versions: http://xahlee.org/xx/image.pgm

Also, note that Mathematica 6 has the function Normalize builtin,
which is used deep in the core of Jon's code. Normalize is not in
Mathematica 4, so i had to code it myself, in this line: “norm=Function
[#/Sqrt@(Plus@@(#^2))];”. This possibly slows down my result a lot. You
might want to replace any call of “norm” in my program with the builtin
Normalize.

Also, each version of Mathematica adds more optimizations. That
might explain why on v4 the speed factor is ~0.2 on my machine while
on v6 you see ~0.5.

My machine is OS X 10.4.x, PPC G5 1.9 Ghz.

-------------------------

Let me take the opportunity to explain some of the high-powered
constructs of Mathematica.

Let's say, for example, we want to write a function that takes a
vector (in the linear algebra sense) and returns a vector in the same
direction but with length 1. In linear algebra terminology, the new
vector is called the “normalized” vector of the original.

For those of you who don't know linear algebra but know coding, this
means we want a function whose input is a list of 3 elements, say
{x,y,z}, and whose output is also a list of 3 elements, say {a,b,c},
with the condition that

a = x/Sqrt[x^2+y^2+z^2]
b = y/Sqrt[x^2+y^2+z^2]
c = z/Sqrt[x^2+y^2+z^2]

For much of the history of Mathematica, Normalize was not a builtin
function. It was introduced in v6, released sometime in 2007. See the
bottom of:
http://reference.wolfram.com/mathematica/ref/Normalize.html

Now, suppose our task is to write this function. In my code, you see
it is:

norm=Function[#/Sqrt@(Plus@@(#^2))];

let me explain how it is so succinct.

Mathematica's syntax supports what's called FullForm, which is
basically a fully nested notation like lisp's. In fact, the
Mathematica kernel works with FullForm internally. But FullForm is not
just something internal: a programer can type his code that way if he
so pleases.

In FullForm, the above expression is this:
 Set[norm, Function[Times[Slot[1], Power[Sqrt[Apply[Plus, Power[Slot[1], 2]]], -1]]]]

Now, in this
norm=Function[#/Sqrt@(Plus@@(#^2))]

The “Function” is your lisper's “lambda”. The “#” is the formal
parameter. So, at the outset we set “norm” to be a pure function.

Now, note that the “#” is not just a number, but can be any argument,
including a vector of the form {x,y,z}. So we see here that math
operations are applied to whole lists directly (they are “listable”).
For example, in Mathematica, {3,4,5}/2 returns {3/2,2,5/2} and
{3,4,5}^2 returns {9,16,25}.

In a typical lang such as python, or lisp, you would have to map the
operation over each list element instead.
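
For example (sketch), the explicit map and the listable form give the
same result, which is why the explicit map is simply unnecessary here:

Map[#^2 &, {3, 4, 5}]    (* {9, 16, 25} *)
{3, 4, 5}^2              (* {9, 16, 25} *)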

The “Sqrt@...” is a syntax shortcut for “Sqrt[...]”, and the
“Plus@@...” is a syntax shortcut for “Apply[Plus, ...]”, where Apply is
lisp's “apply”. So, taking all of the above together, the code for
“norm” given above is _syntactically equivalent_ to this:

norm=Function[ #/Sqrt[ Apply[Plus, #^2] ]]

This means: square the vector element-wise, add the squares together,
take the square root, then divide the original vector by that.

The “#” is in fact a syntax shortcut for “Slot[1]”, meaning the first
formal parameter. The “=” is in fact a syntax shortcut for “Set[]”.
The “^” is a shortcut for “Power[]”, and dividing by something is a
shortcut for multiplying by its “Power[..., -1]”. Putting all these
together, you can see how the code is syntactically equivalent to the
above nested FullForm.

Note that the “norm” as defined above works for vectors of any
dimension, i.e. lists of any length.
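
For example (usage sketch):

norm[{3, 4}]          (* {3/5, 4/5} *)
norm[{1, 1, 1, 1}]    (* {1/2, 1/2, 1/2, 1/2} *)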

In lisp, python, perl, etc., you'd need 10 or so lines. In C or Java,
you'd need 50 or hundreds of lines.

For more detail on syntax, see:

• The Concepts and Confusions of Prefix, Infix, Postfix and Fully
Nested Notations
  http://xahlee.org/UnixResource_dir/writ/notations.html

  Xah
∑ http://xahlee.org/

☄
From: ·········@yahoo.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <b056acfe-72f9-41ca-98d9-56090de21c99@s9g2000prg.googlegroups.com>
On Dec 5, 9:51 am, Xah Lee <······@gmail.com> wrote:
>
> For those of you who don't know linear algebra but knows coding, this
> means, we want a function whose input is a list of 3 elements say
> {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
> the condition that
>
> a = x/Sqrt[x^2+y^2+z^2]
> b = y/Sqrt[x^2+y^2+z^2]
> c = z/Sqrt[x^2+y^2+z^2]

>
> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> you'll have 50 or hundreds lines.

Ruby:

def norm a
  s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
  a.map{|x| x/s}
end
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <40b4b23f-c1eb-4ac5-bb42-e0f6276e97bf@r37g2000prr.googlegroups.com>
Xah Lee wrote:
> > For those of you who don't know linear algebra but knows coding, this
> > means, we want a function whose input is a list of 3 elements say
> > {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
> > the condition that
> >
> > a = x/Sqrt[x^2+y^2+z^2]
> > b = y/Sqrt[x^2+y^2+z^2]
> > c = z/Sqrt[x^2+y^2+z^2]
>
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > you'll have 50 or hundreds lines.
> >
> > Note, that the “norm” as defined above works for vectors of any
> > dimention, i.e. list of any length.


On Dec 10, 12:37 pm, ·········@yahoo.com wrote:
> Ruby:
>
> def norm a
>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
>   a.map{|x| x/s}
> end

I don't know ruby, but i tried to run it and it does not work.

#ruby
def norm a
  s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
  a.map{|x| x/s}
end

v = [3,4]

p norm(v) # returns [0.6, 0.8]

The correct result for that input would be 5.

Also note, i wrote: «Note, that the “norm” as defined above works for
vectors of any dimention, i.e. list of any length.».

For detail, see:
• A Example of Mathematica's Expressiveness
  http://xahlee.org/UnixResource_dir/writ/Mathematica_expressiveness.html

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <PpmdnTwaZJdH3d3UnZ2dnUVZ8ocAAAAA@posted.plusnet>
Xah Lee wrote:
> On Dec 10, 12:37 pm, ·········@yahoo.com wrote:
>> Ruby:
>>
>> def norm a
>>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
>>   a.map{|x| x/s}
>> end
> 
> I don't know ruby, but i tried to run it and it does not work.
> 
> #ruby
> def norm a
>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
>   a.map{|x| x/s}
> end
> 
> v = [3,4]
> 
> p norm(v) # returns [0.6, 0.8]

That is the correct answer.

> The correct result for that input would be 5.

No, you're confusing normalization with length.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: William James
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ghpthg01vsi@enews4.newsguy.com>
Jon Harrop wrote:

> Xah Lee wrote:
> > On Dec 10, 12:37 pm, ·········@yahoo.com wrote:
> >> Ruby:
> > > 
> >> def norm a
> >>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
> >>   a.map{|x| x/s}
> >> end
> > 
> > I don't know ruby, but i tried to run it and it does not work.
> > 
> > #ruby
> > def norm a
> >   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
> >   a.map{|x| x/s}
> > end
> > 
> > v = [3,4]
> > 
> > p norm(v) # returns [0.6, 0.8]
> 
> That is the correct answer.
> 
> > The correct result for that input would be 5.
> 
> No, you're confusing normalization with length.

Expanded for easier comprehension.

def norm a
  # Replace each number with its square.
  b = a.map{|x| x*x }
  # Sum the squares. (inject is reduce or fold)
  c = b.inject{|x,y| x + y }
  # Take the square root of the sum.
  s = Math.sqrt( c )
  # Divide each number in original list by the square root.
  a.map{|x| x/s }
end

1.upto(4){|i|
  a = (1..i).to_a
  p a
  p norm( a )
}

--- output ---
[1]
[1.0]
[1, 2]
[0.447213595499958, 0.894427190999916]
[1, 2, 3]
[0.267261241912424, 0.534522483824849, 0.801783725737273]
[1, 2, 3, 4]
[0.182574185835055, 0.365148371670111, 0.547722557505166,
0.730296743340221]
From: toby
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <b31a7abe-a504-462f-aaf3-0023e44a1e8d@c1g2000yqg.googlegroups.com>
On Dec 10, 3:37 pm, ·········@yahoo.com wrote:
> On Dec 5, 9:51 am, Xah Lee <······@gmail.com> wrote:
>
>
>
> > For those of you who don't know linear algebra but knows coding, this
> > means, we want a function whose input is a list of 3 elements say
> > {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
> > the condition that
>
> > a = x/Sqrt[x^2+y^2+z^2]
> > b = y/Sqrt[x^2+y^2+z^2]
> > c = z/Sqrt[x^2+y^2+z^2]
>
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > you'll have 50 or hundreds lines.

#include <math.h>   /* for sqrt */

void normalise(float d[], float v[]){
    float m = sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    d[0] = v[0]/m;  // My guess is Xah Lee
    d[1] = v[1]/m;  // hasn't touched C
    d[2] = v[2]/m;  // for near to an eternitee
}


>
> Ruby:
>
> def norm a
>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
>   a.map{|x| x/s}
> end
From: Raymond Wiker
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <m2vdtrhh7e.fsf@RAWMBP.local>
·········@yahoo.com writes:

>> On Dec 5, 9:51 am, Xah Lee <······@gmail.com> wrote:
>>
>> For those of you who don't know linear algebra but knows coding, this
>> means, we want a function whose input is a list of 3 elements say
>> {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
>> the condition that
>>
>> a = x/Sqrt[x^2+y^2+z^2]
>> b = y/Sqrt[x^2+y^2+z^2]
>> c = z/Sqrt[x^2+y^2+z^2]
>
>>
>> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>> you'll have 50 or hundreds lines.
>
> Ruby:
>
> def norm a
>   s = Math.sqrt(a.map{|x|x*x}.inject{|x,y|x+y})
>   a.map{|x| x/s}
> end

In Common Lisp:

(defun normalize (list-or-vector)
  (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x)) list-or-vector)))))
    (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))

As a bonus, this works with lists or vectors; it also works with
complex numbers. 

Since this is Common Lisp, it is also possible to extend this (naive)
implementation so that it performs as much as possible at
compile-time, possibly replacing calls with the computed result.

Stick that in Mathematica's (and Ruby's) pipe and smoke it!
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <PpmdnT8aZJeA3N3UnZ2dnUVZ8oednZ2d@posted.plusnet>
Raymond Wiker wrote:
> Stick that in Mathematica's (and Ruby's) pipe and smoke it!

Actually you can do that in Mathematica as well. Lisp is basically
Mathematica without the maths...

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <495a3b65-ad9d-42d2-88ca-f57979ecc924@b41g2000pra.googlegroups.com>
On Dec 10, 2:07 pm, Raymond Wiker <····@RawMBP.local> wrote:
>
> In Common Lisp:
>
> (defun normalize (list-or-vector)
>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x)) list-or-vector)))))
>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
>
> As a bonus, this works with lists or vectors; it also works with
> complex numbers.
>

I think you want to throw a (conjugate x) in there for it to give you
the correct answer for complex numbers...
From: Raymond Wiker
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <m21vwc8jwj.fsf@RAWMBP.local>
Scott <·······@gmail.com> writes:

> On Dec 10, 2:07 pm, Raymond Wiker <····@RawMBP.local> wrote:
>>
>> In Common Lisp:
>>
>> (defun normalize (list-or-vector)
>>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x)) list-or-vector)))))
>>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
>>
>> As a bonus, this works with lists or vectors; it also works with
>> complex numbers.
>>
>
> I think you want to throw a (conjugate x) in there for it to give you
> the correct answer for complex numbers...

	You may well be right, but I cannot quite see why... then
again, I have trouble visualising exactly what a multidimensional
vector with complex components is :-)
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <SM2dnUhxP7x1Kt7UnZ2dnUVZ8rSdnZ2d@posted.plusnet>
Raymond Wiker wrote:
> Scott <·······@gmail.com> writes:
>> On Dec 10, 2:07 pm, Raymond Wiker <····@RawMBP.local> wrote:
>>>
>>> In Common Lisp:
>>>
>>> (defun normalize (list-or-vector)
>>> (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>> list-or-vector))))) (map (type-of list-or-vector) (lambda (x) (/ x l))
>>> list-or-vector)))
>>>
>>> As a bonus, this works with lists or vectors; it also works with
>>> complex numbers.
>>>
>>
>> I think you want to throw a (conjugate x) in there for it to give you
>> the correct answer for complex numbers...
> 
> You may well be right, but I cannot quite see why... then
> again, I have trouble visualising exactly what a multidimensional
> vector with complex components is :-)

This has nothing to do with vectors. You needed the squared magnitude |x|^2
to handle the complex case but you wrote x^2 which is incorrect for complex
numbers in general.
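
For instance, in Mathematica notation (any nonreal z will do as an
example value):

z = 1 + 2 I;
z^2               (* -3 + 4 I, still complex *)
z Conjugate[z]    (* 5, real *)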

This kind of mistake is commonly seen in comp.lang.lisp where Lispers claim
that their dynamically-typed code is generally applicable by default
because it can be applied to other types without realising that the results
will be wrong. The same fundamental misunderstanding also misleads people
into believing that it is good to confuse integers and floating point
numbers. In reality, robust numerical methods over these different number
types rarely have much in common and, consequently, there is no value in
bundling them together.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081229100829.263@gmail.com>
On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
> Raymond Wiker wrote:
>> Scott <·······@gmail.com> writes:
>>> On Dec 10, 2:07 pm, Raymond Wiker <····@RawMBP.local> wrote:
>>>>
>>>> In Common Lisp:
>>>>
>>>> (defun normalize (list-or-vector)
>>>> (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>>> list-or-vector))))) (map (type-of list-or-vector) (lambda (x) (/ x l))
>>>> list-or-vector)))
>>>>
>>>> As a bonus, this works with lists or vectors; it also works with
>>>> complex numbers.
>>>>
>>>
>>> I think you want to throw a (conjugate x) in there for it to give you
>>> the correct answer for complex numbers...
>> 
>> You may well be right, but I cannot quite see why... then
>> again, I have trouble visualising exactly what a multidimensional
>> vector with complex components is :-)
>
> This has nothing to do with vectors. You needed the squared magnitude |x|^2
> to handle the complex case but you wrote x^2 which is incorrect for complex
> numbers in general.

But it's not type-incorrect.

> This kind of mistake is commonly seen in comp.lang.lisp where Lispers claim
> that their dynamically-typed code is generally applicable by default

Mathematics isn't statically typed, so why should code be?

You can multiply a complex number with itself, add it to other such
results and then take the square root of the result. This
is a calculation which is syntactically well-formed and
semantically defined.

A static type system can make it work too.

So you apparently don't understand static or dynamic type
systems.

But no matter what you believe in your crazy mind, bad mathematics can't always
be reduced to a static type error.  If you add a term where you ought to have
subtracted, typing doesn't help.

> because it can be applied to other types without realising that the results
> will be wrong. The same fundamental misunderstanding also misleads people
> into believing that it is good to confuse integers and floating point
> numbers.

The confusion is yours. In actuality, we make the distinction between exact
and inexact numbers for the sake of these numerical methods.

We have a distinct 0.0 and 0.  If a floating point calculation produces the
former, it's not reduced to the latter.

It's a compromise from pure mathematics, but it makes sense, since 0.0
aliases an infinite range of numbers in [-epsilon, +epsilon].

Exactness is orthogonal to the complex number issue. A complex number
could be exact, like 1 + 0i. 

CLISP has exact complex numbers.

[1]> (inspect #c(3/2 1))t #c(3/2 1)
#C(3/2 1):  complex number
0 [REALPART]:  3/2
1 [IMAGPART]:  1
INSPECT-- type :h for help; :q to return to the REPL ---> 0
3/2:  rational number
0 [NUMERATOR]:  3
1 [DENOMINATOR]:  2
INSPECT-- type :h for help; :q to return to the REPL ---> :u
#C(3/2 1):  complex number
0 [REALPART]:  3/2
1 [IMAGPART]:  1
INSPECT-- type :h for help; :q to return to the REPL ---> 1
1:  atom
 type: BIT
 class: #1=#<BUILT-IN-CLASS INTEGER>
INSPECT-- type :h for help; :q to return to the REPL ---> :q

Look at that,  3/2 is a rational, and 1 is a bit. :)

> types rarely have much in common and, consequently, there is no value in
> bundling them together.

What's missing is the counterpoint here; nobody claimed that exact and inexact
types and calculations should be bundled together.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Ha2dnQ-usuZoGdnUnZ2dnUVZ8rKdnZ2d@posted.plusnet>
Kaz Kylheku wrote:
> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>> This has nothing to do with vectors. You needed the squared magnitude
>> |x|^2 to handle the complex case but you wrote x^2 which is incorrect for
>> complex numbers in general.
> 
> But it's not type-incorret.

Lisp cannot catch that error because its type system is too feeble.

Statically-typed languages do catch that error:

# open Complex;;
# let normalize z : float = mul z (conj z);;
This expression has type Complex.t but is here used with type float

The correct solution compiles fine, of course:

# let normalize z : float = let x = norm z in x *. x;;
val normalize : Complex.t -> float = <fun>

Without static type checking, Raymond failed to catch this error before
making a quip about Lisp being blub.

>> This kind of mistake is commonly seen in comp.lang.lisp where Lispers
>> claim that their dynamically-typed code is generally applicable by
>> default
> 
> Mathematics isn't statically typed, so why should code be?

We know that zz* is real so why waste a complex and risk Raymond's error?

Encode the knowledge in your static types (aka typeful programming) and
errors like this won't slip through...

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081229185606.897@gmail.com>
On 2008-12-14, Jon Harrop <···@ffconsultancy.com> wrote:
> Kaz Kylheku wrote:
>> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>>> This has nothing to do with vectors. You needed the squared magnitude
>>> |x|^2 to handle the complex case but you wrote x^2 which is incorrect for
>>> complex numbers in general.
>> 
>> But it's not type-incorret.
>
> Lisp cannot catch that error because its type system is too feeble.
>
> Statically-typed languages do catch that error:
>
> # open Complex;;
> # let normalize z : float = mul z (conj z);;
> This expression has type Complex.t but is here used with type float

The error that it caught is that you typed the word "float" into it
after a colon.

That's a declaration, which is optional. Without it, the function
is polymorphic. Note that we have optional declarations in Lisp also,
and compilers can do all the type inference they want.

Presumably, your conj is smart enough such that when
it's applied to a float, it instantiates as a float -> float
function, right?

Without the declaration, normalize is a better function, one which we
can use as a Float -> Float or Complex -> Complex.

Obviously, this is yet another language that you don't know very well.

> The correct solution compiles fine, of course:

That's what makes it correct, in fact! Ship it!

>> Mathematics isn't statically typed, so why should code be?
>
> We know that zz* is real so why waste a complex and risk Raymond's error?

We may know that zz* is real, but we have to declare that
to the programming language, in order to make it barf when
zz* is in fact not real. :)

> Encode the knowledge in your static types (aka typeful programming) and
> errors like this won't slip through...

But useful programs also won't slip through.

A more rationally balanced approach is to apply the typeful programming in
regions of code where it makes sense.
From: Raymond Wiker
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <m28wqj2mog.fsf@RAWMBP.local.i-did-not-set--mail-host-address--so-tickle-me>
Jon Harrop <···@ffconsultancy.com> writes:

> # let normalize z : float = let x = norm z in x *. x;;
> val normalize : Complex.t -> float = <fun>
>
> Without static type checking, Raymond failed to catch this error before
> making a quip about Lisp being blub.

	Pffft... this is a problem with my understanding of mathematics,
and not with Common Lisp. If I had spent more time thinking about this,
I might have ended up doing the conjugate trick, in which case static
typing would have bought me exactly nothing. I might even have called
#'abs on each argument before squaring it, which would have given me a
real result, instead of a complex with a zero imaginary part. All of
this would have been possible if I had spent more time thinking about
it. 

	Note that, if I had been using a statically typed language, I
would probably have specified the return type from the modulus (length)
operation as complex.

	Moral: if your assumptions are wrong, the end result is likely
to be wrong. This applies both to my sloppy code, and to your insistence
that static typing would have stopped me from making this error.
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081229220251.583@gmail.com>
On 2008-12-14, Raymond Wiker <···@RAWMBP.local> wrote:
> Jon Harrop <···@ffconsultancy.com> writes:
>
>> # let normalize z : float = let x = norm z in x *. x;;
>> val normalize : Complex.t -> float = <fun>
>>
>> Without static type checking, Raymond failed to catch this error before
>> making a quip about Lisp being blub.
>
> 	Pffft... this is a problem with my understanding of mathematics,
> and not with Common Lisp. 

Jon has confused himself completely, apparently unable to follow
a simple discussion thread.

He's taken the fixed version (which has the conjugate) and braindamaged
it with a static type check.

The original bug was a math problem. 

You wrote some code and claimed that it naturally extends to the complex
numbers. But it had a little problem; it computed the wrong thing.
But your intent was to make it polymorphic.

In the static language it might have been along these lines:

  let normalize z = div z (sqrt (mul z z))

(Never mind that the original was about vectors). Now presumably, div and mul
are (or can be) polymorphic functions. Statically polymorphic, of course.
So normalize will compute something as a complex -> complex
function and as a float -> float function.

The static type system does not save us from anything!
It happily lets us multiply a complex number with itself,
pass it through a complex -> complex square root,
and a complex -> complex -> complex division.

Now the silly claim is that the programmer could have made it like this:

  let normalize z : float = ...

But /why/ would the programmer have done that, when that programmer
believes that he's writing a generic function that /does/ work for
complex numbers and reals?

Moreover, if he recognized that the computation is wrong over the
complex domain, he would just have fixed the computation: ``Ah crap, we
need to multiply by the conjugate, so add (conj x).''

Adding a float declaration is nothing more than using the type
system to document that there is a bug (but the type error is
not that bug, just a fence around the bug!) 

I.e. ``since the wrong value is computed for complex numbers, let's not
actually fix the bug, but instead throw our hands up, give up on the
function being polymorphic, and constrain it to the real domain.''

> I might have ended up doing the conjugate trick, in which case static
> typing would have bought me exactly nothing. I might even have called

In this case, static typing (inferential, with polymorphism)
would have bought exactly the same thing as dynamic typing.

Static typing would /not/ have magically identified the math error, as the Frog
Man falsely claims.

If a float return type declaration had been added, then it would not have
caught the error either. It would have caught the error that the function can't
be used on a complex type, which is not the same error as the function
computing the wrong value!!!

Stupid Home Grenouille succeeded in adding precisely that debilitating
declaration to the /correct polymorphic/ version which has the
conjugate bit that allows it to work for complexes and reals. 
``35 years'' of work on type systems down the drain.  :)

> 	Note that, if I had been using a statically typed language, I
> would probably have specified the return type from the modulus (length)
> operation as complex.

Or if you were using a ``modern'' one, you might simply not have
specified the return type, just like in dynamically typed Lisp.

> 	Moral: if your assumptions are wrong, the end result is likely
> to be wrong. This applies both to my sloppy code, and to your insistence
> that static typing would have stopped me from maaking this error.

If you used static typing, you'd still be racking your brains
about how to coax some behavior out of some program that, as it stands,
you finished and forgot about years ago.
From: Aatu Koskensilta
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <8763lmza1d.fsf@alatheia.dsl.inet.fi>
Kaz Kylheku <········@gmail.com> writes:

> In the static language it might have been along these lines:
> 
>   let normalize z = div z (sqrt mul z z)
> 
> (Never mind that the original was about vectors). Now presumably,
> div and mul are (or can be) polymorphic functions. Statically
> polymorphic, of course.  So normalize will compute something as a
> complex -> complex function and as a float -> float function.
> 
> The static type system does not save us from anything!
> It happily lets us multiply a complex number with itself,
> pass it through a complex -> complex square root,
> and a complex -> complex -> complex division.
> 
> Now the silly claim is that the programmer could have made it like this:
> 
>   let normalize z : float = ...
> 
> But /why/ would programmer programmer have done that, when that
> programmer believes that he's writing a generic function
> that /does/ work for complex numbers and reals?

Because he wants to use the static type system for something useful?
It is a part of the specification of the normalize function that it
returns a real number, a part of the specification that can be
captured in the type of the function so that the compiler can check
the code meets the specification in this respect. Of course, if one
doesn't use the type system for anything useful it will not be of much
use, but this is hardly a profound observation. And indeed for example
in Haskell code it is a common convention to include type
specifications for all top-level functions.

> If a float return type declaration had been added, then it would not
> have caught the error either. It would have caught the error that
> the function can't be used on a complex type, which is not the same
> error as the function computing the wrong value!!!

No, the error reported would have been that the function body does not
typecheck, that is, does not meet the specification provided. 

> If you used static typing, you'd still be racking your brains
> about how to coax some behavior out of some program that, as it stands,
> you finished and forgot about years ago.

A wonderful argument.

-- 
Aatu Koskensilta (················@uta.fi)

"Wovon man nicht sprechen kann, darüber muss man schweigen"
 - Ludwig Wittgenstein, Tractatus Logico-Philosophicus
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <7817bdb4-7fab-4f2f-a3fa-4c52883cece0@c36g2000prc.googlegroups.com>
On Dec 14, 3:06 pm, Aatu Koskensilta <················@uta.fi> wrote:
> [...] And indeed for example
> in Haskell code it is a common convention to include type
> specifications for all top-level functions.
>

And indeed many of those functions in the Haskell prelude are
polymorphic over multiple types.  In a compile time type checking
language, you can correctly define many list operations to work on
lists of any type, and you can correctly define many operations on
numeric lists (or vectors or arrays or matrices or whatever) to work
on both complex and real numeric lists (or whatever).  This would be a
smart thing to do if you were trying to create useful libraries for
matrix and vector operations.  I'm assuming you wouldn't want to cut
and paste the exact same code for Float and Double.  In this
hypothetical library, why would you want to write separate algorithms
for Real matrices when they are almost universally just special cases
of Complex matrices?  Neglecting to conjugate in the appropriate
places would not be caught by the type system.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <v7-dnWbsvuQOK9jUnZ2dnUVZ8rCdnZ2d@posted.plusnet>
Scott wrote:
> On Dec 14, 3:06 pm, Aatu Koskensilta <················@uta.fi> wrote:
>> [...] And indeed for example
>> in Haskell code it is a common convention to include type
>> specifications for all top-level functions.
> 
> And indeed many of those functions in the Haskell prelude are
> polymorphic over multiple types.  In a compile time type checking
> language, you can correctly define many list operations to work on
> lists of any type, and you can correctly define many operations on
> numeric lists (or vectors or arrays or matrices or whatever) to work
> on both complex and real numeric lists (or whatever).  This would be a
> smart thing to do if you were trying to create useful libraries for
> matrix and vector operations.  I'm assuming you wouldn't want to cut
> and paste the exact same code for Float and Double.

Indeed, because the code is different.

> In this 
> hypothetical library, why would you want to write separate algorithms
> for Real matrices when they are almost universally just special cases
> of Complex matrices?

Because that is not true: numerical methods over reals are completely
different from numerical methods over complex numbers. You're trying to
factor out commonality that does not exist.

> Neglecting to conjugate in the appropriate places would not be caught by
> the type system.

z z* has the return type "complex" whereas abs(z)^2 has the return type
real. If static type checking isn't catching that error, you're doing
something wrong.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <921c60b2-4e0f-4817-9f0c-7e906e5bdd39@k24g2000pri.googlegroups.com>
On Dec 14, 6:47 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>
> > In this
> > hypothetical library, why would you want to write separate algorithms
> > for Real matrices when they are almost universally just special cases
> > of Complex matrices?
>
> Because that is not true: numerical methods over reals are completely
> different from numerical methods over complex numbers. You're trying to
> factor out commonality that does not exist.
>

Look at the definitions for everything from dot products to matrix
decompositions over complex numbers.  Change all the zs to xs, and
ignore it whenever they superscript a *, and you've got exactly the
description for dot products and matrix decomposition over real
numbers.

Keep in mind that if Raymond Wiker had included the conjugate in his
definition, it would have worked correctly for a list|vector of real|
complex float|double.  That's eight correct implementations for the
price of one.  I think this constitutes an existence proof for at
least that one simple algorithm.  I claim it's also true for scalar|
vector|matrix addition and subtraction, scalar|matrix multiplication
and division, LU decomposition, QR decomposition, Cholesky
decomposition, Singular Value Decomposition, and others.

[...]
> > Neglecting to conjugate in the appropriate places would not be caught by
> > the type system.
>
> z z* has the return type "complex" whereas abs(z)^2 has the return type
> real. If static type checking isn't catching that error, you're doing
> something wrong.
>

In mathematics, the conjugate operation over the field of complex
numbers is an identity function for values with zero for the imaginary
part.  As such, in a programming language, you can specify a conjugate
function on real values that returns the same real value with type
real.  It does not need to return type complex for type real input.
Both Common Lisp and PLT Scheme have conjugate implemented this way,
it's useful, and you could implement it in many statically typed
languages that way too.  It's a perfectly valid definition, and no I'm
not doing anything wrong.

The negation operator doesn't promote real values to complex.  Why
should conjugate?
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ep6dneX2XNfcZdjUnZ2dnUVZ8rednZ2d@posted.plusnet>
Scott wrote:
> On Dec 14, 6:47 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> > In this
>> > hypothetical library, why would you want to write separate algorithms
>> > for Real matrices when they are almost universally just special cases
>> > of Complex matrices?
>>
>> Because that is not true: numerical methods over reals are completely
>> different from numerical methods over complex numbers. You're trying to
>> factor out commonality that does not exist.
> 
> Look at the definitions for everything from dot products to matrix
> decompositions over complex numbers.  Change all the zs to xs, and
> ignore it whenever they superscript a *, and you've got exactly the
> description for dot products and matrix decomposition over real
> numbers.

Only if you completely ignore numerical analysis, i.e. the fact that we are
not actually dealing with real and complex numbers but finite-precision
machine representations of them with added elements (e.g. infinity, NaN
and -0).

> Keep in mind that if Raymond Wiker had included the conjugate in his
> definition, it would have worked correctly for a list|vector of real|
> complex float|double.

Only for a suitably dubious definition of "worked correctly". In practice,
it gives the wrong answer for 50% of all floating point values!

> That's eight correct implementations for the price of one.

If that were true it would indeed be a wonder to behold. Unfortunately
numerical analysis is a huge and difficult subject riddled with problems
exactly like the one Raymond had.

> I think this constitutes an existence proof for at least that one simple
> algorithm. 

I'm afraid you'll have to find an algorithm even simpler than vector
normalization. I don't doubt that you can but my point is that there is no
practical benefit in factoring out commonality from the tiny number of
algorithms where your dreams come true, e.g. extracting the real part is
numerically robust.

> I claim it's also true for scalar| vector|matrix addition and subtraction,
> scalar|matrix multiplication and division,

While true, the source code is too small to benefit from factoring:

  map2 (+) u v
  map ((+) s) m

> LU decomposition, QR decomposition, Cholesky decomposition, Singular Value
> Decomposition, and others.  

These algorithms all sum a sequence of numbers (even dot product does this).
Doing that robustly with floats entails binning by order of magnitude (e.g.
modulus of the base 2 exponent) to minimize numerical error. That does not
generalize to complex numbers.

> [...]
>> > Neglecting to conjugate in the appropriate places would not be caught
>> > by the type system.
>>
>> z z* has the return type "complex" whereas abs(z)^2 has the return type
>> real. If static type checking isn't catching that error, you're doing
>> something wrong.
> 
> In mathematics, the conjugate operation over the field of complex
> numbers is an identity function for values with zero for the imaginary
> part.  As such, in a programming language, you can specify a conjugate
> function on real values that returns the same real value with type
> real.  It does not need to return type complex for type real input.
> Both Common Lisp and PLT Scheme have conjugate implemented this way,
> it's useful, and you could implement it in many statically typed
> languages that way too.  It's a perfectly valid definition, and no I'm
> not doing anything wrong.
> 
> The negation operator doesn't promote real values to complex.  Why
> should conjugate?

That design leads to the addition of an arbitrary number of no-op functions,
like conjugate over floats, so it is obviously a bad design.

Moreover, many combinations of functions and numerical types are invalid so
your functions will presumably raise exceptions. So they are not just
harmless: they actually become a source of run-time errors.

A much better solution is to provide numerical methods only where they make
sense (i.e. no no-ops) and factor out commonality by parameterization over
functions rather than numeric types. In other words, get rid of the numeric
tower and use higher-order functions instead. The result is a much more
effective factoring and no loss of performance. This is the approach we use
in our F# for Numerics library.
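
For concreteness, a rough OCaml sketch of what parameterizing over
functions (rather than over a numeric tower) can look like. This is
illustrative only; the names are made up and it is not the F# for
Numerics code:

  (* The caller supplies the element-wise operations, so a real dot
     product never sees a conjugate at all and no no-op is needed. *)
  let dot ~add ~mul_conj ~zero u v =
    let acc = ref zero in
    Array.iteri (fun i ui -> acc := add !acc (mul_conj ui v.(i))) u;
    !acc

  (* Real instance: mul_conj is plain multiplication. *)
  let dot_float u v = dot ~add:(+.) ~mul_conj:( *. ) ~zero:0.0 u v

  (* Complex instance: mul_conj conjugates its first argument. *)
  let dot_complex u v =
    dot ~add:Complex.add
      ~mul_conj:(fun a b -> Complex.mul (Complex.conj a) b)
      ~zero:Complex.zero u v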

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <9b08586b-e95e-42cc-9631-41e2f89d7542@a37g2000pre.googlegroups.com>
On Dec 14, 11:27 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Scott wrote:
>
> > Look at the definitions for everything from dot products to matrix
> > decompositions over complex numbers.  Change all the zs to xs, and
> > ignore it whenever they superscript a *, and you've got exactly the
> > description for dot products and matrix decomposition over real
> > numbers.
>
> Only if you completely ignore numerical analysis, i.e. the fact that we are
> not actually dealing with real and complex numbers but finite-precision
> machine representations of them with added elements (e.g. infinity, NaN
> and -0).
>

For what it's worth, I'm familiar with numerical analysis, and I'm
beginning to suspect that you will never admit to being wrong about
anything (even when you are).  Please tell me how a zero in the
imaginary part contributes to rounding error, NaNs, or Infs in any of
addition, subtraction, multiplication, division (assume Smith's
algorithm)...  I'm not confused by your attempts to clutter the
discussion, and I don't think you're as familiar with this as you're
claiming to be.

And besides that, you've still missed the point.  You *can* express
the algorithm as though it is working on complex numbers, but you
either trust the typing system to dispatch to the correct version of
"sum" or "mul" or "conj", or you expect dynamic typing to do
likewise.  You write the code once, and very little, if any, extra
work is done for the real specialization at run time.
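
To make that concrete, a rough OCaml sketch of the "write it once"
idea (the signature and module names are mine, purely illustrative):

  (* The algorithm is written once against an abstract signature; each
     instantiation supplies its own add/mul/conj, and conj is simply
     the identity for reals. *)
  module type NUM = sig
    type t
    val add  : t -> t -> t
    val mul  : t -> t -> t
    val conj : t -> t
    val zero : t
  end

  module SumSq (N : NUM) = struct
    (* sum of x * conj x over a vector, written once for any NUM *)
    let sum_squares v =
      Array.fold_left (fun acc x -> N.add acc (N.mul x (N.conj x))) N.zero v
  end

  module RealSumSq = SumSq (struct
    type t = float
    let add = (+.)
    let mul = ( *. )
    let conj x = x
    let zero = 0.0
  end)

  module ComplexSumSq = SumSq (struct
    type t = Complex.t
    let add = Complex.add
    let mul = Complex.mul
    let conj = Complex.conj
    let zero = Complex.zero
  end)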

> > Keep in mind that if Raymond Wiker had included the conjugate in his
> > definition, it would have worked correctly for a list|vector of real|
> > complex float|double.
>
> Only for a suitably dubious definition of "worked correctly". In practice,
> it gives the wrong answer for 50% of all floating point values!
>

I don't know what you're talking about here.  Maybe you didn't like
his summation... see below.

>
> These algorithms all sum a sequence of numbers (even dot product does this).
> Doing that robustly with floats entails binning by order of magnitude (e.g.
> modulus of the base 2 exponent) to minimize numerical error. That does not
> generalize to complex numbers.
>

There are several algorithms for robustly summing a sequence of
numbers.  I would expect this (Kahan's summation or more
sophisticated) to be handled in a general purpose "sum" function.  And
yes, it does generalize to complex numbers because you can sum the
real and imaginary parts separately.  Create two sets of "binnings" in
your algorithm...
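
To make that concrete, a rough OCaml sketch of the compensated case
(mine, purely illustrative):

  (* Kahan (compensated) summation over floats. *)
  let kahan_sum xs =
    let sum = ref 0.0 and c = ref 0.0 in
    Array.iter
      (fun x ->
         let y = x -. !c in       (* apply the running compensation *)
         let t = !sum +. y in
         c := (t -. !sum) -. y;   (* low-order bits lost when forming t *)
         sum := t)
      xs;
    !sum

  (* The complex case just runs two compensations side by side,
     one for the real parts and one for the imaginary parts. *)
  let kahan_sum_complex (zs : Complex.t array) =
    { Complex.re = kahan_sum (Array.map (fun z -> z.Complex.re) zs);
      Complex.im = kahan_sum (Array.map (fun z -> z.Complex.im) zs) }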

>
> > The negation operator doesn't promote real values to complex.  Why
> > should conjugate?
>
> That design leads to the addition of an arbitrary number of no-op functions,
> like conjugate over floats, so it is obviously a bad design.
>

And in a statically typed language, the compiler would recognize the
no-op and remove it.  I think someone will claim that Common Lisp or
Stalin can do this kind of type inference too.  Your conclusion is
flawed.

> Moreover, many combinations of functions and numerical types are invalid so
> your functions will presumably raise exceptions. So they are not just
> harmless: they actually become a source of run-time errors.

You've got divide by zeros with complex numbers too.  Singular or ill-
conditioned matrices and the like are not going to be avoided with
static type checking.  I don't see how you can stop a user at run time
from passing in a real matrix that could lead to the square root of
a negative number, and I don't see how limiting your implementation to
reals is going to help avoid that problem.  Run time exceptions are a
fact of life.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <laudneAJHfbdTtvUnZ2dnUVZ8vydnZ2d@posted.plusnet>
Scott wrote:
> Please tell me how a zero in the imaginary part contributes to rounding
> error, NaNs, or Infs in any of addition, subtraction, multiplication,
> division (assume Smith's algorithm)...

Another strawman argument.

> And besides that, you've still missed the point.  You *can* express
> the algorithm as though it is working on complex numbers...

No. Trying to factor by number representation is fruitless.

> , but you 
> either trust the typing system to dispatch to the correct version of
> "sum" or "mul" or "conj", or you expect dynamic typing to do
> likewise.  You write the code once, and very little, if any, extra
> work is done for the real specialization at run time.

No. The extra work in your design goes into implementing no-op functions and
error functions that should not even exist. That is obviously a waste of
time. But the real problem is that you end up with broken algorithms
because you believed the numeric tower would magically solve your problems
when, in fact, it does not.

>> These algorithms all sum a sequence of numbers (even dot product does
>> this). Doing that robustly with floats entails binning by order of
>> magnitude (e.g. modulus of the base 2 exponent) to minimize numerical
>> error. That does not generalize to complex numbers.
> 
> There are several algorithms for robustly summing a sequence of
> numbers.  I would expect this (Kahan's summation or more
> sophisticated) to be handled in a general purpose "sum" function.  And
> yes, it does generalize to complex numbers because you can sum the
> real and imaginary parts separately.  Create two sets of "binnings" in
> your algorithm...

You need to distinguish between the different kinds of summation algorithm.

The earlier "compensation" algorithms are generalized correctly by a numeric
tower as you say. However, they have unbounded worst-case error and, in
particular, suffer catastrophic loss of precision when the input numbers
vary significantly in magnitude.

Those algorithms were superseded by "distillation" summation algorithms that
bin by exponent in order to provide useful worst-case error bounds. Floats
bin by exponent just fine, of course, but complex numbers do not.
Consequently, trying to generalize these practically-useful summation
functions using a numeric tower only results in broken code.

Hence real numerical methods do not generalize as you claim.

Note that there is commonality that you can factor out using functions but
not by generalizing over numeric type, which was my original point.
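
To illustrate what "binning" means here, a deliberately toy OCaml
sketch of the idea only (not the algorithm used by Mathematica, MKL or
any real library): addends are grouped by their binary exponent, each
bin is summed, and the bins are then combined from small to large. The
binning key is the float's own exponent, which is exactly the thing
that has no analogue for a complex number taken as a whole.

  let binned_sum xs =
    let bins = Hashtbl.create 64 in
    Array.iter
      (fun x ->
         let (_, e) = frexp x in                 (* exponent of x *)
         let prev = try Hashtbl.find bins e with Not_found -> 0.0 in
         Hashtbl.replace bins e (prev +. x))
      xs;
    let pairs = Hashtbl.fold (fun e s acc -> (e, s) :: acc) bins [] in
    let pairs = List.sort compare pairs in       (* small exponents first *)
    List.fold_left (fun acc (_, s) -> acc +. s) 0.0 pairs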

>> > The negation operator doesn't promote real values to complex.  Why
>> > should conjugate?
>>
>> That design leads to the addition of an arbitrary number of no-op
>> functions, like conjugate over floats, so it is obviously a bad design.
> 
> And in a statically typed language, the compiler would recognize the
> no-op and remove it.

That would require whole-program optimization or the inlining of huge
higher-order functions so it is actually highly unlikely to occur.

> I think someone will claim that Common Lisp or Stalin can do this kind of
> type inference too.

Stalin should because it is a whole-program optimizing compiler but Stalin
is not representative of ordinary compilers. I doubt any existing CL
implementations will. Neither will OCaml. F# could be forced to inline such
algorithms manually and the CLR might make the optimization but I would not
rely upon it.

> Your conclusion is flawed. 

Note that I was referring to the programmer having to write out an arbitrary
number of no-op functions for no practical gain whatsoever. I had not even
begun to point out the fact that it would also cripple the performance of
your core numerical code.

>> Moreover, many combinations of functions and numerical types are invalid
>> so your functions will presumably raise exceptions. So they are not just
>> harmless: they actually become a source of run-time errors.
> 
> You've got divide by zeros with complex numbers too.  Singular or ill-
> conditioned matrices and the like are not going to be avoided with
> static type checking.

Yet another strawman argument.

> I don't see how you can stop a user at run time 
> from passing in a real matrix that could to lead to the square root of
> a negative number, and I don't see how limiting your implementation to
> reals is going to help avoid that problem.  Run time exceptions are a
> fact of life.

The existence of a problem does not justify making it worse.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081231151933.36@gmail.com>
On 2008-12-15, Scott <·······@gmail.com> wrote:
> On Dec 14, 11:27 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> Scott wrote:
>>
>> > Look at the definitions for everything from dot products to matrix
>> > decompositions over complex numbers.  Change all the zs to xs, and
>> > ignore it whenever they superscript a *, and you've got exactly the
>> > description for dot products and matrix decomposition over real
>> > numbers.
>>
>> Only if you completely ignore numerical analysis, i.e. the fact that we are
>> not actually dealing with real and complex numbers but finite-precision
>> machine representations of them with added elements (e.g. infinity, NaN
>> and -0).
>>
>
> For what it's worth, I'm familiar with numerical analysis, and I'm
> beginning to suspect that you will never admit to being wrong about
> anything (even when you are).  Please tell me how a zero in the
> imaginary part contributes to rounding error, NaNs, or Infs in any of
> addition, subtraction, multiplication, division (assume Smith's
> algorithm)...

He's right that multiplying a number by its conjugate and then taking the
square root is not the best way to obtain the absolute value of an imaginary
number. Doing it this way is equivalent to just 
(sqrt (+ (* real real) (* imag imag))).

The better ways all treat the complex as a two-element vector of which they
compute the length (i.e. the Euclidean hypotenuse).

However, how badly it sucks, if at all, to just do (sqrt (* x (conjugate x)))
depends on the application.

You know, sometimes in numeric calculations it's even good enough to use 3.14
for the value of pi.  It depends on the requirements.

If you hack up your own vector norm in your program, you don't necessarily have
to code it to the standard that would be required, say, of a widely used
library that's going to be shipped with the OS.  It just has to meet the
requirements of your program.

If you already have such a library, the most productive thing is just to use
it. If you don't have it, the most productive thing is to code a
straightforward calculation which is no more numerically perfect than your
application requires.
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <75bfc6fb-255b-4cc2-9307-b35dd84ee07d@r37g2000prr.googlegroups.com>
On Dec 15, 4:18 pm, Kaz Kylheku <········@gmail.com> wrote:
>
> He's right that multiplying a number by its conjugate and then taking the
> square root is not the best way to obtain the absolute value of an imaginary
> number. Doing it this way is equivalent to just
> (sqrt (+ (* real real) (* imag imag))).
>

I must not communicate very clearly.  I understand that a well written
hypotenuse function does not do it this way because it needs to avoid
overflow and underflow.  But for the normalize function that started
this thread, given a vector/list z, you need:

   (div z (sqrt (sum (square (hypot z)))))

And I suspect that is actually worse than:

   (div z (sqrt (sum (mul (conj z) z))))
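
Concretely, for a single element (an OCaml aside, purely illustrative):

  (* Two ways to get |z|^2.  The first is just re^2 + im^2; the second
     carefully avoids intermediate overflow inside Complex.norm and
     then squares the result anyway, so once the square is taken the
     extra care arguably buys nothing and adds a few more roundings. *)
  let abs2_direct    (z : Complex.t) = Complex.norm2 z
  let abs2_via_hypot (z : Complex.t) = let a = Complex.norm z in a *. a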

I think the whole business about numerical analysis was an attempt to
clutter the topic about typing.
From: Tamas K Papp
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6qo8baFddhh6U2@mid.individual.net>
On Mon, 15 Dec 2008 23:18:14 +0000, Kaz Kylheku wrote:

> He's right that multiplying a number by its conjugate and then taking
> the square root is not the best way to obtain the absolute value of an
> imaginary number. Doing it this way is equivalent to just (sqrt (+ (*
> real real) (* imag imag))).
> 
> The better ways all treat the complex as a two-element vector of which
> they compute the length (i.e. the Euclidean hypotenuse).

Sorry but I don't understand.  The Euclidean norm is exactly the 
(sqrt ...) formula you wrote above.

Tamas
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <-NedneBJwOBlg9rUnZ2dnUVZ8qfinZ2d@posted.plusnet>
Tamas K Papp wrote:
> On Mon, 15 Dec 2008 23:18:14 +0000, Kaz Kylheku wrote:
>> He's right that multiplying a number by its conjugate and then taking
>> the square root is not the best way to obtain the absolute value of an
>> imaginary number. Doing it this way is equivalent to just (sqrt (+ (*
>> real real) (* imag imag))).
>> 
>> The better ways all treat the complex as a two-element vector of which
>> they compute the length (i.e. the Euclidean hypotenuse).
> 
> Sorry but I don't understand.  The Euclidean norm is exactly the
> (sqrt ...) formula you wrote above.

I think this is the same confusion between mathematics and numerics. Your
statement is mathematically correct but it ignores the downsides of the
extra numeric computations involved in squaring a robustly-calculated norm
which, I believe, is what Scott was referring to.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081230220711.862@gmail.com>
On 2008-12-15, Jon Harrop <···@ffconsultancy.com> wrote:
> Scott wrote:
>> That's eight correct implementations for the price of one.
>
> If that were true it would indeed be a wonder to behold. Unfortunately
> numerical analysis is a huge and difficult subject riddled with problems
> exactly like the one Raymond had.

Computing the wrong result using a completely wrong formula is a problem,
but not a problem in the domain of numerical analysis.
From: Nicolas Neuss
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <87abaxvg19.fsf@ma-patru.mathematik.uni-karlsruhe.de>
Jon Harrop <···@...> writes:

>> LU decomposition, QR decomposition, Cholesky decomposition, Singular
>> Value Decomposition, and others.
>
> These algorithms all sum a sequence of numbers (even dot product does
> this).  Doing that robustly with floats entails binning by order of
> magnitude (e.g.  modulus of the base 2 exponent) to minimize numerical
> error. That does not generalize to complex numbers.

Binning?  If I interpret correctly what you mean, do you really believe
that numerical routines (let's take LAPACK, BLAS) do it?  If yes, which
one?

Nicolas
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Xq-dnancKbbnA9vUnZ2dnUVZ8jidnZ2d@posted.plusnet>
Nicolas Neuss wrote:
> Jon Harrop <···@...> writes:
>>> LU decomposition, QR decomposition, Cholesky decomposition, Singular
>>> Value Decomposition, and others.
>>
>> These algorithms all sum a sequence of numbers (even dot product does
>> this).  Doing that robustly with floats entails binning by order of
>> magnitude (e.g.  modulus of the base 2 exponent) to minimize numerical
>> error. That does not generalize to complex numbers.
> 
> Binning?  If I interpret correctly what you mean, do you really believe
> that numerical routines (let's take LAPACK, BLAS) do it?  If yes, which
> one?

The ones in Mathematica do. See the documentation for "Total", for example.
IIRC, Intel's MKL uses a similar technique.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081231094249.131@gmail.com>
On 2008-12-15, Jon Harrop <···@ffconsultancy.com> wrote:
> Scott wrote:
>> On Dec 14, 6:47 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>>> > In this
>>> > hypothetical library, why would you want to write separate algorithms
>>> > for Real matrices when they are almost universally just special cases
>>> > of Complex matrices?
>>>
>>> Because that is not true: numerical methods over reals are completely
>>> different from numerical methods over complex numbers. You're trying to
>>> factor out commonality that does not exist.
>> 
>> Look at the definitions for everything from dot products to matrix
>> decompositions over complex numbers.  Change all the zs to xs, and
>> ignore it whenever they superscript a *, and you've got exactly the
>> description for dot products and matrix decomposition over real
>> numbers.
>
> Only if you completely ignore numerical analysis, i.e. the fact that we are
> not actually dealing with real and complex numbers but finite-precision
> machine representations of them with added elements (e.g. infinity, NaN
> and -0).
>
>> Keep in mind that if Raymond Wiker had included the conjugate in his
>> definition, it would have worked correctly for a list|vector of real|
>> complex float|double.
>
> Only for a suitably dubious definition of "worked correctly". In practice,
> it gives the wrong answer for 50% of all floating point values!
>
>> That's eight correct implementations for the price of one.
>
> If that were true it would indeed be a wonder to behold. Unfortunately
> numerical analysis is a huge and difficult subject riddled with problems
> exactly like the one Raymond had.
>
>> I think this constitutes an existence proof for at least that one simple
>> algorithm. 
>
> I'm afraid you'll have to find an algorithm even simpler than vector
> normalization. I don't doubt that you can but my point is that there is no
> practical benefit in factoring out commonality from the tiny number of
> algorithms where your dreams come true, e.g. extracting the real part is
> numerically robust.
>
>> I claim it's also true for scalar| vector|matrix addition and subtraction,
>> scalar|matrix multiplication and division,
>
> While true, the source code is too small to benefit from factoring:
>
>   map2 (+) u v
>   map ((+) s) m
>
>> LU decomposition, QR decomposition, Cholesky decomposition, Singular Value
>> Decomposition, and others.  
>
> These algorithms all sum a sequence of numbers (even dot product does this).
> Doing that robustly with floats entails binning by order of magnitude (e.g.

Aha, he's been studying the BSD source file e_hypot.c.

> modulus of the base 2 exponent) to minimize numerical error. That does not
> generalize to complex numbers.

You've drifted way far from your original thesis that an error would have been
detected by static typing.

This is an extremely far-fetched proposition which only illustrates your
insanity.

It simply makes no sense.

The truth is that the generic code could easily work in a type-inferenced
static language, without compiler errors. If the programmer's intent was to
write the code polymorphically, he wouldn't even dream of adding the type
declaration.

You're trying to convince people that the programmer, who had intended
to write a polymorphic function in the naive way, would have somehow put
in that type declaration anyway.  He would then have tried the function with a
complex argument, and reacted to the ensuing compile-time type error with a
``eureka!'' moment: ``This type error is telling me that my numerical
methods are not robust; I must treat complex numbers separately, and do binning
by order of magnitude to ensure accuracy. Oh, and I have the wrong calculation
there, I missed a conjugate''.

Wow, we sure can milk a lot out of a simple type mismatch!

In reality, what would have happened is that the programmer would have removed
the declaration. ``Hey, by declaring a float return type, I'm preventing this
function from being applicable to complex numbers. It's not polymorphic
at all like I intended. If I take it out, it will Just Work. Of course, the
result will be a real number disguised as a complex with a zero imaginary part,
but I don't care. It walks like a duck, etc.'' Then he might actually try it
and notice that normalizing, say, 1 + i comes back with an imaginary part
that is essentially zero, when it should be nowhere near zero.  Only then would he realize
that the calculation is wrong, and needs a conjugate.
From: Raymond Wiker
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <m21vwa8kkt.fsf@RAWMBP.local>
Aatu Koskensilta <················@uta.fi> writes:

> Kaz Kylheku <········@gmail.com> writes:
>
>> In the static language it might have been along these lines:
>> 
>>   let normalize z = div z (sqrt mul z z)
>> 
>> (Never mind that the original was about vectors). Now presumably,
>> div and mul are (or can be) polymorphic functions. Statically
>> polymorphic, of course.  So normalize will compute something as a
>> complex -> complex function and as a float -> float function.
>> 
>> The static type system does not save us from anything!
>> It happily lets us multiply a complex number with itself,
>> pass it through a complex -> complex square root,
>> and a complex -> complex -> complex division.
>> 
>> Now the silly claim is that the programmer could have made it like this:
>> 
>>   let normalize z : float = ...
>> 
>> But /why/ would the programmer have done that, when that
>> programmer believes that he's writing a generic function
>> that /does/ work for complex numbers and reals?
>
> Because he wants to use the static type system for something useful?
> It is a part of the specification of the normalize function that it
> returns a real number, a part of the specification that can be
> captured in the type of the function so that the compiler can check
> the code meets the specification in this respect. 

	No, the normalize function returns a vector of the same length
as the original vector, with each element scaled down by the
length (modulus) of the original vector. The function that computes
the length (or modulus) of the vector can *probably* be defined to
return real-valued results - I'm not 100% sure what sort of behaviour
one should expect when trying to normalize a vector where one or more
of the elements are complex :-) 

> Of course, if one
> doesn't use the type system for anything useful it will not be of much
> use, but this is hardly a profound observation. And indeed for example
> in Haskell code it is a common convention to include type
> specifications for all top-level functions.

	Well, in this case, the error was not in a top-level
function... 
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <XKWdnSMvjsk5NdjUnZ2dnUVZ8r-dnZ2d@posted.plusnet>
Raymond Wiker wrote:
> Aatu Koskensilta <················@uta.fi> writes:
>> Because he wants to use the static type system for something useful?
>> It is a part of the specification of the normalize function that it
>> returns a real number, a part of the specification that can be
>> captured in the type of the function so that the compiler can check
>> the code meets the specification in this respect.
> 
> No, the normalize function returns a vector of the same length
> as the original vector, with each element scaled down by the
> length (modulus) of the original vector. The function that computes
> the length (or modulus) of the vector can *probably* be defined to
> return real-valued results - I'm not 100% sure what sort of behaviour
> one should expect when trying to normalize a vector where one or more
> of the elements are complex :-)

You want the l^2 norm, which is:

  Sqrt[Sum[Abs[z]^2]]

See:

  http://mathworld.wolfram.com/L2-Norm.html

>> Of course, if one
>> doesn't use the type system for anything useful it will not be of much
>> use, but this is hardly a profound observation. And indeed for example
>> in Haskell code it is a common convention to include type
>> specifications for all top-level functions.

Similarly in SML, OCaml and F#, it is convention to put the type of all
definitions in the module signature.

> Well, in this case, the error was not in a top-level function...

The "norm" function is generally-applicable and, consequently, should have
been a top-level function as well. Moreover, its numerically-robust
definition is non-trivial (from the OCaml stdlib):

let norm x =
  (* Watch out for overflow in computing re^2 + im^2 *)
  let r = abs_float x.re and i = abs_float x.im in
  if r = 0.0 then i
  else if i = 0.0 then r
  else if r >= i then
    let q = i /. r in r *. sqrt(1.0 +. q *. q)
  else
    let q = r /. i in i *. sqrt(1.0 +. q *. q)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <1be40685-a0ad-4fd0-9f11-f1c2e5ab2c64@w39g2000prb.googlegroups.com>
On Dec 14, 5:48 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>
> The "norm" function is generally-applicable and, consequently, should have
> been a top-level function as well. Moreover, its numerically-robust
> definition is non-trivial (from the OCaml stdlib):
>
> let norm x =
>   (* Watch out for overflow in computing re^2 + im^2 *)
>   let r = abs_float x.re and i = abs_float x.im in
>   if r = 0.0 then i
>   else if i = 0.0 then r
>   else if r >= i then
>     let q = i /. r in r *. sqrt(1.0 +. q *. q)
>   else
>     let q = r /. i in i *. sqrt(1.0 +. q *. q)
>

But if the first thing you're going to do after you calculate the norm
is to square it, I'm not sure you've gained much by protecting against
overflow/underflow.

Assuming that the conjugate of a real returned type real, where does
this:

    z/Sqrt[Sum[Abs[z]^2]]

work, where this:

    z/Sqrt[Sum[Conj[z]*z]]

doesn't?

I'll take it for granted that Mathematica's "Conjugate" always returns
type complex, but other systems (Common Lisp, PLT Scheme, Haskell, C++,
etc...) could return the type of their input...
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <n_-dnQp_NKIPd9jUnZ2dnUVZ8sPinZ2d@posted.plusnet>
Scott wrote:
> On Dec 14, 5:48 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>>
>> The "norm" function is generally-applicable and, consequently, should
>> have been a top-level function as well. Moreover, its numerically-robust
>> definition is non-trivial (from the OCaml stdlib):
>>
>> let norm x =
>>   (* Watch out for overflow in computing re^2 + im^2 *)
>>   let r = abs_float x.re and i = abs_float x.im in
>>   if r = 0.0 then i
>>   else if i = 0.0 then r
>>   else if r >= i then
>>     let q = i /. r in r *. sqrt(1.0 +. q *. q)
>>   else
>>     let q = r /. i in i *. sqrt(1.0 +. q *. q)
> 
> But if the first thing you're going to do after you calculate the norm
> is to square it, I'm not sure you've gained much by protecting against
> overflow/underflow.

You've gained a general-purpose "norm" function that works correctly.

> I'll take it for granted that Mathematica's "Conjugate" always returns
> type complex...

That is incorrect: everything in Mathematica is an expression.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <e1872b8a-8cd7-4a6f-917a-fe11caabee00@x16g2000prn.googlegroups.com>
On Dec 14, 10:29 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Scott wrote:
>
> > But if the first thing you're going to do after you calculate the norm
> > is to square it, I'm not sure you've gained much by protecting against
> > overflow/underflow.
>
> You've gained a general-purpose "norm" function that works correctly.
>

Ok, but in the context of the "normalize" function that returns a unit
vector, your version has done N square roots and divisions for no
appreciable benefit.  You'll still overflow in the same situations.
And I'm not certain, but I think you introduced additional floating
point rounding error.

> > I'll take it for granted that Mathematica's "Conjugate" always returns
> > type complex...
>
> That is incorrect: everything in Mathematica is an expression.
>

Fair enough.  Thank you for the clarification.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <8rydncKLxdP7ndvUnZ2dnUVZ8oWdnZ2d@posted.plusnet>
Scott wrote:
> On Dec 14, 10:29 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> Scott wrote:
>>
>> > But if the first thing you're going to do after you calculate the norm
>> > is to square it, I'm not sure you've gained much by protecting against
>> > overflow/underflow.
>>
>> You've gained a general-purpose "norm" function that works correctly.
> 
> Ok, but in the context of the "normalize" function that returns a unit
> vector, your version has done N square roots and divisions for no
> appreciable benefit. You'll still overflow in the same situations.

Yes. I missed the "norm2" function in OCaml's stdlib but, even then,
normalize is still prone to overflow. Perhaps a better solution would be to
normalize by the largest element as you go, something like this (off the
top of my head):

  let metric (z: complex) =
    max (abs z.i) (abs z.r)

  let normalize r =
    let mutable l = 0.0
    let mutable d = 1.0
    for z in r do
      let d' = metric z
      if d' > 2.0 * d then
        l <- d / d' * l
        d <- d'
      l <- l + norm2(1/d * z)
    map (( * ) (d / il) r

That should be near optimal in terms of both performance and reliability.
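
Spelling that idea out so that it compiles (OCaml rather than F#, and
still only a sketch, roughly along the lines of LAPACK's dlassq: keep a
running scale d and a scaled sum of squares, rescaling whenever a
larger element turns up):

  let metric (z : Complex.t) =
    max (abs_float z.Complex.re) (abs_float z.Complex.im)

  let normalize (r : Complex.t array) =
    let d = ref 0.0 and ssq = ref 0.0 in
    Array.iter
      (fun z ->
         let a = metric z in
         if a > 0.0 then begin
           if a > !d then begin
             (* convert the accumulated sum of squares to the new scale *)
             if !d > 0.0 then ssq := !ssq *. (!d /. a) *. (!d /. a);
             d := a
           end;
           ssq := !ssq +. Complex.norm2 { Complex.re = z.Complex.re /. !d;
                                          im = z.Complex.im /. !d }
         end)
      r;
    (* the norm is d * sqrt ssq; a zero vector still divides by zero here *)
    let n = !d *. sqrt !ssq in
    Array.map (fun z -> { Complex.re = z.Complex.re /. n;
                          im = z.Complex.im /. n }) r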

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Tuqdnb6mWIQFnNvUnZ2dnUVZ8gydnZ2d@posted.plusnet>
Jon Harrop wrote:
>   let metric (z: complex) =
>     max (abs z.i) (abs z.r)
> 
>   let normalize r =
>     let mutable l = 0.0
>     let mutable d = 1.0
>     for z in r do
>       let d' = metric z
>       if d' > 2.0 * d then
>         l <- d / d' * l
>         d <- d'
>       l <- l + norm2(1/d * z)

Actually this algorithm can be sped up by skipping that previous line when
d' < MACH_EPS * l because we know the contribution to the sum after the
costly norm2 will be insignificant anyway.

>     map (( * ) (d / il) r

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081230223806.905@gmail.com>
On 2008-12-15, Jon Harrop <···@ffconsultancy.com> wrote:
> The "norm" function is generally-applicable and, consequently, should have
> been a top-level function as well. Moreover, its numerically-robust
> definition is non-trivial (from the OCaml stdlib):
>
> let norm x =
>   (* Watch out for overflow in computing re^2 + im^2 *)
>   let r = abs_float x.re and i = abs_float x.im in
>   if r = 0.0 then i
>   else if i = 0.0 then r
>   else if r >= i then
>     let q = i /. r in r *. sqrt(1.0 +. q *. q)
>   else
>     let q = r /. i in i *. sqrt(1.0 +. q *. q)

This classifies as quite trivial.

I suggest you look at a truly non-trivial way of computing a hypotenuse, such as:

http://www.openbsd.org/cgi-bin/cvsweb/src/lib/libm/src/e_hypot.c?rev=1.4;content-type=text%2Fplain
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ItWdnQ7i84i4PNvUnZ2dnUVZ8jSdnZ2d@posted.plusnet>
Kaz Kylheku wrote:
> I suggest you look at the truly non-trivial way of computing a hypotenuse,
> such as:
> 
>
http://www.openbsd.org/cgi-bin/cvsweb/src/lib/libm/src/e_hypot.c?rev=1.4;content-type=text%2Fplain

Nice. :-)

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081231154418.612@gmail.com>
On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
> Raymond Wiker wrote:
>> Scott <·······@gmail.com> writes:
>>> On Dec 10, 2:07 pm, Raymond Wiker <····@RawMBP.local> wrote:
>>>>
>>>> In Common Lisp:
>>>>
>>>> (defun normalize (list-or-vector)
>>>> (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>>> list-or-vector))))) (map (type-of list-or-vector) (lambda (x) (/ x l))
>>>> list-or-vector)))
>>>>
>>>> As a bonus, this works with lists or vectors; it also works with
>>>> complex numbers.
>>>>
>>>
>>> I think you want to throw a (conjugate x) in there for it to give you
>>> the correct answer for complex numbers...
>> 
>> You may well be right, but I cannot quite see why... then
>> again, I have trouble visualising exactly what a multidimensional
>> vector with complex components is :-)
>
> This has nothing to do with vectors. You needed the squared magnitude |x|^2
> to handle the complex case but you wrote x^2 which is incorrect for complex
> numbers in general.

By the way, it's not obvious that computing the magnitude (even if by a very
good absolute value function) and then squaring it is actually better.

On the surface, it seems very stupid.  You're taking a square root (inside the
absolute value function) and then immediately squaring it again, adding up all
the squares and taking another square root.

The way the code is above, only one square root is done. Until that point, it's
all just multiplications and additions.

If I were to superficially guess which approach is more precise (never mind
faster!) I'd go with the one that performs fewer square root operations.

I agree that for computing the magnitude of a complex number, it's best to use
a library function rather than reinvent the wheel. We have that function. It's
called ABS and works on complexes and reals alike.

If you want the perfect vector norm, you have to code that using a
generalization of the techniques that are applied to complex numbers.

Note that if we have some complex number #c(<real> <imag>) then the product of
that with its conjugate is just <real> * <real> + <imag> * <imag>.  Now we are
just adding these over the vector, and then taking the square root.  This means
that the norm of a complex vector of length N can simply be regarded as the
norm of a vector of reals of length 2N, which is formed by exploding the real
and imaginary components.  The resulting normal vector of length 2N is then
compacted back to a vector of length N, where consecutive pairs of real numbers
are reinterpreted as complex numbers.

Visually:

  (c-norm #(c0 c1 ... cn-1))

     = (complex-pack (real-norm (complex-unpack #(c0 c1 ... cn-1))))

  (complex-unpack #(c0 c1 ... cn-1))

     = `#(,@(explode c0) ,@(explode c1) ... ,@(explode cn-1))

  (explode c) = `(,(real-part c) ,(imag-part c))

  (complex-pack #(x0 x1 x2 x3 ... x2n-1))

     = `#(,(complex x0 x1) ... ,(complex x2n-2 x2n-1))

Where the ci are complex numbers and the xi are their real and imaginary
components, taken in order.

It would appear that the means to a good complex vector normalize is through a
good real vector normalize.

You want to avoid doing any silly sub-normalizing among the pairs of numbers,
but treat all of the complex and imaginary components globally.

For instance if the norm function needs to sort the numbers in order of
magnitude, it should be done without regard for their pairing in the
original complex numbers.
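
A small OCaml rendering of the above (the names are mine, and the
real-vector normalize shown is a deliberately naive placeholder; the
point is only the unpack/normalize/pack factoring):

  let complex_unpack (v : Complex.t array) =
    Array.init (2 * Array.length v)
      (fun i ->
         let z = v.(i / 2) in
         if i mod 2 = 0 then z.Complex.re else z.Complex.im)

  let complex_pack (xs : float array) =
    Array.init (Array.length xs / 2)
      (fun i -> { Complex.re = xs.(2 * i); im = xs.(2 * i + 1) })

  (* Any numerically careful real-vector normalize can be dropped in here. *)
  let real_normalize xs =
    let l = sqrt (Array.fold_left (fun acc x -> acc +. x *. x) 0.0 xs) in
    Array.map (fun x -> x /. l) xs

  let c_normalize v = complex_pack (real_normalize (complex_unpack v))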
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <2MednSeCPskFv9XUnZ2dnUVZ8vidnZ2d@posted.plusnet>
Kaz Kylheku wrote:
> You want to avoid doing any silly sub-normalizing among the pairs of
> numbers, but treat all of the complex and imaginary components globally.

Yes, good idea. This pushes the difficult task of writing numerically robust
code into a real-specific implementation.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <xPSdnZnTRPdJHt7UnZ2dnUVZ8uednZ2d@posted.plusnet>
Raymond Wiker wrote:
> In Common Lisp:
> 
> (defun normalize (list-or-vector)
>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>   list-or-vector)))))
>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
> 
> As a bonus, this works with lists or vectors; it also works with
> complex numbers.
> 
> Since this is Common Lisp, it is also possible to extend this (naive)
> implementation so that it performs as much as possible at
> compile-time, possibly replacing calls with the computed result.
> 
> Stick that in Mathematica's (and Ruby's) pipe and smoke it!

In fact, I think your Lisp code is broken with respect to complex numbers
because you don't take their absolute value. No doubt you'd have spotted
that error immediately had you been using a statically typed language...

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081228220530.179@gmail.com>
On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
> Raymond Wiker wrote:
>> In Common Lisp:
>> 
>> (defun normalize (list-or-vector)
>>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>   list-or-vector)))))
>>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
>> 
>> As a bonus, this works with lists or vectors; it also works with
>> complex numbers.
>> 
>> Since this is Common Lisp, it is also possible to extend this (naive)
>> implementation so that it performs as much as possible at
>> compile-time, possibly replacing calls with the computed result.
>> 
>> Stick that in Mathematica's (and Ruby's) pipe and smoke it!
>
> In fact, I think your Lisp code is broken with respect to complex numbers
> because you don't take their absolute value. No doubt you'd have spotted
> that error immediately had you been using a statically typed language...

I don't see how. All the operations are defined for the complex values. You can
multiply them with themselves, and take their square root, divide, etc.

Static polymorphism could make this work too, unless you were missing
something, like an overload of square root for complex numbers. No?

Still, I don't see why the compiler would find a complex number problem if you
aren't actually using complex numbers anywhere.

My math is rusty, but can't we fix this by multiplying the complex x with its
conjugate, rather than just itself?

(Hey, we have it, and it's called conjugate! I just tab-completed on it at the
REPL. CLHS confirms that it's standard).

I.e.:

  (lambda (x) (* x (conjugate x))) ;; same as (* x x) for reals
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <TfCdnW7bIMENK97UnZ2dnUVZ8sPinZ2d@posted.plusnet>
Kaz Kylheku wrote:
> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>> Raymond Wiker wrote:
>>> In Common Lisp:
>>> 
>>> (defun normalize (list-or-vector)
>>>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>>   list-or-vector)))))
>>>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
>>> 
>>> As a bonus, this works with lists or vectors; it also works with
>>> complex numbers.
>>> 
>>> Since this is Common Lisp, it is also possible to extend this (naive)
>>> implementation so that it performs as much as possible at
>>> compile-time, possibly replacing calls with the computed result.
>>> 
>>> Stick that in Mathematica's (and Ruby's) pipe and smoke it!
>>
>> In fact, I think your Lisp code is broken with respect to complex numbers
>> because you don't take their absolute value. No doubt you'd have spotted
>> that error immediately had you been using a statically typed language...
> 
> I don't see how. All the operations are defined for the complex values.
> You can multiply them with themselves, and take their square root, divide,
> etc.
> 
> Static polymorphism could make this work too, unless you were missing
> something, like an overload of square root for complex numbers. No?

Static type inference would tell you that the return type for a complex
vector is a complex when it should be a real. That is indicative of the
error in Raymond's code.

> Still, I don't see why the compiler would find a complex number problem if
> you aren't actually using complex umbers anywhere.
> 
> My math is rusty, but can't we fix this by multiplying the complex x with
> its conjugate, rather than just itself?
> 
> (Hey, we have it, and it's called conjugate! I just tab-completed on it at
> the REPL. CLHS confirms that it's standard).
> 
> I.e.:
> 
>   (lambda (x) (* x (conjugate x))) ;; same as (* x x) for reals

That returns a complex that always has a zero imaginary component. Better to
use abs and get a real-valued result.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081229093834.556@gmail.com>
On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
> Kaz Kylheku wrote:
>> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>>> Raymond Wiker wrote:
>>>> In Common Lisp:
>>>> 
>>>> (defun normalize (list-or-vector)
>>>>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>>>   list-or-vector)))))
>>>>     (map (type-of list-or-vector) (lambda (x) (/ x l)) list-or-vector)))
>>>> 
>>>> As a bonus, this works with lists or vectors; it also works with
>>>> complex numbers.
>>>> 
>>>> Since this is Common Lisp, it is also possible to extend this (naive)
>>>> implementation so that it performs as much as possible at
>>>> compile-time, possibly replacing calls with the computed result.
>>>> 
>>>> Stick that in Mathematica's (and Ruby's) pipe and smoke it!
>>>
>>> In fact, I think your Lisp code is broken with respect to complex numbers
>>> because you don't take their absolute value. No doubt you'd have spotted
>>> that error immediately had you been using a statically typed language...
>> 
>> I don't see how. All the operations are defined for the complex values.
>> You can multiply them with themselves, and take their square root, divide,
>> etc.
>> 
>> Static polymorphism could make this work too, unless you were missing
>> something, like an overload of square root for complex numbers. No?
>
> Static type inference would tell you that the return type for a complex
> vector is a complex when it should be a real.

The return type of what? There are a few subexpressions in there
which have a return type.

> error in Raymond's code.
>
>> Still, I don't see why the compiler would find a complex number problem if
>> you aren't actually using complex numbers anywhere.
>> 
>> My math is rusty, but can't we fix this by multiplying the complex x with
>> its conjugate, rather than just itself?
>> 
>> (Hey, we have it, and it's called conjugate! I just tab-completed on it at
>> the REPL. CLHS confirms that it's standard).
>> 
>> I.e.:
>> 
>>   (lambda (x) (* x (conjugate x))) ;; same as (* x x) for reals
>
> That returns a complex that always has a zero imaginary component.

Is that necessarily so? There could be support for
complex numbers constructed from exact components:

bash$ clisp -q
[1]> #c(1 1)
#C(1 1)
[2]> (* #c(1 1) (conjugate #c(1 1)))
2
[3]> (type-of (* #c(1 1) (conjugate #c(1 1))))
(INTEGER 0 16777215)

How about the conjugate of a real number?

[4]> (conjugate (sin 3.0))
0.14112

``The conjugate of a real is itself.'' -ANSI CL.

Obviously, Lisp is not a language that you know.

Mathematics isn't statically typed. The type of a mathematical expression
depends on what is going on in that expression.  4 / 3 is a rational number,
but 4 / 2 is an integer.  If these are variables x / y, then you don't know
what the exact type is, even if you individually know X and Y to be integers.
You could blindly assign the type rational to  x / y, but that's
crippling. In math the type of something is what that thing actually is.

If the vector is complex valued, does it matter if (* x (conjugate x)) returns a
value that is some small epsilon north or south of the real number line?  Even
if we end up with a complex denominator, where the complex part is small, the
normalization is still valid. It's inexact, of course, but we are working
with inexact numbers.
From: Tamas K Papp
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6qine4Fco1q7U1@mid.individual.net>
On Sat, 13 Dec 2008 21:07:39 +0000, Kaz Kylheku wrote:

> Mathematics isn't statically typed. The type of a mathematical
> expression depends on what is going on in that expression.  4 / 3 is a
> rational number, but 4 / 2 is an integer.  If these are variables x / y,
> then you don't know what the exact type is, even if you individually
> know X and Y to be integers. You could blindly assign the type rational
> to  x / y, but that's crippling. In math the type of something is what
> that thing actually is.

Apparently you have yet to see the light!  The proper statically typed 
way to do this would be to use union types.  You would keep track of 
whether the number is an integer, a rational, floating point etc, and 
write monster conditionals (sweetened by "pattern matching") for simple 
arithmetic operations.  Effectively, you would be implementing dynamic 
typing for numbers, showing how truly powerful F# is!

Wait, Lisp already has dynamic typing.  Damn.  Oh well, I can still write 
this up in the F# Journal.

Tamas
From: Pascal J. Bourguignon
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <87k5a33snn.fsf@informatimago.com>
Tamas K Papp <······@gmail.com> writes:
> Wait, Lisp already has dynamic typing.  Damn.  Oh well, I can still write 
> this up in the F# Journal.

Of course, 3/4 of the papers in language design and 1/4 in other CS
domains are just that: explaining how you can do  in other programming
languages or systems  what is easily done (and doesn't justify a paper)
in lisp.

-- 
__Pascal Bourguignon__
From: Rainer Joswig
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <joswig-0DCC0E.12332214122008@news-europe.giganews.com>
In article <··············@informatimago.com>,
 ···@informatimago.com (Pascal J. Bourguignon) wrote:

> Tamas K Papp <······@gmail.com> writes:
> > Wait, Lisp already has dynamic typing.  Damn.  Oh well, I can still write 
> > this up in the F# Journal.
> 
> Of course, 3/4 of the papers in language design and 1/4 in other CS
> domains are just that: explaining how you can do  in other programming
> languages or systems  what is easily done (and doesn't justify a paper)
> in lisp.

See this:  http://martinfowler.com/dslwip/

-- 
http://lispm.dyndns.org/
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <-7qdnQWRX8ThPdjUnZ2dnUVZ8rKdnZ2d@posted.plusnet>
Pascal J. Bourguignon wrote:
> Tamas K Papp <······@gmail.com> writes:
>> Wait, Lisp already has dynamic typing.  Damn.  Oh well, I can still write
>> this up in the F# Journal.
> 
> Of course, 3/4 of the papers in language design and 1/4 in other CS
> domains are just that: explaining how you can do  in other programming
> languages or systems  what is easily done (and doesn't justify a paper)
> in lisp.

Write incorrect versions of trivial functions whilst mouthing off?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <-7qdnQSRX8TZPNjUnZ2dnUVZ8rKdnZ2d@posted.plusnet>
Tamas K Papp wrote:
> On Sat, 13 Dec 2008 21:07:39 +0000, Kaz Kylheku wrote:
>> Mathematics isn't statically typed. The type of a mathematical
>> expression depends on what is going on in that expression.  4 / 3 is a
>> rational number, but 4 / 2 is an integer.  If these are variables x / y,
>> then you don't know what the exact type is, even if you individually
>> know X and Y to be integers. You could blindly assign the type rational
>> to  x / y, but that's crippling. In math the type of something is what
>> that thing actually is.
> 
> Apparently you have yet to see the light!  The proper statically typed
> way to do this would be to use union types.

On the contrary, that is how you weaken the static type system to recover
the undesirable properties of dynamic typing. Consequently, the approach
you are advocating is not used in statically-typed languages.

> You would keep track of 
> whether the number is an integer, a rational, floating point etc, and
> write monster conditionals (sweetened by "pattern matching") for simple
> arithmetic operations.  Effectively, you would be implementing dynamic
> typing for numbers, showing how truly powerful F# is!

You would end up with the same bugs that we just saw in the Lisp code.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <eb841f9c-7702-44a8-b50e-9538d381c1af@y1g2000pra.googlegroups.com>
On Dec 14, 5:16 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Tamas K Papp wrote:
>
> > Apparently you have yet to see the light!  The proper statically typed
> > way to do this would be to use union types.
>
> On the contrary, that is how you weaken the static type system to recover
> the undesirable properties of dynamic typing. Consequently, the approach
> you are advocating is not used in statically-typed languages.
>

This sentiment seems to contradict the following excerpts from the
section "Good Style" in chapter one of your online book:

   type number = Integer of int | Real of float | Complex of float * float;;

   let good_is_zero = function
       Integer i -> i = 0
     | Real x -> x = 0.
     | Complex (x, y) -> x = 0. && y = 0.;;

I guess in your opinion it is ok for a function to accept different
types of arguments, but not ok for a function to return different
types of results.  Is this a limitation of Ocaml?  I couldn't make the
following work:

    let times_two = function
      Integer i -> i * 2
    | Real x -> x *. 2.
    | Complex (x, y) -> Complex (x *. 2., y *. 2.);;

I would expect the resulting type signature to be:

   val times_two : number -> number = <fun>
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Rq6dnZptZceOZ9jUnZ2dnUVZ8tSdnZ2d@posted.plusnet>
Scott wrote:
> On Dec 14, 5:16 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> Tamas K Papp wrote:
>>
>> > Apparently you have yet to see the light!  The proper statically typed
>> > way to do this would be to use union types.
>>
>> On the contrary, that is how you weaken the static type system to recover
>> the undesirable properties of dynamic typing. Consequently, the approach
>> you are advocating is not used in statically-typed languages.
> 
> This sentiment seems to contradict the following excerpts from the
> section "Good Style"

in the context of pattern matching.

> in chapter one of your online book: 
> 
>    type number = Integer of int | Real of float | Complex of float * float;;
> 
>    let good_is_zero = function
>        Integer i -> i = 0
>      | Real x -> x = 0.
>      | Complex (x, y) -> x = 0. && y = 0.;;
>
> I guess in your opinion it is ok for a function to accept different
> types of arguments, but not ok for a function to return different
> types of results.

I don't see a problem with returning different types but you need to be
careful what you mean by "type" in that context. I believe you are
referring to what are called "type constructors" in the context of
languages like OCaml. They are not the same as types.

> Is this a limitation of Ocaml?

No.

> I couldn't make the following work: 
> 
>     let times_two = function
>       Integer i -> i * 2
>     | Real x -> x *. 2.
>     | Complex (x, y) -> Complex (x *. 2., y *. 2.);;
> 
> I would expect the resulting type signature to be:
> 
>    val times_two : number -> number = <fun>

  let times_two = function
    | Integer i -> Integer(i * 2)
    | Real x -> Real(x *. 2.)
    | Complex (x, y) -> Complex (x *. 2., y *. 2.);;

Your last line is probably incorrect, BTW.

You can implement a numeric tower in this way, like the ones in Lisp and
Mathematica but that is generally a very bad idea. Whenever you create sum
types like "number" in OCaml you are weakening the type system by reducing
the number of errors that static typing can catch. This is why these
statically-typed languages do not have general number towers (although
OCaml does have a smaller number tower for exact numeric types because they
do have a lot in common).

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <c8fa63c7-6668-4b12-88a7-2771fc74ef60@v5g2000prm.googlegroups.com>
On Dec 14, 11:35 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Scott wrote:
>
>   let times_two = function
>     | Integer i -> Integer(i * 2)
>     | Real x -> Real(x *. 2.)
>     | Complex (x, y) -> Complex (x *. 2., y *. 2.);;
>

Thank you.

>
> You can implement a numeric tower in this way, like the ones in Lisp and
> Mathematica but that is generally a very bad idea. Whenever you create sum
> types like "number" in OCaml you are weakening the type system by reducing
> the number of errors that static typing can catch. This is why these
> statically-typed languages do not have general number towers (although
> OCaml does have a smaller number tower for exact numeric types because they
> do have a lot in common).
>

I got that definition of "number" from your web site under the section
"Good Design".  I take it you've changed your mind on this since you
wrote/published it.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ItWdnQ_i84gxPdvUnZ2dnUVZ8jSdnZ2d@posted.plusnet>
Scott wrote:
> I got that definition of "number" from your web site under the section
> "Good Design".

Good design of pattern matches.

> I take it you've changed your mind on this since you wrote/published it.

Follow that advice and your pattern matches will be robust during
development. The advice has nothing to do with numeric towers.

I'm sure you already knew that though, so taking statements out of context
is presumably the nearest you can get to a counter argument.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Scott
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <5630b150-82df-4277-9298-aca4ec27296e@a37g2000pre.googlegroups.com>
On Dec 15, 11:26 am, Jon Harrop <····@ffconsultancy.com> wrote:
> Scott wrote:
> > I got that definition of "number" from your web site under the section
> > "Good Design".
>
> Good design of pattern matches.
>
> > I take it you've changed your mind on this since you wrote/published it.
>
> Follow that advice and your pattern matches will be robust during
> development. The advice has nothing to do with numeric towers.
>
> I'm sure you already knew that though, so taking statements out of context
> is presumably the nearest you can get to a counter argument.
>

Nope, not my intent.  Regardless of the context, I think that example
is a contradiction to what you've been saying in this discussion.  I
was tired last night, and I didn't see that you had already responded
to it in an earlier message.  Looking at it now, I see you use an <H4>
for "Pattern matching", and an <H5> for "Good style", so it is really
a subsection (the font sizes are nearly identical on my browser).  My
bad, but I think that if you're so strongly opposed to weakening the type
system, it is an odd choice of example for pattern matching.  I'm
still convinced it is a useful technique, but I'm not going to spend
much more effort defending that position.

You clearly enjoy picking fights with the Lispers and advocating your
own particular religion on typing.  I don't really have a dog in that
fight...
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <-NedneFJwOCgg9rUnZ2dnUVZ8qfinZ2d@posted.plusnet>
Scott wrote:
> Nope, not my intent.  Regardless of the context, I think that example
> is a contradiction to what you've been saying in this discussion.  I
> was tired last night, and I didn't see that you had already responded
> to it in an earlier message.  Looking at it now, I see you use an <H4>
> for "Pattern matching", and an <H5> for "Good style", so it is really
> a subsection (the font sizes are nearly identical on my browser).  My
> bad, but I think if you're so strongly opposed to weakening the type
> system that it is an odd choice of example for pattern matching.

On the contrary, that is the sole purpose of sum types and the main use of
pattern matching. Perhaps I could have chosen a more useful example like a
scene graph but a numeric tower has obvious meaning and seemed ideal for a
first chapter.

> I'm still convinced it is a useful technique, but I'm not going to spend
> much more effort defending that position.

It certainly is a useful technique, but an all-encompassing numeric tower
is not a good application of it. A numeric tower over exact numeric types
makes a lot more sense, IMHO.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: André Thieme
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gi1drs$h1u$1@news.motzarella.org>
Kaz Kylheku schrieb:

> Mathematics isn't statically typed. The type of a mathematical expression
> depends on what is going on in that expression.  4 / 3 is a rational number,
> but 4 / 2 is an integer.  If these are variables x / y, then you don't know
> what the exact type is, even if you individually know X and Y to be integers.
> You could blindly assign the type rational to  x / y, but that's
> crippling. In math the type of something is what that thing actually is.

I could not respect a statically typed language that would compile
programs that do
x / y
unless the compiler can prove with 100% certainty that y is not zero.
Otherwise it should be a compile-time error.
So, you would have to put the x / y into an if and only execute that
branch if y is not zero. Otherwise the program must continue in some
other way, for example by throwing an exception.
So, that would be manual dynamic typing.
I hope that F# and OCaml won't compile programs where division is used
unless it is explicitly wrapped in an if.
Also okay would be things like:
let y = 4;
some_function x / y
...
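
As a rough sketch of how this plays out in OCaml today (F# behaves
similarly at run time): the compiler accepts a bare x / y, and integer
division by zero raises an exception only at run time; the explicit
check has to be written by hand, for example as an option-returning
helper like the hypothetical safe_div below.

  (* x / y compiles unconditionally; when y = 0 it raises the standard
     Division_by_zero exception at run time.  The checked style makes
     the possibility of failure visible in the type: *)
  let safe_div x y =
    if y = 0 then None          (* cannot divide: signal it in the type *)
    else Some (x / y)

  let () =
    match safe_div 7 0 with
    | Some q -> Printf.printf "quotient: %d\n" q
    | None   -> print_endline "division by zero avoided"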


André
-- 
From: Mark Wooding
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <slrngkj8dr.k2f.mdw@metalzone.distorted.org.uk>
André Thieme <······························@justmail.de> wrote:

> So, you would have to put the x / y into an if and only execute that
> branch if y is not zero. Otherwise the program must continue in some
> other way, for example by throwing an exception.

Surely the runtime exception is just as unacceptable as the divide-by-
zero error.  The right approach must be to attach an appropriate
precondition to the containing function and impose a proof obligation --
enforced by the compiler -- on all callers to demonstrate that the
precondition is met and that therefore division by zero is impossible.

This is, of course, the way things are done in some areas of
programming, when safety or security are critical.  But it's not
actually very convenient for the rest of us...

-- [mdw]
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <uJadnRoUgO2pH9nUnZ2dnUVZ8rGdnZ2d@posted.plusnet>
Kaz Kylheku wrote:
> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>> Kaz Kylheku wrote:
>>> On 2008-12-13, Jon Harrop <···@ffconsultancy.com> wrote:
>>>> Raymond Wiker wrote:
>>>>> In Common Lisp:
>>>>> 
>>>>> (defun normalize (list-or-vector)
>>>>>   (let ((l (sqrt (reduce #'+ (map 'list (lambda (x) (* x x))
>>>>>   list-or-vector)))))
>>>>>     (map (type-of list-or-vector) (lambda (x) (/ x l))
>>>>>     list-or-vector)))
>>>>> 
>>>>> As a bonus, this works with lists or vectors; it also works with
>>>>> complex numbers.
>>>>> 
>>>>> Since this is Common Lisp, it is also possible to extend this (naive)
>>>>> implementation so that it performs as much as possible at
>>>>> compile-time, possibly replacing calls with the computed result.
>>>>> 
>>>>> Stick that in Mathematica's (and Ruby's) pipe and smoke it!
>>>>
>>>> In fact, I think your Lisp code is broken with respect to complex
>>>> numbers because you don't take their absolute value. No doubt you'd
>>>> have spotted that error immediately had you been using a statically
>>>> typed language...
>>> 
>>> I don't see how. All the operations are defined for the complex values.
>>> You can multiply them with themselves, and take their square root,
>>> divide, etc.
>>> 
>>> Static polymorphism could make this work too, unless you were missing
>>> something, like an overload of square root for complex numbers. No?
>>
>> Static type inference would tell you that the return type for a complex
>> vector is a complex when it should be a real.
> 
> The return type of what?

His "normalize" function.

>>> Still, I don't see why the compiler would find a complex number problem
>>> if you aren't actually using complex umbers anywhere.
>>> 
>>> My math is rusty, but can't we fix this by multiplying the complex x
>>> with its conjugate, rather than just itself?
>>> 
>>> (Hey, we have it, and it's called conjugate! I just tab-completed on it
>>> at the REPL. CLHS confirms that it's standard).
>>> 
>>> I.e.:
>>> 
>>>   (lambda (x) (* x (conjugate x))) ;; same as (* x x) for reals
>>
>> That returns a complex that always has a zero imaginary component.
> 
> Is that necessarily so?

Unless you want to generate a complex with a zero imaginary part and then
run-time coerce it to discard it unnecessarily, yes.

> There could be support for 
> complex numbers constructed from exact componets:
> 
> bash$ clisp -q
> [1]> #c(1 1)
> #C(1 1)
> [2]> (* #c(1 1) (conjugate #c(1 1)))
> 2
> [3]> (type-of (* #c(1 1) (conjugate #c(1 1))))
> (INTEGER 0 16777215)
> 
> How about the conjugate of a real number?
> 
> [4]> (conjugate (sin 3.0))
> 0.14112
> 
> ``The conjugate of a real is itself.'' -ANSI CL.

Irrelevant. We are not computing the conjugate of a real.

> Obviously, Lisp is not a language that you know.

As you can see, you're wrong:

* (defun normalize (z) (* z (conjugate z)))

NORMALIZE

* (normalize #c(0.2d0 0.3d0))

#C(0.13d0 0.0d0)

Note the redundant zero-valued imaginary part in the result.
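
For comparison, a small sketch with OCaml's standard Complex module,
where the inferred types keep the real-valued magnitude and the complex
product distinct:

  let z = { Complex.re = 0.2; im = 0.3 }

  (* The squared magnitude is a plain float... *)
  let mag2 : float = Complex.norm2 z                    (* 0.13 *)

  (* ...while z * conj z is still a Complex.t with im = 0. *)
  let product : Complex.t = Complex.mul z (Complex.conj z)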

> Mathematics isn't statically typed.

Without a definition of "type", that statement (and all that follows) is
meaningless.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: John W Kennedy
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <49404775$0$4893$607ed4bc@cv.net>
Xah Lee wrote:
> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> you'll have 50 or hundreds lines.

C:

#include <stdlib.h>
#include <math.h>

void normal(int dim, float* x, float* a) {
    float sum = 0.0f;
    int i;
    float divisor;
    for (i = 0; i < dim; ++i) sum += x[i] * x[i];
    divisor = sqrt(sum);
    for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
}

Java:

static float[] normal(final float[] x) {
    float sum = 0.0f;
    for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
    final float divisor = (float) Math.sqrt(sum);
    float[] a = new float[x.length];
    for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
    return a;
}


-- 
John W. Kennedy
  "Never try to take over the international economy based on a radical 
feminist agenda if you're not sure your leader isn't a transvestite."
   -- David Misch:  "She-Spies", "While You Were Out"
From: Bakul Shah
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <49405AD2.7010207@bitblocks.com>
John W Kennedy wrote:
> Xah Lee wrote:
>> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>> you'll have 50 or hundreds lines.
> 
> C:
> 
> #include <stdlib.h>
> #include <math.h>
> 
> void normal(int dim, float* x, float* a) {
>    float sum = 0.0f;
>    int i;
>    float divisor;
>    for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>    divisor = sqrt(sum);
>    for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
> }
> 
> Java:
> 
> static float[] normal(final float[] x) {
>    float sum = 0.0f;
>    for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
>    final float divisor = (float) Math.sqrt(sum);
>    float[] a = new float[x.length];
>    for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
>    return a;
> }
> 
> 

q){x%sqrt sum x}3 4
0.6 0.8
From: Bakul Shah
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <49405DDE.4020509@bitblocks.com>
Bakul Shah wrote:
> John W Kennedy wrote:
>> Xah Lee wrote:
>>> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>>> you'll have 50 or hundreds lines.
>>
>> C:
>>
>> #include <stdlib.h>
>> #include <math.h>
>>
>> void normal(int dim, float* x, float* a) {
>>    float sum = 0.0f;
>>    int i;
>>    float divisor;
>>    for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>>    divisor = sqrt(sum);
>>    for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>> }
>>
>> Java:
>>
>> static float[] normal(final float[] x) {
>>    float sum = 0.0f;
>>    for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
>>    final float divisor = (float) Math.sqrt(sum);
>>    float[] a = new float[x.length];
>>    for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
>>    return a;
>> }
>>
>>
> 
> q){x%sqrt sum x}3 4
> 0.6 0.8

Oops. I meant to write {x%sqrt sum x*x}3 4
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <5ebe5a7d-cbdf-4d66-a816-a7d2a0a273c9@40g2000prx.googlegroups.com>
On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
> Xah Lee wrote:
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > you'll have 50 or hundreds lines.
>
> C:
>
> #include <stdlib.h>
> #include <math.h>
>
> void normal(int dim, float* x, float* a) {
>     float sum = 0.0f;
>     int i;
>     float divisor;
>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>     divisor = sqrt(sum);
>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>
> }
>
> Java:
>
> static float[] normal(final float[] x) {
>     float sum = 0.0f;
>     for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
>     final float divisor = (float) Math.sqrt(sum);
>     float[] a = new float[x.length];
>     for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
>     return a;
>
> }

Thanks to various replies.

I've now gathered code solutions in ruby, python, C, and Java here:

• A Example of Mathematica's Expressiveness
  http://xahlee.org/UnixResource_dir/writ/Mathematica_expressiveness.html

Still lacking are perl and elisp, which I can do well in a condensed
way. It'd be interesting also to have javascript... and perhaps erlang,
OCaml/F#, and Haskell too.
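
For OCaml, a rough sketch along these lines (an ordinary function over
float arrays, using only the standard Array module; a sketch for
comparison, not code from the page above):

  (* Normalize a float array to unit length. *)
  let normalize v =
    let len = sqrt (Array.fold_left (fun acc x -> acc +. x *. x) 0. v) in
    Array.map (fun x -> x /. len) v

  (* normalize [| 3.; 4. |]  gives  [| 0.6; 0.8 |] *)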

  Xah
∑ http://xahlee.org/

☄
From: Chris Rathman
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <16459ac1-32b7-408c-80ff-16469af4228a@l33g2000pri.googlegroups.com>
On Dec 10, 6:51 pm, Xah Lee <······@gmail.com> wrote:
> I've now gather code solutions in ruby, python, C, Java, here:
>
>  now lacking is perl, elisp, which i can do well in a condensed way.
>  It'd be interesting also to have javascript... and perhaps erlang,
>  OCaml/F#, Haskell too.

Pay me $600 for my time and I'll even throw in an Algol-68
version.  :-)
From: Jim Gibson
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <111220081211063566%jimsgibson@gmail.com>
In article
<····································@40g2000prx.googlegroups.com>, Xah
Lee <······@gmail.com> wrote:

> On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
> > Xah Lee wrote:
> > > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > > you'll have 50 or hundreds lines.
> >
> > C:
> >
> > #include <stdlib.h>
> > #include <math.h>
> >
> > void normal(int dim, float* x, float* a) {
> >     float sum = 0.0f;
> >     int i;
> >     float divisor;
> >     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
> >     divisor = sqrt(sum);
> >     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
> >
> > }
> >
> > Java:
> >
> > static float[] normal(final float[] x) {
> >     float sum = 0.0f;
> >     for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
> >     final float divisor = (float) Math.sqrt(sum);
> >     float[] a = new float[x.length];
> >     for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
> >     return a;
> >
> > }
> 
> Thanks to various replies.
> 
> I've now gather code solutions in ruby, python, C, Java, here:
> 
> • A Example of Mathematica's Expressiveness
>   http://xahlee.org/UnixResource_dir/writ/Mathematica_expressiveness.html
> 
> now lacking is perl, elisp, which i can do well in a condensed way.
> It'd be interesting also to have javascript... and perhaps erlang,
> OCaml/F#, Haskell too.

Perl:

sub normal
{
  my $sum = 0;
  $sum += $_ ** 2 for @_;
  my $length = sqrt($sum);
  return map { $_/$length } @_;
}

-- 
Jim Gibson
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <452ba58b-9377-4463-85e2-5ecdaa25a129@t39g2000prh.googlegroups.com>
On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
> Xah Lee wrote:
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > you'll have 50 or hundreds lines.
>
> C:
>
> #include <stdlib.h>
> #include <math.h>
>
> void normal(int dim, float* x, float* a) {
>     float sum = 0.0f;
>     int i;
>     float divisor;
>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>     divisor = sqrt(sum);
>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>
> }

I don't have experience coding C. The code above doesn't seem to
satisfy the spec. The input should be just a vector, array, list, or
whatever the lang supports.
The output should be the same datatype with the same dimension.

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <S_adnUFJfP3bB9zUnZ2dnUVZ8sPinZ2d@posted.plusnet>
Xah Lee wrote:
> On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
>> Xah Lee wrote:
>> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>> > you'll have 50 or hundreds lines.
>>
>> C:
>>
>> #include <stdlib.h>
>> #include <math.h>
>>
>> void normal(int dim, float* x, float* a) {
>> float sum = 0.0f;
>> int i;
>> float divisor;
>> for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>> divisor = sqrt(sum);
>> for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>>
>> }
> 
> i don't have experience coding C. The code above doesn't seems to
> satisfy the spec. The input should be just a vector, array, list, or
> whatever the lang supports.
> The output is the same datatype of the same dimension.

The output is in the preallocated argument "a". It is the same type (float
*) and has the same dimension. That is idiomatic C.

You could define a struct type representing a vector that includes its
length and data (akin to std::vector<..> in C++) but it would still be
nowhere near 50 LOC as you claimed.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: George Neuner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6375k4pf1mleb6im67nkpec5olqbsom88e@4ax.com>
On Thu, 11 Dec 2008 10:41:59 -0800 (PST), Xah Lee <······@gmail.com>
wrote:

>On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
>> Xah Lee wrote:
>> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>> > you'll have 50 or hundreds lines.
>>
>> C:
>>
>> #include <stdlib.h>
>> #include <math.h>
>>
>> void normal(int dim, float* x, float* a) {
>>     float sum = 0.0f;
>>     int i;
>>     float divisor;
>>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>>     divisor = sqrt(sum);
>>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>>
>> }
>
>i don't have experience coding C. 

Then why do you talk about it as if you know something?

>The code above doesn't seems to satisfy the spec.

It does.

>The input should be just a vector, array, list, or
>whatever the lang supports. The output is the same 
>datatype of the same dimension.

C's native arrays are stored contiguously.  Multidimensional arrays
can be accessed as a vector of length (dim1 * dim2 * ... * dimN).

This code handles arrays of any dimensionality.  The poorly named
argument 'dim' specifies the total number of elements in the array.

George
From: Bakul Shah
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <4942D561.5020203@bitblocks.com>
George Neuner wrote:
> On Thu, 11 Dec 2008 10:41:59 -0800 (PST), Xah Lee <······@gmail.com>
> wrote:
> 
>> On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
>>> Xah Lee wrote:
>>>> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
>>>> you'll have 50 or hundreds lines.
>>> C:
>>>
>>> #include <stdlib.h>
>>> #include <math.h>
>>>
>>> void normal(int dim, float* x, float* a) {
>>>     float sum = 0.0f;
>>>     int i;
>>>     float divisor;
>>>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>>>     divisor = sqrt(sum);
>>>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>>>
>>> }
>> i don't have experience coding C. 
> 
> Then why do you talk about it as if you know something?
> 
>> The code above doesn't seems to satisfy the spec.
> 
> It does.
> 
>> The input should be just a vector, array, list, or
>> whatever the lang supports. The output is the same 
>> datatype of the same dimension.
> 
> C's native arrays are stored contiguously.  Multidimensional arrays
> can be accessed as a vector of length (dim1 * dim2 * ... * dimN).
> 
> This code handles arrays of any dimensionality.  The poorly named
> argument 'dim' specifies the total number of elements in the array.
> 
> George

Only if the length in each dimension is known at compile time (or
in C99, if this is an automatic array). When this is not the case,
you may have to implement something like the following (not the only
way, just one way):

#include <stdlib.h>   /* for malloc */

float** new_matrix(int rows, int cols) {
	float** m = malloc(sizeof(float*)*rows);
	int i;
	for (i = 0; i < rows; i++)
		m[i] = malloc(sizeof(float)*cols);
	return m;
}

In this case normal() fails since matrix m is not in a single
contiguous area.

But I suspect Xah is complaining because the function doesn't
*return* a value of the same type; instead you have to pass in
the result vector. But such is life if you code in C!
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <a917ba86-8f89-4419-b491-5e17fd80b569@z6g2000pre.googlegroups.com>
> >On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
> >> C:
>
> >> #include <stdlib.h>
> >> #include <math.h>
>
> >> void normal(int dim, float* x, float* a) {
> >>     float sum = 0.0f;
> >>     int i;
> >>     float divisor;
> >>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
> >>     divisor = sqrt(sum);
> >>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>
> >> }

Due to the low-level nature of C, this C example should perhaps instead
accept a sequence of numbers separated by spaces, and print the output.
Having the dimension as part of the input is not acceptable.

For examples in other langs that can take any input (in source code)
and print output, see:
http://xahlee.org/UnixResource_dir/writ/Mathematica_expressiveness.html

A new addition is in Scheme Lisp:

;; Scheme Lisp. By Bakul Shah
(define (normalize vec)
   (let ((d (sqrt (apply + (map (lambda (x) (* x x)) vec)))))
     (map (lambda (x) (/ x d)) vec)))

(normalize '(3 4))

The JavaScript example:

// Javascript. By William James
function normalize( vec ) {
var div=Math.sqrt(vec.map(function(x) x*x).reduce(function(a,b) a+b))
  return vec.map(function(x) x/div)
}

is also not qualified. (It is a syntax error in the SpiderMonkey engine
“JavaScript-C 1.7.0 2007-10-03”.)

  Xah
∑ http://xahlee.org/

☄
From: ·········@yahoo.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <c7cc6dc4-f807-4833-9397-7d590887602d@e1g2000pra.googlegroups.com>
On Dec 25, 5:24 am, Xah Lee <······@gmail.com> wrote:

> The JavaScript example:
>
> // Javascript. By William James
> function normalize( vec ) {
> var div=Math.sqrt(vec.map(function(x) x*x).reduce(function(a,b) a+b))
>   return vec.map(function(x) x/div)
>
> }
>
> is also not qualified. (it is syntax error in SpiderMonkey engine
> “JavaScript-C 1.7.0 2007-10-03”)

Since you are using the latest version of Mathematica, you should
also use the latest version of SpiderMonkey.

The function works outside of a web browser with jslibs, and it
works in Firefox 3.0.1.

<html>
<body>

<script type="application/javascript;version=1.8">

// Tested with Firefox 3.0.1.
// SpiderMonkey JavaScript 1.6 added map().
// 1.8 added reduce() and function shorthand:
// function(x) { return x * x }
//   can now be:
// function(x) x * x

// Javascript. By William James
function normalize( vec ) {
var div=Math.sqrt(vec.map(function(x) x*x).reduce(function(a,b) a+b))
  return vec.map(function(x) x/div)
}

window.alert( normalize( [2,3,4] ).toSource() )

</script>

</body>
</html>
From: William James
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gm0npf014o7@enews4.newsguy.com>
·········@yahoo.com wrote:

> On Dec 25, 5:24 am, Xah Lee <······@gmail.com> wrote:
> 
> > The JavaScript example:
> > 
> > // Javascript. By William James
> > function normalize( vec ) {
> > var div=Math.sqrt(vec.map(function(x) x*x).reduce(function(a,b)
> > a+b))   return vec.map(function(x) x/div)
> > 
> > }
> > 
> > is also not qualified. (it is syntax error in SpiderMonkey engine
> > “JavaScript-C 1.7.0 2007-10-03”)
> 
> Since you are using the latest version of Mathematica, you should
> also use the latest version of SpiderMonkey.
> 
> The function works outside of a web browser with jslibs, and it
> works in Firefox 3.0.1.
> 
> <html>
> <body>
> 
> <script type="application/javascript;version=1.8"/>
> 
> // Tested with Firefox 3.0.1.
> // SpiderMonkey JavaScript 1.6 added map().
> // 1.8 added reduce() and function shorthand:
> // function(x) { return x * x }
> //   can now be:
> // function(x) x * x
> 
> // Javascript. By William James
> function normalize( vec ) {
> var div=Math.sqrt(vec.map(function(x) x*x).reduce(function(a,b) a+b))
>   return vec.map(function(x) x/div)
> }
> 
> window.alert( normalize( [2,3,4] ).toSource() )
> 
> </script>
> 
> </body>
> </html>

Reduce:

procedure normalize vec;
  begin scalar div;
    div := sqrt(for each u in vec sum u^2);
    return map(~x/div, vec)
  end;
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <P5ednRHi8Zr6Ys7UnZ2dnUVZ8u-dnZ2d@posted.plusnet>
Xah Lee wrote:
>> >On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
>> >> C:
>>
>> >> #include <stdlib.h>
>> >> #include <math.h>
>>
>> >> void normal(int dim, float* x, float* a) {
>> >>     float sum = 0.0f;
>> >>     int i;
>> >>     float divisor;
>> >>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>> >>     divisor = sqrt(sum);
>> >>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>>
>> >> }
> 
> Due to the low level of C, this C example should perhaps then accept a
> sequence of numbers separated by space...

In other words, you want a REPL.

Why don't we have another challenge that involves handling a non-trivial
input format, i.e. parsing?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: ···@netherlands.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <8j78l4drqjg46ps74up555bfu44kct27f7@4ax.com>
On Thu, 25 Dec 2008 21:50:29 +0000, Jon Harrop <···@ffconsultancy.com> wrote:

>Xah Lee wrote:
>>> >On Dec 10, 2:47 pm, John W Kennedy <·······@attglobal.net> wrote:
>>> >> C:
>>>
>>> >> #include <stdlib.h>
>>> >> #include <math.h>
>>>
>>> >> void normal(int dim, float* x, float* a) {
>>> >>     float sum = 0.0f;
>>> >>     int i;
>>> >>     float divisor;
>>> >>     for (i = 0; i < dim; ++i) sum += x[i] * x[i];
>>> >>     divisor = sqrt(sum);
>>> >>     for (i = 0; i < dim; ++i) a[i] = x[i]/divisor;
>>>
>>> >> }
>> 
>> Due to the low level of C, this C example should perhaps then accept a
>> sequence of numbers separated by space...
>
>In other words, you want a REPL.
>
>Why don't we have another challenge that involves handling a non-trivial
>input format, i.e. parsing?

Maybe you could speed it up.

sln

----------------------------------
void normal (int Dimension, double *X, double *A)
{
     double *xp, *ap, divisor, sum;
     int i;

     for ( i=0, sum=0., xp=X; i<Dimension; i++, xp++)
          sum += (*xp) * (*xp);   /* square; C has no ** operator */
     divisor = sqrt ( sum );
     for ( i=0, ap=A, xp=X; i<Dimension; i++, ap++, xp++)
          *ap = *xp / divisor;

     // if Dimension is greater than
     // sizeof double X[] or double A[] it will GPF.
     // this is a really, really bad design.
}
From: Steven D'Aprano
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <01643cc7$0$6988$c3e8da3@news.astraweb.com>
On Fri, 26 Dec 2008 00:05:02 +0000, sln wrote:

> On Thu, 25 Dec 2008 21:50:29 +0000, Jon Harrop <···@ffconsultancy.com>
> wrote:
> 
>>Xah Lee wrote:

[Typical Xah Lee post excised]

>>Why don't we have another challenge that involves handling a non-trivial
>>input format, i.e. parsing?
> 
> Maybe you could speed it up.

Maybe you people could stop cross-posting your pissing contests to 
newsgroups where they aren't wanted?

It's bad enough that Xah Lee cross-posts in the first place, without 
others keeping the thread alive.


-- 
Steven
From: Tamas K Papp
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6rivskF1stjpU1@mid.individual.net>
On Fri, 26 Dec 2008 02:46:59 +0000, Steven D'Aprano wrote:

> It's bad enough that Xah Lee cross-posts in the first place, without
> others keeping the thread alive.

Maybe people could _stop replying_ to trolls?  (1) Please have no effect 
on them, & (2) I already have them filtered out.  But I see the replies.  
If people didn't reply to these idiots, I wouldn't even notice that they 
exist.

Tamas
From: Tamas K Papp
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <6rivuaF1stjpU2@mid.individual.net>
On Fri, 26 Dec 2008 03:03:48 +0000, Tamas K Papp wrote:

> Maybe people could _stop replying_ to trolls?  (1) Please have no effect

Meant "pleas".
From: Jerome Baum
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gj6sjd$abn$1@gwaiyur.mb-net.net>
> Maybe people could _stop replying_ to trolls?  (1) Please have 
> no effect
> on them, & (2) I already have them filtered out.  But I see the 
> replies.
> If people didn't reply to these idiots, I wouldn't even notice 
> that they
> exist.

I'll take the risk involved with posting a two word post:

"second that!"

Hold on, that's 13 words -- and now 18 words -- and now 22 -- ...

- Jerome 
From: William James
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ghquiq018d5@enews2.newsguy.com>
John W Kennedy wrote:

> Xah Lee wrote:
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or
> > Java, you'll have 50 or hundreds lines.
> 

> Java:
> 
> static float[] normal(final float[] x) {
>    float sum = 0.0f;
>    for (int i = 0; i < x.length; ++i) sum += x[i] * x[i];
>    final float divisor = (float) Math.sqrt(sum);
>    float[] a = new float[x.length];
>    for (int i = 0; i < x.length; ++i) a[i] = x[i]/divisor;
>    return a;
> }

"We don't need no stinkin' loops!"

SpiderMonkey Javascript:

function normal( ary )
{ div=Math.sqrt(ary.map(function(x) x*x).reduce(function(a,b) a+b))
  return ary.map(function(x) x/div)
}
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081226055859.604@gmail.com>
On 2008-12-05, Xah Lee <······@gmail.com> wrote:
> Let's say for example, we want to write a function that takes a vector
> (of linear algebra), and return a vector in the same direction but
> with length 1. In linear algebar terminology, the new vector is called
> the “normalized” vector of the original.
>
> For those of you who don't know linear algebra but knows coding, this

If I were to guess who that would be ...

> means, we want a function whose input is a list of 3 elements say
> {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
> the condition that
>
> a = x/Sqrt[x^2+y^2+z^2]
> b = y/Sqrt[x^2+y^2+z^2]
> c = z/Sqrt[x^2+y^2+z^2]
>
> In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> you'll have 50 or hundreds lines.

Really? ``50 or hundreds'' of lines in C?

  #include <math.h> /* for sqrt */

  void normalize(double *out, double *in)
  {
	double denom = sqrt(in[0] * in[0] + in[1] * in[1] + in[2] * in[2]);

        out[0] = in[0]/denom;
	out[1] = in[1]/denom;
	out[2] = in[2]/denom;
  }

Doh?

Now try writing a device driver for your wireless LAN adapter in Mathematica.
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <d92d6549-af70-4502-8e1d-acccb815489d@i20g2000prf.googlegroups.com>
Xah Lee wrote:
> > Let's say for example, we want to write a function that takes a vector
> > (of linear algebra), and return a vector in the same direction but
> > with length 1. In linear algebar terminology, the new vector is called
> > the “normalized” vector of the original.
>
> > For those of you who don't know linear algebra but knows coding, this
>
> If I were to guess who that would be ...
>
> > means, we want a function whose input is a list of 3 elements say
> > {x,y,z}, and output is also a list of 3 elements, say {a,b,c}, with
> > the condition that
>
> > a = x/Sqrt[x^2+y^2+z^2]
> > b = y/Sqrt[x^2+y^2+z^2]
> > c = z/Sqrt[x^2+y^2+z^2]
>
> > In lisp, python, perl, etc, you'll have 10 or so lines. In C or Java,
> > you'll have 50 or hundreds lines.


Kaz Kylheku wrote:
> Really? ``50 or hundreds'' of lines in C?
>
>   #include <math.h> /* for sqrt */
>
>   void normalize(double *out, double *in)
>   {
>         double denom = sqrt(in[0] * in[0] + in[1] * in[1] + in[2] * in[2]);
>
>         out[0] = in[0]/denom;
>         out[1] = in[1]/denom;
>         out[2] = in[2]/denom;
>   }
>
> Doh?

Kaz, pay attention:

Xah wrote: «Note, that the “norm” as defined above works for vectors
of any dimention, i.e. list of any length.»

The essay on the example of Mathematica's expressiveness in defining
Normalize is now cleaned up and archived at:

• A Example of Mathematica's Expressiveness
  http://xahlee.org/UnixResource_dir/writ/Mathematica_expressiveness.html

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <EdednSau4flN1N3UnZ2dnUVZ8i2dnZ2d@posted.plusnet>
Xah Lee wrote:
> Kaz Kylheku wrote:
>> Really? ``50 or hundreds'' of lines in C?
>>
>>   #include <math.h> /* for sqrt */
>>
>>   void normalize(double *out, double *in)
>>   {
>>         double denom = sqrt(in[0] * in[0] + in[1] * in[1] + in[2] *
>>         in[2]);
>>
>>         out[0] = in[0]/denom;
>>         out[1] = in[1]/denom;
>>         out[2] = in[2]/denom;
>>   }
>>
>> Doh?
> 
> Kaz, pay attention:
> 
> Xah wrote: «Note, that the “norm” as defined above works for vectors
> of any dimention, i.e. list of any length.»

That is still only 6 lines of C code and not 50 as you claimed:

double il = 0.0;
for (int i=0; i<n; ++i)
  il += in[i] * in[i];
il = 1.0 / sqrt(il);
for (int i=0; i<n; ++i)
  out[i] = il * in[i];

Try computing the Fourier transform of:

  0.007 + 0.01 I, -0.002 - 0.0024 I

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Arne Vajhøj
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <49405469$0$90262$14726298@news.sunsite.dk>
Jon Harrop wrote:
> Xah Lee wrote:
>> Kaz Kylheku wrote:
>>> Really? ``50 or hundreds'' of lines in C?
>>>
>>>   #include <math.h> /* for sqrt */
>>>
>>>   void normalize(double *out, double *in)
>>>   {
>>>         double denom = sqrt(in[0] * in[0] + in[1] * in[1] + in[2] *
>>>         in[2]);
>>>
>>>         out[0] = in[0]/denom;
>>>         out[1] = in[1]/denom;
>>>         out[2] = in[2]/denom;
>>>   }
>>>
>>> Doh?
>> Kaz, pay attention:
>>
>> Xah wrote: «Note, that the “norm” as defined above works for vectors
>> of any dimention, i.e. list of any length.»
> 
> That is still only 6 lines of C code and not 50 as you claimed:
> 
> double il = 0.0;
> for (int i=0; i<n; ++i)
>   il += in[i] * in[i];
> il = 1.0 / sqrt(il);
> for (int i=0; i<n; ++i)
>   out[i] = il * in[i];

Not that it matters, but the above requires C99 (or C++).

Arne
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081226091336.487@gmail.com>
On 2008-12-10, Xah Lee <······@gmail.com> wrote:
> Xah Lee wrote:
>> > means, we want a function whose input is a list of 3 elements say
            ^^^^^^^^^^^^^^^^^^             ^^^^^^^^^^^^^^^^^^^^^^^

> Kaz, pay attention:

[ reformatted to 7 bit USASCII ]

> Xah wrote: Note, that the norm
> of any dimention, i.e. list of any length.

It was coded to the above requirements. 
From: George Neuner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <d075k4tpj01c71b2j57i5v19kufu0dvefm@4ax.com>
On Wed, 10 Dec 2008 21:37:34 +0000 (UTC), Kaz Kylheku
<········@gmail.com> wrote:

>Now try writing a device driver for your wireless LAN adapter in Mathematica.

Notice how Xah chose not to respond to this.

George
From: Rainer Joswig
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <joswig-988E80.18432812122008@news-europe.giganews.com>
In article <··································@4ax.com>,
 George Neuner <········@comcast.net> wrote:

> On Wed, 10 Dec 2008 21:37:34 +0000 (UTC), Kaz Kylheku
> <········@gmail.com> wrote:
> 
> >Now try writing a device driver for your wireless LAN adapter in Mathematica.
> 
> Notice how Xah chose not to respond to this.
> 
> George

For inspiration, here is some old Lisp driver code for an old
3com network card (Ethernet, not WLAN):

http://jrm-code-project.googlecode.com/svn/trunk/lambda/network/drivers/3com.lisp

-- 
http://lispm.dyndns.org/
From: Rob Warnock
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <U-CdnZ1cFKeNu97UnZ2dnUVZ_h6dnZ2d@speakeasy.net>
Rainer Joswig  <······@lisp.de> wrote:
+---------------
| For inspiration, here is some old Lisp driver code for an old
| 3com network card (Ethernet, not WLAN):
| 
| http://jrm-code-project.googlecode.com/svn/trunk/lambda/network/drivers/3com.lisp
+---------------

Heh. This looks a *lot* like the user-mode hardware bringup/debugging
code I was writing in CMUCL during the last few years (for a now-PPoE).  ;-}
Lots of bit- & byte-field definitions, peek & poke stuff, utilities
to encode/pack/unpack/decode hardware register fields from/to readable
symbols, etc. The main obvious difference I noticed was that instead
of using SYS:%NUBUS-READ & SYS:%NUBUS-WRITE to peek/poke at the
hardware, my code did an MMAP of "/dev/mem" and then used CMUCL's
SYSTEM:SAP-REF-{8,16,32} and SETFs of same [wrapped within suitable
syntactic sugar, of course]. Fun stuff!


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Xv6dncWkcZMllKHUnZ2dnUVZ8oadnZ2d@posted.plusnet>
Xah Lee wrote:
> I didn't realize until after a hour, that if Jon simply give numerical
> arguments to Main and Create, the result timing by a factor of 0.3 of
> original. What a incredible sloppiness! and he intended this to show
> Mathematica speed with this code?
>
> The Main[] function calls Create. The create has 3 parameters: level,
> c, and r. The level is a integer for the recursive level of
> raytracing . The c is a vector for sphere center i presume. The r is
> radius of the sphere. His input has c and r as integers, and this in
> Mathematica means computation with exact arithmetics (and automatic
> kicks into infinite precision if necessary). Changing c and r to float
> immediately reduced the timing to 0.3 of original.

That is only true if you solve a completely different and vastly simpler
problem, which I see you have (see below).

> The RaySphere function contain codes that does symbolic computation by
> calling Im, which is the imaginary part of a complex number!! and if
> so, it returns the symbol Infinity! The possible result of Infinity is
> significant because it is used in Intersect to do a numerical
> comparison in a If statement. So, here in these deep loops,
> Mathematica's symbolic computation is used for numerical purposes!

Infinity is a floating point number.

> So, first optimization at the superficial code form level is to get
> rid of this symbolic computation.

That does not speed up the original computation.

> Instead of checking whethere his “disc = Sqrt[b^2 - v.v + r^2]” has
> imaginary part, one simply check whether the argument to sqrt is
> negative.

That does not speed up the original computation.

> after getting rid of the symbolic computation, i made the RaySphere
> function to be a Compiled function.

That should improve performance, but the Mathematica code remains well over
five orders of magnitude slower than OCaml, Haskell, Scheme, C, C++, Fortran,
Java and even Lisp!

> Besides the above basic things, there are several aspects that his
> code can improve in speed. For example, he used pattern matching to do
> core loops.
> e.g. Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]]
> 
> any Mathematica expert knows that this is something you don't want to
> do if it is used in a core loop. Instead of pattern matching, one can
> change the form to Function and it'll speed up.

Your code does not implement this change.

> Also, he used “Block”, which is designed for local variables and the
> scope is dynamic scope. However the local vars used in this are local
> constants. A proper code would use “With” instead. (in lisp, this is
> various let, let*. Lispers here can imagine how lousy the code is
> now.)

Earlier, you said that "Module" should be used. Now you say "With". Which is
it and why?

Your code does not implement this change either.

> Here's a improved code. The timing of this code is about 0.2 of the
> original.
> ...
> Timing[Export["image.pgm",········@······@Main[2,100,4.]]]

You have only observed a speedup because you have drastically simplified the
scene being rendered. Specifically, the scene I gave contained over 80,000
spheres but you are benchmarking with only 5 spheres and half of the image
is blank!

Using nine levels of spheres as I requested originally, your version is not
measurably faster at all.

Perhaps you should give a refund?

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <48b8b065-6193-454c-a548-3117449f8a55@40g2000prx.googlegroups.com>
For those interested in this Mathematica problem, I've now cleaned up
the essay with additional comments here:

• A Mathematica Optimization Problem
  http://xahlee.org/UnixResource_dir/writ/Mathematica_optimization.html

The result and speedup of my code can be verified by anyone who has
Mathematica.

Here are some additional notes I added to the above that were not
previously posted.

-------------------------

Advice For Mathematica Optimization

Here's some advice for Mathematica optimization, roughly in order from
most important to least important:

    * Any experienced programmer knows that optimization at the
algorithm level is far more important than variation in code
construction. So, make sure the algorithm used is good, as
opposed to doodling with your code forms. If you can optimize your
algorithm, the speedup may be an order of magnitude. (For example, the
various sorting algorithms illustrate this.)

    * If you are doing numerical computation, always make sure that
your input and every intermediate step use machine precision.
You do this by writing the numbers in your input in decimal form
(e.g. use “1.”, “N[Pi]” instead of “1”, “Pi”). Otherwise Mathematica
may use exact arithmetic.

    * For numerical computation, do not simply slap “N[]” onto your
code, because the intermediate computation may still be done using
exact arithmetic or symbolic computation.

    * Make sure your core loop, where your calculation is repeated and
most of the time is spent, is compiled, by using Compile.

    * When optimizing for speed, try to avoid pattern matching. If your
function is “f[x_]:= ...”, try to change it to the form “f=Function
[x,...]” instead.

    * Do not use complicated patterns if not necessary. For example,
use “f[x_,y_]” instead of “f[x_][y_]”.

------------------------------

...

Besides the above basic things, there are several aspects in which his
code can be improved for speed. For example, he used rather complicated
pattern matching for the numerically intensive part. Namely:

Intersect[o_, d_][{lambda_, n_}, Bound[c_, r_, s_]]
Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]]

Note that the parameters of Intersect defined above are in a nested
form. The code would be much faster if you just changed the forms to:

Intersect[o_, d_, {lambda_, n_}, Bound[c_, r_, s_]]
Intersect[o_, d_, {lambda_, n_}, Sphere[c_, r_]]

or even just this:

Intersect[o_, d_, lambda_, n_, c_, r_, s_]
Intersect[o_, d_, lambda_, n_, c_, r_]

Also, note that Intersect is recursive: Intersect calls itself. Which
form is invoked depends on the pattern matching of the parameters. Not
only that, but inside one of the Intersect definitions it uses Fold to
nest itself. So, there are 2 recursive calls going on in Intersect.
Reducing this recursion to a simple one would speed up the code,
possibly by an order of magnitude.

Further, if Intersect is made to take a flat sequence of arguments, as
in “Intersect[o_, d_, lambda_, n_, c_, r_, s_]”, then pattern matching
can be avoided by making it into a pure function with “Function”. And
when it is a “Function”, then Intersect or part of it may be compiled
with Compile. When the code is compiled, the speed should be an order
of magnitude faster.

-----------------------------

Someone keeps claiming that Mathematica code is some “5 orders of
magnitude slower”. It is funny how the orders of magnitude are
quantified. I'm not sure there's a standard interpretation other than
hyperbole.

There's a famous quote by Alan Perlis ( http://en.wikipedia.org/wiki/Alan_Perlis
) that goes:
“A Lisp programmer knows the value of everything, but the cost of
nothing.”

This quote captures the nature of lisp in comparison to most other
langs at the time the quote was written. Lisp is a functional lang, and
in functional langs the concept of values is critical, because any
lisp program is either a function definition or an expression. Functions
and expressions act on values and return values. The values, along with
the definitions, determine the program's behavior. “The cost of nothing”
captures the sense that in high level langs, esp dynamic langs like
lisp, it's easy to do something, but it is more difficult to know the
algorithmic behavior of constructs. This is in contrast to langs like
C, Pascal, or a modern lang like Java, where almost anything you write
is “fast”, simply forced by the low level nature of the lang.

In a similar way, Mathematica is far higher level than any
existing lang, including other so-called computer algebra systems. A
simple one-liner Mathematica construct easily equates to 10 or a hundred
lines of lisp, perl, or python, and if you count its hundreds of
mathematical functions such as Solve, Derivative, Integrate, each line
of code is equivalent to a few thousand lines in other langs.

However, there is a catch that applies to any higher level lang,
namely, that it is extremely easy to create a program that is very
inefficient.

This can typically be observed in students' or beginners' code in lisp.
The code may produce the right output, but may be extremely
inefficient due to lack of expertise with the language.

The phenomenon of creating inefficient code is proportional
to the high-levelness or power of the lang. In general, the higher
level the lang, the harder it is to produce code
that is as efficient as in a lower level lang. For example, the level
or power of langs can be roughly ordered like this:

assembly langs
C, pascal
C++, java, c#
unix shells
perl, python, ruby, php
lisp
Mathematica

The lower level the lang, the more of the programmer's time it consumes,
but the faster the code runs. Higher level langs may or may not be
crafted to be as efficient. For example, code written at the level of
langs such as perl, python, or ruby will never run as fast as C,
regardless of how expert the perler is. C code will never run as fast
as assembly langs. And if the task is crafting raytracing software, then
perl, python, ruby, lisp, and Mathematica are simply not suitable, and
are not likely to produce any code as fast as C or Java.

On the other hand, many applications of higher level langs simply
cannot be done in a lower level lang for various practical reasons.
For example, you can use Mathematica to solve some physics problem in a
few hours, or give Pi to a gazillion digits in a few seconds with just
“N[Pi,10000000000000]”. Sure, you can code a solution in lisp, perl, or
even C, but that means a few years of man hours. Similarly, you can do
text processing in C or Java, but perl, python, ruby, php, emacs lisp,
or Mathematica can reduce your man hours to 10% or 1% of the coding
effort.

In the above, I left out functional langs that are roughly statically
typed and compiled, such as Haskell, OCaml, etc. I do not have
experience with these langs. I suppose they do maintain some of the
speed advantage of low level langs, yet have high level constructs.
Thus, for computationally intensive tasks such as writing a raytracer,
they may compete with C and Java in speed, yet be easier to write, with
fewer lines of code.

Personally, I've made some effort to study Haskell but never went
through with it. In my experience, I find langs that are (roughly
called) strongly typed difficult to learn and use. (I have reading
knowledge of C and working knowledge of Java, but was never good with
Java. The verbosity in Java turns me off thoroughly.)

-----------------

As to how fast Mathematica can be on the raytracing toy code shown in
this thread, I've given sufficient demonstration that it can be sped
up significantly. Even though Mathematica is not suitable for this
task, I'm pretty sure I can bring the code's speed to the same level
as OCaml (as opposed to someone's claim that it must be some 700,000
times slower, or some “5 orders of magnitude slower”). However, to do
so will take me half a day or a day of coding. Come fly $300 to my
PayPal account, then we'll talk. Money back guaranteed, as I said
before.

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <4ridnecnkLopkKDUnZ2dnUVZ8qbinZ2d@posted.plusnet>
Xah Lee wrote:
> The result and speed up of my code can be verified by anyone who has
> Mathematica.

You changed the scene that is being rendered => your speedup is bogus!

Trace the scene I originally gave and you will see that your program is no
faster than mine was.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <M-OdnaHGv7WwgaDUnZ2dnUVZ8jKdnZ2d@posted.plusnet>
Xah Lee wrote:
> For those interested in this Mathematica problem, i've now cleaned up
> the essay with additional comments here:
> 
> • A Mathematica Optimization Problem
>   http://xahlee.org/UnixResource_dir/writ/Mathematica_optimization.html

In that article you say:

> Further, if Intersect is made to take a flat sequence of argument as
> in “Intersect[o_, d_, lambda_, n_, c_, r_, s_]”, then pattern matching can
> be avoided by making it into a pure function “Function”. And when it is
> a “Function”, then Intersect or part of it may be compiled with Compile.
> When the code is compiled, the speed should be a order of magnitude
> faster.     

That is incorrect. Mathematica's Compile function cannot handle recursive
functions like Intersect. For example:

In[1]:= Compile[{n_, _Integer}, If[# == 0, 1, #0[[# n - 1]] #1] &[n]]

During evaluation of In[1]:= Compile::fun: Compilation of
(If[#1==0,1,#0[[#1 n-1]] #1]&)[Compile`FunctionVariable$435] cannot
proceed. It is not possible to compile pure functions with arguments
that represent the function itself. >>

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <0201522a-b15a-4401-a208-5d9af70bdd74@w24g2000prd.googlegroups.com>
On Dec 8, 5:10 am, Jon Harrop <····@ffconsultancy.com> wrote:
> Xah Lee wrote:
> > For those interested in this Mathematica problem, i've now cleaned up
> > the essay with additional comments here:
>
> > • A Mathematica Optimization Problem
> >  http://xahlee.org/UnixResource_dir/writ/Mathematica_optimization.html
>
> In that article you say:
>
> > Further, if Intersect is made to take a flat sequence of argument as
> > in “Intersect[o_, d_, lambda_, n_, c_, r_, s_]”, then pattern matching can
> > be avoided by making it into a pure function “Function”. And when it is
> > a “Function”, then Intersect or part of it may be compiled with Compile.
> > When the code is compiled, the speed should be a order of magnitude
> > faster.

> That is incorrect. Mathematica's Compile function cannot handle recursive
> functions like Intersect.

I didn't claim it can. You can't expect to have a fast or good program
if you code Java style in a functional lang.

Similarly, if you want code to run fast in Mathematica, you don't just
slap your OCaml code into Mathematica syntax and expect it to run
fast.

If you are a Mathematica expert, you could make it recurse yet have
speed comparable to other langs: first, by changing your function's
form to avoid pattern matching, and by rewriting your bad recursion.
That is what I claimed in the paragraph above. Read it again and see.

> For example:
> In[1]:= Compile[{n_, _Integer}, If[# == 0, 1, #0[[# n - 1]] #1] &[n]]
>
> During evaluation of In[1]:= Compile::fun: Compilation of
> (If[#1==0,1,#0[[#1 n-1]] #1]&)[Compile`FunctionVariable$435] cannot
> proceed. It is not possible to compile pure functions with arguments
> that represent the function itself. >>

Mathematica's Compile function is intended to speed up numerical
computation. To want Compile to handle recursion in the context of
Mathematica's programming features is not something possible even in a
theoretical sense.

Scheme lisp implementations can compile recursive code, but lisp is a
lower level lang than Mathematica, where perhaps the majority of
Mathematica's builtin functions equate to 10 or more lines of
lisp, and any of its hundreds of math functions equates to an entire
library in other langs. It may sound reasonable, but it is silly, to
expect Mathematica's Compile function to compile arbitrary Mathematica
code.

Perhaps in a future version of Mathematica, its Compile function can
handle basic recursive forms.

Also, in this discussion, it is thanks to the $20 Thomas M Hermann
offered me for my challenge to you that I have taken the time to show
working code that demonstrates many problems in your code. Unless you
think my code and replies to you are totally without merit or fairness,
you should acknowledge it, in whole or in the parts you agree with, in
an honest and wholehearted way, instead of pushing on with petty verbal
fights.

  Xah
∑ http://xahlee.org/

☄
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <Z9ednSE6l5AIDKDUnZ2dnUVZ8gOdnZ2d@posted.plusnet>
Xah Lee wrote:
> Also, in this discussion, thanks to Thomas M Hermann's $20 offered to
> me for my challenge to you, that i have taken the time to show working
> code that demonstrate many problems in your code.

You failed the challenge that you were given. Specifically, your code is not
measurably faster on the problem that I set. Moreover, you continued to
write as if you had not failed and, worse, went on to give even more awful
advice as if your credibility had not just been destroyed.

> If you are a Mathematica expert, you could make it recurse yet have
> the speed as other langs.

No, you cannot. That is precisely why you just failed this challenge.

You should accept the fact that Mathematica currently has these
insurmountable limitations.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <8947211d-2ee8-403a-8c8a-c7599aea5f80@k24g2000pri.googlegroups.com>
2008-12-08

Xah Lee wrote:
> > Also, in this discussion, thanks to Thomas M Hermann's $20 offered to
> > me for my challenge to you, that i have taken the time to show working
> > code that demonstrate many problems in your code.


A moron, wrote:
> You failed the challenge that you were given.

You didn't give me a challenge. I gave you one. I asked for a $5
sincerity wage of mutual payment, or money back guaranteed, so that we
could show real code instead of a verbal fight. You didn't take it, and
did nothing but continue a petty quarrel over words. Thomas was nice
enough to pay me, which resulted in my code that is demonstrably faster
than yours (verified by a post from “jason-sage @@@ creativetrax.com”,
quote: “So Xah's code is about twice as fast as Jon's code, on my
computer.”; the message can be seen at “
http://www.gossamer-threads.com/lists/python/python/698196?do=post_view_threaded#698196
” ). You refuse to acknowledge it, and continue babbling, insisting
that my code would have to be some hundred times faster to make a
valid argument.

As I said, now pay me $300 and I will then bring your Mathematica code
to the same level of speed as your OCaml. If it does not, money back
guaranteed. Here are the more precise terms I ask:

Show me your OCaml code that will compile on my machine (PPC Mac, OSX
10.4.x). I'll bring your Mathematica code to the same speed level as
your OCaml code. (You claimed Mathematica is roughly 700 thousand
times slower than your OCaml code. I claim I can make it no more than
10 times slower than the given OCaml code.)

So, pay me $300 as a consulting fee. If the result does not comply
with the above spec, money back guaranteed.

> You should accept the fact that Mathematica currently has these
> insurmountable limitations.

insurmountable ur mom.

  Xah
∑ http://xahlee.org/

☄
From: Wesley MacIntosh
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ghkm01$8qc$4@news.motzarella.org>
A flamer wrote:
> A moron, wrote:
[snip]

> my machine (PPC Mac, OSX 10.4.x).

Well, that explains a great deal.

Actually, I suspect all these newsgroups are being trolled.
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <TdedndErcJo9XKDUnZ2dnUVZ8uudnZ2d@posted.plusnet>
Xah Lee wrote:
> A moron, wrote:
> > You failed the challenge that you were given.
> 
> you didn't give me a challenge.

Thomas gave you the challenge:

  "What I want in return is you to execute and time Dr. Harrop's original
code, posting the results to this thread... By Dr. Harrop's original code,
I specifically mean the code he posted to this thread. I've pasted it below
for clarity.".

Thomas even quoted my code verbatim to make his requirements totally
unambiguous. Note the parameters [9, 512, 4] in the last line that he and I
both gave:

  AbsoluteTiming[Export["image.pgm", ········@······@Main[9, 512, 4]]]

You have not posted timings of that, let alone optimized it. So you failed.

> I gave you one. I asked for a $5 sincerity
> wage of mutual payment or money back guarantee, so that we can show
> real code instead of a verbal fight. You didn't take it and did nothing
> but continue a petty quarrel over words.

Then where did you post timings of that exact code as Thomas requested?

>
> http://www.gossamer-threads.com/lists/python/python/698196?do=post_view_threaded#698196
> ” ) You refuse to acknowledge it, and continue babbling, emphasizing
> that my code should be some hundred times faster to make a valid argument.

That is not my code! Look at the last line where you define the scene:

  Timing[Export["image.pgm",Graphics[at]······@Main[2,100,4.]]]

Those are not the parameters I gave you. Your program is running faster
because you changed the scene from over 80,000 spheres to only 5 spheres.
Look at your output image: it is completely wrong!
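
For reference: assuming the scene builder adds four child spheres around
each parent at every level (consistent with the 5 and 80,000+ figures
above), the sphere count per level is easy to tabulate:

  (* sphere count per level n, assuming count[n] = 1 + 4 count[n-1] *)
  count[1] = 1;
  count[n_] := count[n] = 1 + 4 count[n - 1]
  {count[2], count[9]}  (* returns {5, 87381} *)

So dropping the first parameter from 9 to 2 shrinks the scene by a
factor of more than ten thousand.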

> As i said, now pay me $300, and i will then make your Mathematica code
> run at the same level of speed as your OCaml. If it does not, money back
> guaranteed.

Your money back guarantee is worthless if you cannot even tell when you have
failed.

> Show me your OCaml code that will compile on my machine (PPC Mac, OSX
> 10.4.x).

The code is still on our site:

  http://www.ffconsultancy.com/languages/ray_tracer/

OCaml, C++ and Scheme all take ~4s to ray trace the same scene.

> I'll make your Mathematica code run at the same speed level as
> your OCaml code. (you claimed Mathematica is roughly 700 thousand
> times slower than your OCaml code. I claim i can make it no more than
> 10 times slower than the given OCaml code.)

You have not even made it 10% faster, let alone 70,000x faster. Either
provide the goods or swallow the fact that you have been wrong all along.

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <9f86f78c-776e-4793-b931-67dadc0e24f1@i24g2000prf.googlegroups.com>
On Dec 8, 4:56 pm, Jon Harrop <····@ffconsultancy.com> wrote:
> Xah Lee wrote:
> > A moron, wrote:
> > > You failed the challenge that you were given.
>
> > you didn't give me a challenge.
>
> Thomas gave you the challenge:
>
>   "What I want in return is you to execute and time Dr. Harrop's original
> code, posting the results to this thread... By Dr. Harrop's original code,
> I specifically mean the code he posted to this thread. I've pasted it below
> for clarity.".
>
> Thomas even quoted my code verbatim to make his requirements totally
> unambiguous. Note the parameters [9, 512, 4] in the last line that he and I
> both gave:
>
>   AbsoluteTiming[Export["image.pgm", ········@······@Main[9, 512, 4]]]
>
> You have not posted timings of that, let alone optimized it. So you failed.

The first parameter to your Main specifies some kind of recursively
stacked spheres in the rendered image. The second parameter is the
width and height, in pixels, of the rendered image.

I tried to run it, but my computer went to 100% cpu and after, i recall,
5 min it was still going. So, i reduced your input. In the end, with
the reduced input, my code is 5 times faster (running Mathematica
v4 on OS X 10.4.x with a 1.9 GHz PPC), and on the other guy's computer
with Mathematica 6 he says it's twice as fast.

Given your code's nature, it is reasonable to assume that with your
original input my code would still be faster than yours. You claim it
is not, or that it is perhaps just slightly faster?

It is possible you are right. I don't want to spend the energy to run
your code and my code and possibly hog my computer for hours or
perhaps days. As i said, your recursive Intersect is very badly
written Mathematica code. It might even start memory swapping.

Also, all you did was talk bullshit. Thomas actually is the one who took
my challenge to you and gave me $20 to prove my argument to YOU. His
requirement, after the payment, was actually, i quote:

«Alright, I've sent $20. The only reason I would request a refund is
if you don't do anything. As long as you improve the code as you've
described and post the results, I'll be satisfied. If the improvements
you've described don't result in better performance, that's OK.»

He hasn't posted since, nor emailed me. It is reasonable to assume he
is satisfied as far as his payment to me to see my code goes.

You kept on babbling. Now you say that the input is different. Fine.
How long does that input actually take on your computer? If days, i'm
sorry, i cannot run your toy code on my computer for days. If a few
hours, i can then run the code overnight, and if necessary, give you
another version that will be faster with your given input to shut you
the fuck up.

However, there's a cost to me. What do i get for doing your homework?
It is possible that if i spend the energy and time to do this, you will
again refuse to acknowledge it, or keep on complaining about something
else.

You see, newsgroups are the bedrock of bullshit. You bullshit, he
bullshits, everybody brags and bullshits because there is no stake. I
want sincerity and responsibility backed up with, for example, paypal
deposits. You kept on bullshitting; Thomas gave me $20 and i produced
code that reasonably demonstrated at least how unprofessional your
Mathematica code was.

Here's the deal. Pay me $20, then i'll create a version of the
Mathematica code that takes the same input as yours. Your input is
Main[9, 512, 4]; as i have exposed, your use of an integer in the last
argument of a numerical computation is Mathematica incompetence. You
didn't acknowledge even this. I'll give a version of the Mathematica
code with input Main[9, 512, 4.] that will run faster than yours. If
not, money back guaranteed. Also, pay me $300, and i can produce a
Mathematica version no more than 10 times slower than your OCaml code;
that would be a 70000 times improvement according to you. Again, money
back guaranteed.
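
To see why the trailing 4 versus 4. matters: an integer keeps
Mathematica in exact arithmetic, while the decimal point switches it to
machine floats. A minimal sketch on a toy iteration, not the ray tracer
itself:

  (* Newton-style step for Sqrt[2]; the only difference below is 1 vs 1. *)
  f = #/2 + 1/# &;
  Timing[Nest[f, 1, 20];]   (* exact rationals: digit count doubles every step *)
  Timing[Nest[f, 1., 20];]  (* machine floats: effectively instant *)

The same kind of slowdown can hit any numeric code that is fed exact
integers.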

If i don't receive $20 or $300, this will be my last post to you in
this thread. You are just a bullshitter.

Oh wait... my code with Main[9, 512, 4.] and other numerical changes
already makes your program run faster regardless of the input size.
What a motherfucking bullshitter you are. Scratch the $20. The $300
challenge still stands firm.

  Xah
∑ http://xahlee.org/

☄
From: ···@netherlands.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <nabuj418t5o4l6rmjli2i41315bv72ema3@4ax.com>
On Tue, 9 Dec 2008 15:01:11 -0800 (PST), Xah Lee <······@gmail.com> wrote:

>
>On Dec 8, 4:56 pm, Jon Harrop <····@ffconsultancy.com> wrote:
>> Xah Lee wrote:
>> > A moron, wrote:
>
>[snip]
>
>What a motherfucking bullshitter you are. Scratch the $20. The $300
>challenge still stands firm.
Ad hominem
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081224103256.892@gmail.com>
On 2008-12-08, Xah Lee <······@gmail.com> wrote:
> So, pay me $300 as a consulting fee. If the result does not comply with
> the above spec, money back guaranteed.

*LOL*

Did you just offer someone the exciting wager of ``your money back or nothing''?

No matter what probability we assign to the outcomes, the /upper bound/
on the expected income from the bet is at most zero dollars.  Now that's not so
bad. Casino games and lotteries have that property too; the net gain is
negative.

But your game has no variability to suck someone in; the /maximum/ income from
any trial is that you break even, which is considered winning.
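
A minimal sketch of that arithmetic, counting only the money flows and
writing p for the probability that the delivered code meets the spec:

  (* customer's monetary income: -300 if the fee is kept, 0 if refunded *)
  income[p_] := p*(-300) + (1 - p)*0
  Maximize[{income[p], 0 <= p <= 1}, p]  (* returns {0, {p -> 0}} *)

The maximum really is the break-even case.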

If you ever decide to open a casino, I suggest you stop playing with
Mathematica for a while, and spend a little more time with Statistica,
Probabilica, and especially Street-Smartica. 

:)
From: Madhu
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <m3oczld4xw.fsf@moon.robolove.meer.net>
* Kaz Kylheku <··················@gmail.com> :
Wrote on Tue, 9 Dec 2008 04:36:28 +0000 (UTC):

|> So, pay me $300 as a consulting fee. If the result does not comply with
|> the above spec, money back guaranteed.
|
| Did you just offer someone the exciting wager of ``your money back or
| nothing''?

No, I don't think he was offering a bet --- this sounded more like he
was charging for a service.  The cost would cover the time he spent
providing the service, except if the service was not satisfactory, in
which case he'd refund the amount.  (Actually the cost seems to be
calculated to dissuade any customer from engaging him in the first
place, so the conclusion from your analysis would still hold.)
--
Madhu
From: Stanisław Halik
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ghjdac$m1q$1@opal.icpnet.pl>
In comp.lang.lisp Xah Lee <······@gmail.com> wrote:

> The phenomenon of creating code that are inefficient is proportional
> to the highlevelness or power of the lang. In general, the higher
> level of the lang, the less possible it is actually to produce a code
> that is as efficient as a lower level lang. For example, the level or
> power of lang can be roughly order as this:

> assembly langs
> C, pascal
> C++, java, c#
> unix shells
> perl, python, ruby, php
> lisp
> Mathematica

This is untrue. Common Lisp native-code compilers are orders of
magnitude faster than those of scripting languages such as Perl or Ruby.

In particular, creating an efficient Ruby implementation might prove
challenging - the language defines lexical bindings as modifiable at
runtime, arithmetic operations as requiring a dynamic method dispatch
etc.

FUT ignored.

-- 
The great peril of our existence lies in the fact that our diet consists
entirely of souls. -- Inuit saying
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <fc1846db-bb87-4376-88a9-9ddea1ea9fe2@i20g2000prf.googlegroups.com>
Xah Lee wrote:

«...

The phenomenon of creating code that are inefficient is proportional
to the highlevelness or power of the lang. In general, the higher
level of the lang, the less possible it is actually to produce a code
that is as efficient as a lower level lang. For example, the level or
power of lang can be roughly order as this:
assembly langs
C, pascal
C++, java, c#
unix shells
perl, python, ruby, php
lisp
Mathematica

...
»

Moron Stanisław Halik wrote:

> This is untrue. Common Lisp native-code compilers are orders of
> magnitude faster than those of scripting languages such as Perl or Ruby.

Learn to read articles and discuss them as a whole, as opposed to
nitpicking particulars so that your favorite lang looks good.

  Xah
∑ http://xahlee.org/

☄
From: Stanisław Halik
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <ghlsaa$k92$2@opal.icpnet.pl>
thus spoke Xah Lee <······@gmail.com>:

> The phenomenon of creating code that are inefficient is proportional
> to the highlevelness or power of the lang. In general, the higher
> level of the lang, the less possible it is actually to produce a code
> that is as efficient as a lower level lang. For example, the level or
> power of lang can be roughly order as this:

Yes, that's true, but your hierarchy sucks. Unix shells more powerful
than C? They're macro languages, ferchristsakes. You should also explain
what the high-level features of Mathematica are that inhibit optimization.
A library of math functions doesn't make the language more powerful. Take
Java, for instance: it has a large standard library alright.

> assembly langs
> C, pascal
> C++, java, c#
> unix shells
> perl, python, ruby, php
> lisp
> Mathematica


-- 
The great peril of our existence lies in the fact that our diet consists
entirely of souls. -- Inuit saying
From: Jon Harrop
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <lJWdnX4f6bqGMKPUnZ2dnUVZ8q_inZ2d@posted.plusnet>
Stanisław Halik wrote:
> thus spoke Xah Lee <······@gmail.com>:
>> The phenomenon of creating code that are inefficient is proportional
>> to the highlevelness or power of the lang. In general, the higher
>> level of the lang, the less possible it is actually to produce a code
>> that is as efficient as a lower level lang. For example, the level or
>> power of lang can be roughly order as this:
> 
> Yes, that's true, but your hierarchy sucks. Unix shells more powerful
> than C? They're macro languages, ferchristsakes. You should also explain
> what are the high-level features of Mathematica inhibiting optimization...

In the context of single-threaded programs there is no excuse for
Mathematica being so slow: it adds no impediments beyond those found in
Lisp. The only reason Mathematica is so slow is that its only
implementation is a naive term rewriter that makes no attempt to use native
code.

However, in the context of parallelism on multicores everything changes.
Mathematica is built entirely around one giant global rewrite table. In
other words, all variables are global in Mathematica. Consequently, the
obvious implementation of any kind of shared-state parallelism will require
synchronization around every single read or write to any variable, which
would be cripplingly slow. Their solution has been to resort to distributed
parallelism but that is hugely inefficient (e.g. see Erlang) and renders
Mathematica even less suitable for general purpose programming on
multicores. There are more sophisticated alternatives that can work around
this problem but they would require a complete rewrite of the internals and
that is not feasible for business reasons (i.e. backward compatibility).
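
For a concrete picture of the distributed model, here is a minimal
sketch using Mathematica 7's parallel primitives (the function f is
made up for illustration):

  f[x_] := x^2 + 1
  DistributeDefinitions[f];        (* copy f's rewrite rules out to every worker kernel *)
  ParallelTable[f[n], {n, 1, 8}]   (* returns {2, 5, 10, 17, 26, 37, 50, 65} *)

Definitions a worker needs are copied out to it rather than shared,
which is the distribution overhead described above.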

-- 
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
http://www.ffconsultancy.com/?u
From: George Neuner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <8evqj4p342jvcma5saibf54thv7075r88g@4ax.com>
On Sun, 7 Dec 2008 14:53:49 -0800 (PST), Xah Lee <······@gmail.com>
wrote:

>The phenomenon of creating code that are inefficient is proportional
>to the highlevelness or power of the lang. In general, the higher
>level of the lang, the less possible it is actually to produce a code
>that is as efficient as a lower level lang. 

This depends on whether someone has taken the time to create a high
quality optimizing compiler.


>For example, the level or power of lang can be roughly order as 
>this:
>
>assembly langs
>C, pascal
>C++, java, c#
>unix shells
>perl, python, ruby, php
>lisp
>Mathematica

According to what "power" estimation?  Assembly, C/C++, C#, Pascal,
Java, Python, Ruby and Lisp are all Turing Complete.  I don't know
offhand whether Mathematica is also TC, but if it is then it is at
most equally powerful.

Grammatical complexity is not exactly orthogonal to expressive power,
but it is mostly so.  Lisp's SEXPRs are an existence proof that a
Turing powerful language can have a very simple grammar.  And while a
2D symbolic equation editor may be easier to use than spelling out the
elements of an equation in a linear textual form, it is not in any
real sense "more powerful".


>the lower level the lang, the longer it consumes programer's time, but
>faster the code runs. Higher level langs may or may not be crafted to
>be as efficient.  For example, code written in the level of langs such
>as perl, python, ruby, will never run as fast as C, regardless what
>expert a perler is. 

There is no language level reason that Perl could not run as fast as C
... it's just that no one has cared to implement it.


>C code will never run as fast as assembler langs.

For a large function with many variables and/or subcalls, a good C
compiler will almost always beat an assembler programmer by sheer
brute force - no matter how good the programmer is.  I suspect the
same is true for most HLLs that have good optimizing compilers.

I've spent years doing hard real time programming and I am an expert
in C and a number of assembly languages.  It is (and has been for a
long time) impractical to try to beat a good C compiler for a popular
chip by writing from scratch in assembly.  It's not just that it takes
too long ... it's that most chips are simply too complex for a
programmer to keep all the instruction interaction details straight in
his/her head.  Obviously results vary by programmer, but once a
function grows beyond 100 or so instructions, the compiler starts to
win consistently.  By the time you've got 500 instructions (just a
medium sized C function) it's virtually impossible to beat the
compiler.

In functional languages where individual functions tend to be much
smaller, you'll still find very complex functions in the disassembly
that arose from composition, aggressive inlining, generic
specialization, inlined pattern matching, etc.  Here an assembly
programmer can quite often match the compiler for a particular
function (because it is short), but overall will fail to match the
compiler in composition.

When maximum speed is necessary it's almost always best to start with
an HLL and then hand optimize your optimizing compiler's output.
Humans are quite often able to find additional optimizations in
assembly code that they could not have written as well overall in the
first place.

George
From: Xah Lee
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <9d5ecca4-3eb8-42ad-b4d8-951719ef874b@n33g2000pri.googlegroups.com>
Dear George Neuner,

Xah Lee wrote:
> >The phenomenon of creating code that are inefficient is proportional
> >to the highlevelness or power of the lang. In general, the higher
> >level of the lang, the less possible it is actually to produce a code
> >that is as efficient as a lower level lang.

George Neuner wrote:
> This depends on whether someone has taken the time to create a high
> quality optimizing compiler.

try to read the sentence. I quote:
«The phenomenon of creating code that are inefficient is proportional
to the highlevelness or power of the lang. In general, the higher
level of the lang, the less possible it is actually to produce a code
that is as efficient as a lower level lang.»

Xah Lee wrote:
> >For example,
> >the level or power of lang can be roughly order as
> >this:
>
> >assembly langs
> >C, pascal
> >C++, java, c#
> >unix shells
> >perl, python, ruby, php
> >lisp
> >Mathematica

George wrote:
> According to what "power" estimation?  Assembly, C/C++, C#, Pascal,
> Java, Python, Ruby and Lisp are all Turing Complete.  I don't know
> offhand whether Mathematica is also TC, but if it is then it is at
> most equally powerful.

it's amazing that every tech geeker (aka idiot) wants to quote
“Turing Complete” at every chance. Even simple cellular automata,
such as Conway's game of life or rule 110, are complete.

http://en.wikipedia.org/wiki/Conway's_Game_of_Life
http://en.wikipedia.org/wiki/Rule_110
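
for what it's worth, rule 110 itself is a one-liner in Mathematica; a
minimal sketch, 100 steps from a single black cell:

  (* elementary cellular automaton rule 110 *)
  ArrayPlot[CellularAutomaton[110, {{1}, 0}, 100]]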

in fact, according to Stephen Wolfram's controversial thesis by the
name of “Principle of computational equivalence”, every goddamn thing
in nature is just about turing complete. (just imagine, when you take
a piss, the stream of yellow fluid is actually doing turing complete
computations!)

for a change, it'd be a far more interesting and effective knowledge
showoff to cite langs that are not turing complete.

the rest of your message went on stupidly about the turing complete
view of language power, mixed with lisp fanaticism and personal gripes
about the merits and applicability of assembly vs higher level langs.
It's fine to go on with your gripes, but be careful in using me as a
stepping stone.

  Xah
∑ http://xahlee.org/

☄
From: George Neuner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <i5e5k41fhnnleh1nsbbk4210g1vf436e6s@4ax.com>
On Mon, 8 Dec 2008 15:14:18 -0800 (PST), Xah Lee <······@gmail.com>
wrote:

>Dear George Neuner,
>
>Xah Lee wrote:
>> >For example,
>> >the level or power of lang can be roughly order as
>> >this:
>>
>> >assembly langs
>> >C, pascal
>> >C++, java, c#
>> >unix shells
>> >perl, python, ruby, php
>> >lisp
>> >Mathematica
>
>George wrote:
>> According to what "power" estimation?  Assembly, C/C++, C#, Pascal,
>> Java, Python, Ruby and Lisp are all Turing Complete.  I don't know
>> offhand whether Mathematica is also TC, but if it is then it is at
>> most equally powerful.
>
>it's amazing that every tech geeker (aka idiot) wants to quote
>“Turing Complete” at every chance. Even simple cellular automata,
>such as Conway's game of life or rule 110, are complete.
>
>http://en.wikipedia.org/wiki/Conway's_Game_of_Life
>http://en.wikipedia.org/wiki/Rule_110
>
>in fact, according to Stephen Wolfram's controversial thesis by the
>name of “Principle of computational equivalence”, every goddamn thing
>in nature is just about turing complete. (just imagine, when you take
>a piss, the stream of yellow fluid is actually doing turing complete
>computations!)

Wolfram's thesis does not make the case that everything is somehow
doing computation.  

>for a change, it'd be a far more interesting and effective knowledge
>showoff to cite langs that are not turing complete.

We geek idiots cite Turing because it is an important measure of a
language.  There are plenty of languages which are not complete.  That
you completely disregard a fundamental truth of computing is
disturbing.

>the rest of your message went on stupidly about the turing complete
>view of language power, mixed with lisp fanaticism and personal gripes
>about the merits and applicability of assembly vs higher level langs.

You don't seem to understand the difference between leverage and power
and that disturbs all the geeks here who do.  We worry that newbies
might actually listen to your ridiculous ramblings and be led away
from the truth.

>It's fine to go on with your gripes, but be careful in using me as a
>stepping stone.

Xah, if I wanted to step on you I would do it with combat boots.  You
should be thankful that you live 3000 miles away and I don't care
enough about your petty name calling to come looking for you.  If you
insult people in person like you do on usenet then I'm amazed that
you've lived this long.

George
From: Chris Rathman
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <27dd3536-b101-40e4-9db9-c06baac83b1e@k8g2000yqn.googlegroups.com>
Xah Lee wrote:
> Come flying $5 to my paypal account, and i'll give you real code,
> amongest the programing tech geekers here for all to see.

That's the problem with Mathematica - it's so expensive that you even
have to pay for simple benchmark programs.
From: toby
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <4b6c3304-2396-4d35-a06f-e5d693cbbb12@f13g2000yqj.googlegroups.com>
On Dec 3, 4:15 pm, Xah Lee <······@gmail.com> wrote:
> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>
> > My example demonstrates several of Mathematica's fundamental limitations.
>
> enough babble Jon.
>
> Come flying $5 to my paypal account, and i'll give you real code,

I'll give you $5 to go away

--T
From: Jürgen Exner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <l2oej4141lej4fat3mft05i75je2t6fsmb@4ax.com>
toby <····@telegraphics.com.au> wrote:
>On Dec 3, 4:15 pm, Xah Lee <······@gmail.com> wrote:
>> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>>
>> > My example demonstrates several of Mathematica's fundamental limitations.
>>
>> enough babble Jon.
>>
>> Come flying $5 to my paypal account, and i'll give you real code,
>
>I'll give you $5 to go away

if you add "and never come back" then count me in, too.

jue
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081219033030.559@gmail.com>
On 2008-12-04, Jürgen Exner <········@hotmail.com> wrote:
> toby <····@telegraphics.com.au> wrote:
>>On Dec 3, 4:15 pm, Xah Lee <······@gmail.com> wrote:
>>> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>>>
>>> > My example demonstrates several of Mathematica's fundamental limitations.
>>>
>>> enough babble Jon.
>>>
>>> Come flying $5 to my paypal account, and i'll give you real code,
>>
>>I'll give you $5 to go away
>
> if you add "and never come back" then count me in, too.

Really? I will trade you one Xah Lee for three Jon Harrops and I will even
throw in a free William James.
From: Jürgen Exner
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <h9qej49380o45bgbpfmjfqttdc4eiv4cgv@4ax.com>
Kaz Kylheku <········@gmail.com> wrote:
>On 2008-12-04, Jürgen Exner <········@hotmail.com> wrote:
>> toby <····@telegraphics.com.au> wrote:
>>>On Dec 3, 4:15 pm, Xah Lee <······@gmail.com> wrote:
>>>> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>>>>
>>>> > My example demonstrates several of Mathematica's fundamental limitations.
>>>>
>>>> enough babble Jon.
>>>>
>>>> Come flying $5 to my paypal account, and i'll give you real code,
>>>
>>>I'll give you $5 to go away
>>
>> if you add "and never come back" then count me in, too.
>
>Really? I will trade you one Xah Lee for three Jon Harrops and I will even
>throw in a free William James.

Well, I've never seen those names on CL.perl.M, so I don't know them.

jue
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh7cb4$a1l$3@news.albasani.net>
Xah Lee wrote:
> enough babble ...

Good point.  Plonk.  Guun dun!

-- 
Lew
From: Andreas Waldenburger
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081204111115.7cee2ecc@usenot.de>
On Wed, 03 Dec 2008 20:38:44 -0500 Lew <·····@lewscanon.com> wrote:

> Xah Lee wrote:
> > enough babble ...
> 
> Good point.  Plonk.  Guun dun!
> 

I vaguely remember you plonking the guy before. Did you unplonk him in
the meantime? Or was that just a figure of speech?


teasingly yours,
/W

-- 
My real email address is constructed by swapping the domain with the
recipient (local part).
From: Lew
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <gh8oso$puq$1@news.albasani.net>
Andreas Waldenburger wrote:
> On Wed, 03 Dec 2008 20:38:44 -0500 Lew <·····@lewscanon.com> wrote:
> 
>> Xah Lee wrote:
>>> enough babble ...
>> Good point.  Plonk.  Guun dun!
>>
> 
> I vaguely remember you plonking the guy before. Did you unplonk him in
> the meantime? Or was that just a figure of speech?

I have had some hard drive and system changes that wiped out my old killfiles.

-- 
Lew
From: ············@gmail.com
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <4d86021f-d25c-4b96-b1a2-3a0c162a6154@a26g2000prf.googlegroups.com>
On Dec 4 2008, 5:11 am, Andreas Waldenburger <········@usenot.de>
wrote:
> On Wed, 03 Dec 2008 20:38:44 -0500 Lew <·····@lewscanon.com> wrote:
>
> > Xah Lee wrote:
> > > enough babble ...
>
> > Good point.  Plonk.  Guun dun!
>
> I vaguely remember you plonking the guy before. Did you unplonk him in
> the meantime? Or was that just a figure of speech?
>
> teasingly yours,
> /W

Andreas Waldenburger, I hold up a mirror to your soul!

A couple of years ago you started posting to the newsgroup
comp.lang.java.programmer. Unlike most newbies, instead of lurking for
a while and then contributing on-topic posts about Java, you jumped
into the nearest available flamewar and immediately got in up to your
neck. Then, on November 13, 2007, you stooped to intentionally
misquoting one of your opponents. When he wrote

http://groups.google.com/group/comp.lang.java.programmer/msg/7797d4e90afa4a20?dmode=source

you quoted him thusly:
http://groups.google.com/group/comp.lang.java.programmer/msg/0386cc91ce75a3c6?dmode=source

A few days later, you did it again, misquoting this post

http://groups.google.com/group/comp.lang.java.programmer/msg/fca19d413549f499?dmode=source

in this one:

http://groups.google.com/group/comp.lang.java.programmer/msg/397e1d4b23537c1b?dmode=source

In both cases, you publicly portrayed this poor man as a pervert, even
though, whatever his other faults, that is clearly not one of them.

Since that date, you have continued to participate on and off in a
flamewar with many of the same participants, including the one you so
mistreated that November. In fact, it seems that the very SAME
flamewar is still in progress, in another newsgroup, fourteen whole
months later, and you are still in it up to your neck.

Repeatedly you have claimed to be primarily motivated by finding the
disrupting of newsgroups to be entertaining. This is tantamount to
admitting to being a troll.

If you have an excuse for your behavior, please speak now, and give
your apology before the witnesses gathered here.

If you do not, then know that your soul is forever tainted!
From: Jerry Gerrone
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <bb2c74a8-ac4e-49b4-bbb7-ced3091cf253@f11g2000vbf.googlegroups.com>
On Jan 21, 1:06 pm, ·············@gmail.com" <············@gmail.com>
wrote:
> On Dec 4 2008, 5:11 am, Andreas Waldenburger <········@usenot.de>
> wrote:
> > I vaguely remember you plonking [Xah Lee] before. Did you unplonk him in
> > the meantime? Or was that just a figure of speech?
>
> > teasingly yours,
> > /W
>
> Andreas Waldenburger, I hold up a mirror to your soul!
>
> A couple of years ago you started posting to the newsgroup
> comp.lang.java.programmer. Unlike most newbies, instead of lurking for
> a while and then contributing on-topic posts about Java, you jumped
> into the nearest available flamewar and immediately got in up to your
> neck. Then, on November 13, 2007, you stooped to intentionally
> misquoting one of your opponents. When he wrote
>
> http://groups.google.com/group/comp.lang.java.programmer/msg/7797d4e9...
>
> A few days later, you did it again, misquoting this post
>
> http://groups.google.com/group/comp.lang.java.programmer/msg/fca19d41...
>
> in this one:
>
> http://groups.google.com/group/comp.lang.java.programmer/msg/397e1d4b...
>
> In both cases, you publicly portrayed this poor man as a pervert, even
> though, whatever his other [insult deleted], that is clearly not one of
> them.

None of the nasty things that you have said or implied about me are at
all true.

> Repeatedly you have claimed to be primarily motivated by finding the
> disrupting of newsgroups to be entertaining. This is tantamount to
> admitting to being a troll.

Yes, and here you are, feeding him. Way to go, genius.

(And cljp had just gotten peaceful again, too!)
From: Don Geddis
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <87ljuvakow.fsf@geddis.org>
Xah Lee <······@gmail.com> wrote on Wed, 3 Dec 2008 :
> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>> My example demonstrates several of Mathematica's fundamental limitations.
> enough babble Jon.

Wait a minute ... are c.l.l's two trolls having a public argument with
each other?

Suddenly, I feel a deja vu flashback to misconfigured mailer daemons,
that just keep sending bounced email messages back and forth to each other
in an infinite loop...

        -- Don
_______________________________________________________________________________
Don Geddis                  http://don.geddis.org/               ···@geddis.org
From: Kaz Kylheku
Subject: Re: Mathematica 7 compares to other languages
Date: 
Message-ID: <20081219184837.850@gmail.com>
On 2008-12-04, Don Geddis <···@geddis.org> wrote:
> Xah Lee <······@gmail.com> wrote on Wed, 3 Dec 2008 :
>> On Dec 3, 8:24 am, Jon Harrop <····@ffconsultancy.com> wrote:
>>> My example demonstrates several of Mathematica's fundamental limitations.
>> enough babble Jon.
>
> Wait a minute ... are c.l.l's two trolls having a public argument with
> each other?

Now if they start trimming the responses in each round, so that the article
size is bounded, we can call it proper tail recursion!