From: George J. Carrette
Subject: Re: C++ briar patch (Was: Object IDs are bad)
Message-ID: <01bc664b$aa03fe20$0f02000a@gjchome.nis.newscorp.com>
Paul Prescod <········@csclub.uwaterloo.ca> wrote:
> STL does its work at *compile time*. It uses operator overloading
> based on variable types. A generic library that did its work at
> runtime would be a very different beast.

Maybe I'm too much of an old-timer, but I remember plenty of generic
algorithm packages in LISP which did their work at compile time.
Two that come to mind in the field of computer algebra are Barton's
Mode Package and various work at IBM's Watson Research Lab.

These things were written in LISP but certainly got down to the usual
assembly-language hacker's level of memory locations, bytes, and bits:
anything to pack a set of objects more efficiently in those days of
18-bit address-space machines. And talk about the power of generic
expression and efficient specialization: these things proved theorems
about your code that make operator overloading in C++ look like
child's play.

I say "assembly language" hackers because I have yet to see C or C++
provide the feeling of the "bits between your toes" like mature LISP
implementations do.

Heck, I wake up in the middle of the night sometimes thinking about
all the horribly inefficient C and C++ code that I've got cranking away
on some poor slob's machine, lacking optimizations that I wouldn't have
dreamed of leaving out in any self-respecting Lisp implementation.

Anyway, don't accuse me of implying that writing this stuff in Lisp
was easy. The LISP duality of macro/fexpr didn't provide much
scaffolding: you got the top-level data structure after the parse
phase, and that was that.

I was also a firsthand witness to Stephen Wolfram's famous attempts
to use Macsyma and what he thought was LISP to implement algorithms.
(My hackish extensions of the Macsyma -> Lisp translator didn't always
provide transparent results at the time.) It was quite interesting to
see what led to the start of the SMP project at Caltech and later the
product Mathematica.

The just deserts? Probably that the lack of a mindful/low-level memory
model (not FORTRAN- or C/C++-like) in Basic will eventually lead to
superior runtime performance, especially on dual-CPU machines, for
Visual Basic applications over what gets produced by the more
sophisticated C and C++ environments.



-gjc




(Others have already described partial evaluators here. There were
some standard libraries with partial evaluators ported to the various
LISP implementations of the late 1970s and early 1980s.)