From: Mike Haertel
Subject: Re: Eval (Was: What's a treeshaker?)
Date: 
Message-ID: <MIKE.94Aug1132640@pdx399.intel.com>
In article <············@cs.cmu.edu> ···@sef-pmax.slisp.cs.cmu.edu writes:
>If you have a good VM system, and if you have the Lisp program
>development environment on your system at all, then the bundled model is
>preferable.  Instead of messing around with linking and loading specific
>libraries, the whole thing comes pre-linked and self-consistent.  The
>facilities that you are not using just don't get paged in.  They're sitting
>out there on disk, just as an unused but available library would be.  Only
>if you don't have VM does the size of the pre-linked entity become
>critical.

Everybody *says* the facilities you don't use don't get paged in, and
then uses this as an excuse to provide creeping featurism from hell.
The creepy features probably don't get used terribly often, and certainly
not in the greatest generality, but likely just often enough to keep
them continually paged in.

I've never seen a lisp system yet that didn't have a huge working set.
Not even when I used a VAX running BSD with 1K pages.

I think the reason for this is psychological: Lisp system implementors
somehow got into a mindset that with VM, features can be free.  Nothing
could be further from the case; VM makes them *possible*, but it sure
doesn't make them free.

In the "all the world's linked in" model of software development, you
don't have to go to any hassle to use J. Random Feature from the Foo
library.  It's already there.  By contrast, if you have to explicitly
link the Foo library, there's a hassle factor, and perhaps also a moment
of guilt, that makes you think about whether you really need the feature.
Thus, although the "VM can replace explicit linker control" mindset is
certainly theoretically true, I believe in practice it is doomed
to fail because of psychological factors.

The result is that the Lisp community has produced dog slow, cache-buster
environments with huge working sets.  [But don't worry, the C++/Unix
communities are rapidly catching up.  You should see what the Unix kernel
looks like these days.]

IMHO the "all the world's linked in" model is great for rapid prototyping,
[which Lisp excels at] but falls flat on its face in production software
development and delivery environments.

There's also another factor: the Lisp community often tends to implement
things in greatest generality, rather than weighing cost of implementation
against frequency of use and utility.  The old MIT vs. New Jersey schism.
But that's another article.
--
Mike Haertel <····@ichips.intel.com>
Not speaking for Intel.
From: Martin Rodgers
Subject: Re: Eval (Was: What's a treeshaker?)
Date: 
Message-ID: <776010781snz@wildcard.demon.co.uk>
In article <·················@pdx399.intel.com>
           ····@ichips.intel.com "Mike Haertel" writes:

> I think the reason for this is psychological: Lisp system implementors
> somehow got into a mindset that with VM, features can be free.  Nothing
> could be further from the case; VM makes them *possible*, but it sure
> doesn't make them free.

Curiously, Microsoft have made the same mistake with their apps, and
they write them in VC++. Even worse, the platform they currently run
on uses a heap manager that runs on top of VM, but doesn't know it.

Your point about features is a good one, and I believe this is as
much a problem for mass market apps as it is for Common Lisp, judging
by what I've read in comp.lang.lisp, anyway. I don't know about other
Lisps, but the Scheme system I've used seems to be free from the
problems that CL is accused of.

The real test is surely not just the language or the development system
that _you_ use, but the quality of the apps that you write and deliver.
I'm less and less impressed by the apps I see and read
about these days, and I've also seen criticism of VC++.

> There's also another factor: the Lisp community often tends to implement
> things in greatest generality, rather than weighing cost of implementation
> against frequency of use and utility.  The old MIT vs. New Jersey schism.
> But that's another article.

Isn't that a feature, instead of a bug? :-)

I'd much rather re-use code than write new code. "Editing is easier
than generating." Where did I read that...? Ah, yes. In an article
about Smalltalk. It doesn't seem to have had as much impact as I'd
hoped, but at least I can use a clipboard these days. That helps.

I also read about LOOM in the same magazine. It was claimed that
"locality of reference" (another quote from my often unreliable
memory) would help reduce page faults, or increase the chance of data
on a faulted-in page being useful in the near future. For example,
a method faulted into memory might well be accompanied by other
methods in the same class, as they are likely to be called
in the near future. This could certainly help with a Dylan system
as much as a Smalltalk-80 system, I would imagine.
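The locality claim above can be sketched with a toy paging model.
Everything here is invented for illustration (the page size, the two
layouts, the call trace, and a one-page LRU cache); it is not how LOOM
actually lays out objects, just a minimal demonstration that grouping a
class's methods on one page cuts faults when calls cluster by class:

```python
PAGE_SIZE = 4  # methods per page (hypothetical)

def count_faults(layout, trace, resident_pages=1):
    """Count page faults for a call trace under an LRU cache of pages.

    layout maps method name -> address; trace is a list of method names.
    """
    lru = []  # most recently used page at the end
    faults = 0
    for method in trace:
        page = layout[method] // PAGE_SIZE
        if page in lru:
            lru.remove(page)       # refresh its recency
        else:
            faults += 1
            if len(lru) >= resident_pages:
                lru.pop(0)         # evict least recently used page
        lru.append(page)
    return faults

# Two hypothetical classes, four methods each.
methods = [f"{cls}.{m}" for cls in ("Point", "Rect")
           for m in ("a", "b", "c", "d")]

# Clustered layout: each class's methods share a page.
clustered = {name: i for i, name in enumerate(methods)}

# Scattered layout: the two classes' methods interleaved across pages.
interleaved = [m for pair in zip(methods[:4], methods[4:]) for m in pair]
scattered = {name: i for i, name in enumerate(interleaved)}

# A call trace with class-level locality: a burst of Point calls,
# then a burst of Rect calls.
trace = (["Point.a", "Point.b", "Point.c", "Point.d"] * 3 +
         ["Rect.a", "Rect.b", "Rect.c", "Rect.d"] * 3)

print(count_faults(clustered, trace))  # -> 2
print(count_faults(scattered, trace))  # -> 12
```

With the clustered layout each burst faults once per page; with the
scattered layout every other call crosses a page boundary, so the same
trace thrashes the one-page cache.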

-- 
Future generations are relying on us
It's a world we've made - Incubus	
We're living on a knife edge, looking for the ground -- Hawkwind