From: SCHNEIDER daniel
Subject: LUV-93 conference report (longish and dirty)
Date: 
Message-ID: <1993Aug17.170831.24954@news.unige.ch>
Notes from the LUV-93 conference
********************************

This is quick and dirty; people may want to correct things. Sorry
for my English, but I can't spend too much time on this.

1. Attendance
*************

Was pretty low (about 100 including vendors). This is quite sad. Even
people from the Boston area did not show up....

2. Tutorials
************

1/2 day tutorials: CLOS I, Interfacing Lisp with other Languages, GNU Emacs
Lisp Programming, CLOS II, CLIM I, Good Lisp Style, CLIM II

I went to "Good style", "CLOS II (Advanced CLOS and the Meta-Object Protocol)"
and CLIM II. Missed Gnu Emacs since it was rescheduled... Speakers came
prepared (including handouts) and I learned something.

3. Technical Sessions
*********************

The quality of most papers was fine, but the conference's main topic
"Lisp in Education" rather fell flat. "Lisp in Education" means either
teaching Lisp or using Lisp for programming educational software.
There was only one paper on teaching Lisp (Scheme for Liberal Arts
students) and three papers on using Lisp as a tool for creating some
kind of educational software. I gave one of those talks and general
interest was very low (like "interesting topic, but not my field ...")

More talks dealt with using LISP in industry and with the future
of LISP. There was some agreement in the air that LISP can give an
edge in producing deployable products, but you have to hide LISP
away from the clients ....

Other talks were of more general interest. An interesting point came
from John Boreczky: most of the optimization gains (space & speed) come
from modifying and tuning the code. Furthermore, doing this and adding
vendor-dependent optimizations takes a lot of time....
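
To make the point concrete, here is a purely illustrative sketch (my
own, not from Boreczky's talk; the function name is made up) of what
"tuning the code" typically looks like in portable CL: adding type and
optimization declarations so the compiler can open-code the arithmetic.

  ;; Illustrative only: portable tuning via declarations.
  (defun sum-of-squares (v)
    (declare (type (simple-array double-float (*)) v)
             (optimize (speed 3) (safety 1)))
    (let ((acc 0.0d0))
      (declare (type double-float acc))
      (dotimes (i (length v) acc)
        (incf acc (* (aref v i) (aref v i))))))

Vendor-dependent work then stacks implementation-specific declarations
and tricks on top of this, which is where the extra time goes.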

David Moon gave an invited talk on the Past, Present and Future of
LISP.  He pointed out that most LISP users don't use CL. They use
either embedded Lisp languages (a la GNU Emacs) or very vague
derivatives of LISP, like C++ (!).  I don't remember whether Moon
sees a future for CL or not.

R. Laddaga from Symbolics claims that the LISP world is still the best
for creating advanced software development environments, since you
can build them in an object-oriented and growable way. Furthermore, a
good SDE (like Genera) is "OS oriented", meaning that all "services"
should always be accessible. He also pointed out the value of a
persistent object system...

A speaker from Itasca Corp. pointed out that their DB system can
be used for transparent CLOS/C++ coding, since objects are shared
by both the LISP and C++ worlds.

4. Products shown
*****************

Franz: (Did not look at ACL, but went to the presentation). Franz
showed the next version of ACL and CLIM II, to be shipped soon (at
least on Sparc stations). Compared to ACL 4.2 Beta and CLIM II Beta,
the editing interface has improved (you can have Lemacs with menus if
you want, and the Composer is more strongly tied to Emacs).  CLIM II
was promised to be fully documented and to work much better. The
Presto product seems to have been further improved. More space can be
gained with shared libraries under Solaris 2. So Franz seems to be
able to keep its lead in small image size. They also reiterated their
strong involvement with CLIM.  CL/PC (CL for the PC) version 2 (Beta?)
was also shown. It has a nice interface-building tool. No CLIM support
for this in the near future.

LUCID: (Did look at XLT, and saw part of the presentation). Lucid
showed its new XLT programming environment. While being fast and
small, it has new features such as a set of impressive Source Level
Tools. I've never seen a Data Flow analyzer in Lisp before. They also
announced some kind of alliance with Gold Hill. Lucid also plans to
increase its integration with the "foreign world" (e.g. better
debugging of foreign functions, foreign CLOS classes, etc.).  Lucid
will further improve its support/development for CLIM under MCL.

Symbolics: (Missed some of the presentation). The first alpha(?)
version of Genera running as a Unix process under Alpha was shown.
This means that you open up a Genera console as an X window.
Everything you do inside this window is the same as on a Symbolics. I
forgot to ask whether they have some idea of how to deploy a program
without running the full Genera environment.  The Alpha port runs at
2/3 of the speed of an XL1200 (oops, forgot which machine), but was
promised to get faster.  The price tag seems to be nasty (if I heard
correctly, about $18'000).  However, big discounts can be expected for
certain clients since it is all software.

Harlequin: (Did not go to their presentation). Harlequin is now
established as the fourth major vendor. Their tightly integrated
programming environment plus add-on tools (like a production engine and
DB support) may set them somewhat apart from the two other big UNIX
vendors. They also showed a PC implementation which includes a minimal
development environment. It is sold with a package that allows
a Unix implementation (including CLIM II) to be deployed on a PC. On a
beefed-up PC (486/66 with lots of memory) CLIM seemed to be much faster
than on the Unix box. This looks like Symbolics' CLOE, which for
the moment seems to be put "on ice"....

MCL (No presentation): A first version of MCL 3(?) was shown, which
includes support for processes. Since I won't use MCL before I can
get a working version of CLIM II (not officially supported
by MCL), I did not look out for any other improvements. However, MCL
seems to be the only one that cares about multimedia (well, it is
in the Mac already). Yes, there is foreign character support.

Overall, all Lisp products add improvements, but nothing
earth-shaking could be observed. Lisp is still a big language, it is
still not standardized, and a lot of components never will be.  However,
it was pointed out that CLOS will probably be the first OOP language
to be standardized. Development environments (run-time and source
code tools) are getting there.

5. Lisp in Education
********************

Not discussed was Lisp as a great tool for educational software. The big
concern of many at the conference (and especially the chairman) was
how to bring LISP to the students. It was pointed out that we need a
cheap but fully working LISP (like CL/PC or MCL) and that people (all
LISP hackers out there) should go and teach classes wherever they
can.


6. Vendors' response, vendor bashing and the future of LISP
***********************************************************

Some friendly and not so friendly vendor bashing could be observed
and I had my share in it. Now I'm giving just my opinion, and that
covers about 50% of the world's LISP users :) ....

There are a lot of things we need:

(1) Smaller and faster LISPs

Things are improving (although vendors do not have enough resources;
e.g., many more man-years go into a Fortran compiler).

(2) Integration with foreign code

Some vendors do better than others, but linkable object code is
probably too hard to do with the limited resources of the Lisp world.

(3) Powerful Development Environments

Unless Symbolics gains a lot of new strength (and survives its
restructuring), most vendors will catch up some day.  However, there
is no interest in standardization (e.g. each vendor has its own
defsystem).

(4) Data sharing with foreign programs

Does not seem to be a real topic, except for CL/PC and MCL (?)

(5) Interface Tools

There is no "light" standard library, and CLIM still does not work
really well.

(6) Integration of Multimedia

None, except for MCL.

(7) Free runtimes

These exist in the PC world, but Unix vendors don't seem to see their
importance.

(8) Additional libraries

Persistent object dumping is nonexistent, with the exception of
Statice(?) from Symbolics. Actually, there was a flyer from a third
party (CLOD).  Few vendors offer other tools. A lot of documented,
high-quality tools could be offered by the user community (as in the
Fortran world), but there isn't any interest in that either. (A small
sketch of what such object dumping means follows this list.)
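
Just to illustrate what "persistent object dumping" means at its most
basic, here is a hand-rolled sketch (my own, not Statice or CLOD; the
class and function names are made up): write out an instance's slots
and rebuild the instance later.

  ;; Illustrative only: dump/restore of one simple CLOS class.
  (defclass note ()
    ((title :initarg :title :accessor note-title)
     (body  :initarg :body  :accessor note-body)))

  (defun dump-note (note path)
    (with-open-file (s path :direction :output :if-exists :supersede)
      (print (list :title (note-title note) :body (note-body note)) s)))

  (defun load-note (path)
    (with-open-file (s path)
      (apply #'make-instance 'note (read s))))

A real persistent object system would do this transparently, for
arbitrary object graphs, preserving identity and handling updates,
which is exactly what is missing outside Statice.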


So, overall, the record is pretty bad for the vendors
compared to what we already had on the Symbolics a long time ago.  The
management strategy seems to focus on taking market share away from
each other instead of trying to increase the global share of LISP and
LISP products.

The user community's attitude of individualism and of selling any
kind of good hack doesn't help either. So the argument that LISP is
the ideal glue for putting things together is only half as good as it
sounds, since things have to be programmed here that elsewhere can
simply be taken off the shelf....

The market penetration of LISP would be helped if the vendors worked
together on standardization (defsystem, processes, interface building,
foreign function calls, etc.). Right now, it just means trouble for me
to cooperate with somebody working with another vendor's system. At the
conference, the vendors said that it was up to us to provide them with
standards, but I don't want to wait another five years for that... I
just need something that works and that is standard. As somebody said,
"Most of that stuff is taken from Genera, so why not just copy it in
a uniform way ..."

More important: because of the lack of resources, the vendors should
set up task groups for add-on tools such as CLIM, fully integrated
production engines, and databases. The will for that seems to be
pretty low (in my opinion).  I was at a LISP conference in England
three years ago, and many users asked for a uniform interface tool,
among other things. The vendors' response was "We hear you loud and
clear". Three years later we may be able to get a CLIM that works (it
was promised for 1991), but this is not clear yet.

I know that all vendors are in a difficult situation and that it may
not be fair to ask for more "vision" and for more cooperation among
themselves.  But they don't seem to realize that the focus of advanced
computer software has changed. Of course it is a good thing to have
better compilers, but we also need increased support for what I call
"knowledge tools": tools which allow you to store, retrieve,
manipulate, visualize, analyze, etc. all sorts of information. These
need advanced graphics, widgets, presentations, a way to store data,
multimedia, animation, some inferencing, etc. Without a real
edge from using LISP, people will just switch to another language, or
more likely to a set of tools/languages hooked together
with data-communication protocols.  No vendor mentioned that kind of
program as a target for LISP.

From my personal experience, almost all the vendors are very friendly
and offer good user support. They also do a good job of improving
the LISP language and development facilities. But they lack a lot of
vision when it comes down to:
  (1) securing the future of LISP in new markets (interface programming,
persistent data, etc.). Asking the users may help, but they should
study (future) users' possible needs and NOT just do simple market
research like some famous Detroit-based companies have been doing....
  (2) helping the distribution of LISP products (free runtimes,
standardization of extended stuff like system-building tools, etc.)
  (3) pooling resources in areas where all are going in the same
direction (light interfaces, CLIM II) or which are not competitive
(e.g. defsystem, processes...)

7. Summary
**********

I liked this conference, because I had some nice tutorials, met some
real programmers and could see what the vendors are up to.  LISP will
definitely survive, since most hardware sold now can cope with it and
it is still a very powerful tool for computer science and AI research.
However, there is an increased danger that LISP won't keep the same
edge in applications programming (prototyping or not). In some areas
(like deployable interface-oriented programming) it is behind and may
fall further behind.

---
Daniel K.Schneider, TECFA (Educational Technologies and Learning)
Faculte de Psychologie et des Sciences de l'Education, University of Geneva,
9 route de Drize, 1227 Carouge(Switzerland),
Phone: (..41) 22 705 9694, Fax: (..41) 22 342 89 24

Internet:   ········@divsun.unige.ch  (and various national nets)    | if reply
X400:       S=schneide;OU=divsun;O=unige;PRMD=switch;ADMD=arcom;C=ch | does not
uucp:       mcvax!cui!divsun.unige.ch!shneider                       | work, try
BITNET:     ········@cgeuge51                                        | one of
DECNET:     UGUN2A::SCHNEIDE (local Swiss)                           | these
From: Carl R. Manning
Subject: Re: LUV-93 conference report (longish and dirty)
Date: 
Message-ID: <CAROMA.93Aug19133408@raisin-nut.ai.mit.edu>
In article <······················@news.unige.ch> ········@divsun.unige.ch (SCHNEIDER daniel) writes:
  [points out that the lisp community isn't providing `standardized'
   facilities for persistence, interprocess/interapplication/network
   communications, ...] 

Some rambling observations which may provoke some thoughts:

 o This might be generalized to say that the lisp community isn't
   providing interoperable (standard) facilities for systems with
   `distributed' processing, storage, and communications.  In process
   terms, interprocess (inter-address-space) and network
   communications, interapplication and database access.  In object
   terms, distributed (thus concurrent) objects and distributed
   persistent object bases.

 o C,C++,...Mach, Unix,.. provide facilities at the process level,
   through streams/ports/system calls/libraries.

 o C++ programs integrated with distributed object-oriented databases
   are also at an object level.  However, the model here is that
   objects are pulled into a process from the database to be traversed
   or operated on.

 o Some Smalltalk (and NeXT Objective-C?) implementations are providing
   distributed objects, to address this at the object level (e.g., I
   was surprised to see a new Smalltalk implementation supporting
   distributed objects at MacWorld).  The model is of message passing
   objects, which fits even when objects are distributed, in different
   processes or across the network: methods are chosen based on the
   type of just the receiver of the message, which is by definition
   local when the message is received.  These languages have been able
   to benefit from the OO popularity and the need for dynamically
   reconfigurable distributed systems (e.g., OMG's CORBA standards).

 o The CL/CLOS object model doesn't appear to fit the
   distributed/interprocess object environment well, since the generic
   function model of dispatching on multiple arguments assumes that
   the types of those arguments are available locally at the generic
   function (process thread), which may not be true if some arguments
   are distributed objects (in another address space and/or may change
   type concurrently).  [note that functions/closures are objects, too]

   This may have inhibited the development of object-level facilities,
   and/or may be inhibiting the choice of CL/CLOS by users developing
   applications requiring inter-address-space communication to make use
   of existing applications and servers.  (A minimal sketch of the
   dispatch contrast follows this list.)
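
As a minimal illustrative sketch of the contrast (the class names are
made up, not from either post): a Smalltalk-style message needs only
the receiver's class, while a CLOS generic function may need the
classes of several arguments before it can pick a method.

  ;; Illustrative only: CLOS dispatch needs the classes of BOTH arguments.
  (defclass savings-account () ())
  (defclass checking-account () ())
  (defclass usd () ())
  (defclass chf () ())

  (defgeneric interest-rate (account currency))

  (defmethod interest-rate ((a savings-account) (c usd)) 0.03)
  (defmethod interest-rate ((a savings-account) (c chf)) 0.02)
  (defmethod interest-rate ((a checking-account) (c t)) 0.0)

  ;; (interest-rate (make-instance 'savings-account) (make-instance 'chf))
  ;; => 0.02
  ;; If either argument were a proxy for an object living in another
  ;; address space, the local dispatcher could not pick a method without
  ;; first asking the remote process for that object's class.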

The majority of sophisticated applications aren't closed,
self-contained worlds anymore.  Creating foreign function interfaces
and small-footprint applications (which can share the machine with
other apps concurrently) (a la Dylan) is a step towards catching up
with C/C++ today (but it will likely always be easier to link up with
class libraries in the same language).  Having a story about robustly
(`intelligently', maintainably) dealing with (mobile) persistent
distributed objects will be important for Lisp to have a future.  It
is perhaps a better opportunity to show value added over its
competitors (e.g., C/C++), as inter-address-space communication
standards (perhaps starting with OMG's CORBA or IBM's extension, SOM)
may be less language-dependent than library or class library interfaces.

(I just raise the issue, I haven't worked out a proposal for such a story.)
Just some thoughts.

Cheers,

CarlManning