From: Steven G. Chappell
Subject: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1991Mar26.163917.14641@unhd.unh.edu>
Hi
	Over the years we have been working on an experimental autonomous
vehicle (EAVE) of the submersible persuasion. In the early stages, effort was
centered on the "traditional" robotics issues of positioning and motion
control, with a very heavy emphasis directed at the realtime aspects of same.
Subsequent work centered on an architecture which provides the capability for
adding new functionality to the already existing vehicle system. Recent work
dealt with the business of incorporating concepts from the AI community into
the vehicle runtime system so as to render it more "intelligent" (less stupid).
This is a nontrivial business since traditional embedded systems don't support
symbolic programming environments and such environments don't mix well with
"realtime" operation. Thus, my general information request:

By what methods can "AI" algorithms be installed in an embedded system?

Methods we have examined or heard about:
    software:
      augment embedded system with library which supports symbolic functionality
        develop in some extension of C, compile, download, run
      automated translation of symbolic code to supported code (C)
        develop in LISP, translate, compile, download, run
      utilize C++ (is C++ an adequate "AI" environment?)
        develop in C++, compile, download, run
      rehost the symbolic environment to the embedded system
        develop in LISP, download, run
    hardware: (is this really possible?)
      install LISP capable subsystem in target bus

In particular, we went the "rehost" route: transporting the University of
Utah's Portable Common LISP Subset from an HP Bobcat to our particular
development system and subsequently to our vehicle system. This has worked
to a degree, but it is not without its warts.

What experiences have you all had or heard about in regards to this?

Please email responses.
-- 
					Marine Systems Engineering Laboratory
					Marine Program Building
					University of New Hampshire
···@msel.unh.edu			Durham, NH  03824-3525

From: Phil Spelt
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1991Mar27.151129.8754@cs.utk.edu>
In article 777, Steven Chappell inquires about embedded AI, suggesting
several possible alternatives, and reports choosing "rehost" as their
solution.  My question:  Why stay with LISP?????

Concerning "embedded architecture" robotic control, we here at the Center
for Engineering Systems Advanced Research (CESAR) have been using just
such an architecture for our autonomous mobile robots for the past 4 or 5
years.  We have chosen the "rehost" solution, also, but with entirely
different directions:

Our hardware configuration is an industrialized IBM PC/AT: an 80286 host
running 16 nodes of an NCube hypercube parallel computer.  All code
is written in 'C' (er, well, ONE program on the host is in FORTRAN).  We use
an expert system to control both navigation and machine learning.  This is
created in CLIPS, which runs in two versions on two different nodes (one for
navigation and one for learning).  CLIPS provides the source code, which we
ported to the node environment.  It also provides "LISP-like" rule construction,
but with (IMHO  8=) ) much better mathematical computation ability on the RHS.
Our robot runs around in an "unstructured", dynamic environment, locates a 
control panel, and learns to manipulate the panel to shut off a "DANGER" light.
All this is in "real time" -- the limiting factors are the speed of the arms
and of the vision processing.  The ESs perform at a MUCH faster speed than the
mechanical parts of the robot.

I repeat my question:  WHY insist on LISP?????

Phil Spelt, Cognitive Systems & Human Factors Group  ···@epm.ornl.gov
============================================================================
Any opinions expressed or implied are my own, IF I choose to own up to them.
============================================================================
From: Jeff Dalton
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <4471@skye.ed.ac.uk>
In article <·····················@cs.utk.edu> ···@mars.epm.ornl.gov (Phil Spelt) writes:
>In article 777, Steven Chappell inquires about embedded AI, suggesting
>several possible alternatives, and reports choosing "rehost" as their
>solution.  My question:  Why stay with LISP?????

Why not?  Some people like Lisp.  Since we don't insist that you use
the language we like, why give us a hard time for not using the
language you like?

>It also provides "LISP-like" rule construction, but with (IMHO  8=) ) 
>much better mathematical computation ability on the RHS.

It's not clear what this means, since Lisp doesn't have rules.
From: Bob Kitzberger @sation
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1234@telesoft.com>
In article <····@skye.ed.ac.uk>, ····@aiai.ed.ac.uk (Jeff Dalton) writes:
> 
> Why not?  Some people like Lisp.  Since we don't insist that you use
> the language we like, why give us a hard time for not using the
> language you like?

It doesn't seem to me like it's a question of 'liking' a language or not.
Jeez, we're supposed to be _engineers_, with the wisdom to select the correct
tool for its merits, not because of our subjective preferences.

I missed the original post, but if the poster wants to embed LISP in a hard
real-time application, there are valid engineering questions that need
to be asked.

I don't profess to be anything near a LISP expert, but I'll toss out
a few questions you need to ask yourself.  Can you calculate guaranteed 
worst-case times for LISP routines in the presence of interrupts?  How 
about the dynamic memory allocation algorithms used in LISP -- are they 
non-deterministic?   How can you protect LISP data structures in the 
presence of concurrency?  Is reclamation of stale heap space performed?
I don't mean to imply that LISP fails in these areas.

Worst-case analysis is necessary for hard real-time applications.  You're
charting new territory if you use LISP in hard real-time, IMHO.

	.Bob.
-- 
Bob Kitzberger               Internet : ···@telesoft.com
TeleSoft                     uucp     : ...!ucsd.ucsd.edu!telesoft!rlk
5959 Cornerstone Court West, San Diego, CA  92121-9891  (619) 457-2700 x163
------------------------------------------------------------------------------
"Wretches, utter wretches, keep your hands from beans!"	-- Empedocles
From: Ken Dickey
Subject: Re^2: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <473@data.UUCP>
···@telesoft.com (Bob Kitzberger @sation) writes:

>I don't profess to be anything near a LISP expert, but I'll toss out
>a few questions you need to ask yourself. 

> Can you calculate guaranteed 
>worst-case times for LISP routines in the presence of interrupts?  

Yes.  Just like C/asm/whatever.


>How 
>about the dynamic memory allocation algorithms used in LISP -- are they 
>non-deterministic?

One has to be a bit smart about this.  The standard real-time strategy
for any language is static (pre)allocation.  For dynamic allocation,
one typically maintains a "free-list" of the appropriate storage size
to guarantee maximum allocation times [even in C, one tends to use
power-of-2 fixed-size heaps and a buddy-system allocation rather than
first-fit].
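The free-list point is easy to make concrete.  The sketch below is in C
(the thread's usual point of comparison); all names are illustrative, not
taken from any system mentioned here.  It is a fixed-size block allocator:
both alloc and free are O(1) pointer swaps, so the worst-case time is a
small constant with no searching or splitting.

```c
#include <stddef.h>
#include <assert.h>

/* Fixed-size block allocator: a singly linked free list threaded
   through a static pool.  Alloc and free each touch one pointer,
   so worst-case execution time is trivially bounded. */

#define BLOCK_SIZE  64          /* bytes per block (power of 2)   */
#define NUM_BLOCKS  128         /* statically reserved at startup */

typedef union block {
    union block *next;          /* valid only while the block is free */
    unsigned char data[BLOCK_SIZE];
} block_t;

static block_t pool[NUM_BLOCKS];
static block_t *free_list;

void pool_init(void)
{
    for (size_t i = 0; i < NUM_BLOCKS - 1; i++)
        pool[i].next = &pool[i + 1];
    pool[NUM_BLOCKS - 1].next = NULL;
    free_list = &pool[0];
}

void *pool_alloc(void)          /* O(1): pop the list head */
{
    block_t *b = free_list;
    if (b != NULL)
        free_list = b->next;
    return b;
}

void pool_free(void *p)         /* O(1): push onto the list head */
{
    block_t *b = p;
    b->next = free_list;
    free_list = b;
}
```

The same structure works per size class; a buddy system adds splitting and
coalescing but keeps the bound logarithmic in the pool size.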


>   How can you protect LISP data structures in the 
>presence of concurrency?

Same way as for any other language.  As a matter of fact, if you have
first-class continuations you can write a much smaller OS kernel.**


>  Is reclamation of stale heap space performed?

In any hard-real-time systems I have seen, storage allocations are all
precalculated.  To allow general GC, one (A) Calculates the size of
the maximum heap needed by any processing between GC's, and (B)
Schedules GC just as any other task to be performed.
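The (A)/(B) recipe has a degenerate but instructive analog even in C:
size an arena for the worst-case allocation needed between collections,
and make reclamation a constant-time operation scheduled like any other
task.  A minimal sketch, with hypothetical names not drawn from any
system in the thread:

```c
#include <stddef.h>
#include <assert.h>

/* Per-cycle arena: ARENA_BYTES is the precalculated worst-case heap
   needed between "collections" (step A); reclamation is an O(1) reset
   run as an ordinary scheduled task (step B). */

#define ARENA_BYTES 4096

static unsigned char arena[ARENA_BYTES];
static size_t arena_used;

void *arena_alloc(size_t n)
{
    n = (n + 7u) & ~(size_t)7u;          /* 8-byte alignment */
    if (arena_used + n > ARENA_BYTES)
        return NULL;                     /* worst-case bound exceeded */
    void *p = &arena[arena_used];
    arena_used += n;
    return p;
}

void arena_collect(void)                 /* schedule like any task */
{
    arena_used = 0;                      /* constant-time "GC"     */
}
```

A real incremental collector does far more, but the scheduling argument is
the same: if the collector's worst-case cost per invocation is known, it is
just one more task in the deadline analysis.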

In general, compilers for modern Lisps such as Scheme are comparable
to C in code speed/space, use somewhat larger representations for heap
objects, are more memory efficient in general (because there is no
extra code for storage management decisions), and are much more
flexible with fewer bugs (no pointer abstraction => don't dereference
bogus pointers, etc).  Real-time usages and problems are the same as
for other languages.

-Ken Dickey					····@data.uucp
----------------------
**
%A Mitchell Wand
%T Continuation-Based Multiprocessing
%J Conference Record of the 1980 Lisp Conference
%P 19-28
%I The Lisp Conference
%D 1980
From: Jeff Dalton
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <4495@skye.ed.ac.uk>
In article <····@telesoft.com> ···@telesoft.com (Bob Kitzberger @sation) writes:
>In article <····@skye.ed.ac.uk>, ····@aiai.ed.ac.uk (Jeff Dalton) writes:
>> 
>> Why not?  Some people like Lisp.  Since we don't insist that you use
>> the language we like, why give us a hard time for not using the
>> language you like?
>
>It doesn't seem to me like it's a question of 'liking' a language or not.
>Jeez, we're supposed to be _engineers_ with the wisdom to select the correct
>tool for its merits, not because of our subjective preferences.

Unfortunately, engineering reasons seldom tell you the single right
tool to use.  This doesn't stop some people from claiming that they do
or from being convinced themselves.  In the end, it often comes down
to whether you value, say, a certain degree of flexibility over a
certain degree of efficiency, or vice versa, and to what you're used
to.  In many cases, some people will find Lisp better than C and
others will find C better than Lisp.  This doesn't mean one of them
must be wrong.

This is not entirely subjective, but neither is it entirely objective.

>I missed the original post, but if the poster wants to embed LISP in a hard
>real-time application, there are valid engineering questions that need
>to be asked.

Just so, and for embedded real time it may be that Lisp is unsuitable.

>I don't profess to be anything near a LISP expert, but I'll toss out
>a few questions you need to ask yourself.  Can you calculate guaranteed 
>worst-case times for LISP routines in the presence of interrupts?  How 
>about the dynamic memory allocation algorithms used in LISP -- are they 
>non-deterministic?   How can you protect LISP data structures in the 
>presence of concurrency?  Is reclamation of stale heap space performed?
>I don't mean to imply that LISP fails in these areas.

I will leave this to the experts on various Lisp systems.  Lisp can
certainly come close, but few if any implementations aim to be
suitable for embedded real-time use.

-- Jeff
From: Larry Baker
Subject: Re: ..."AI" in realtime embedded systems.
Date: 
Message-ID: <1991Apr16.175901.21879@csl.dl.nec.com>
I don't remember if I posted this previously, or what, but I'm posting
it again, since the train of postings seems to warrant it.

You *can* embed "AI" (or Knowledge-Based Systems, really) in a "hard"
real-time application.  Take a look at the following article:

"Real-Time Data Acquisition at Mission Control," Muratore, Heindel, Murphy,
Rasmussen and McFarland, CACM December 1990, v33, no. 12, p. 19.

I don't want to start a religious war about KBS vs. AI.  Their technique
is applicable to either domain.  It's worth pointing out, though, that
the KBS/AI application is, in their approach, a "soft" real-time application,
with a predicted latency varying between 1 and 3 seconds. The data-collection
and conditioning subsystem is the "hard" application, with a required latency
of no more than 1/4 second to clear telemetry data buffers.

The point is that if you can dissect your system into "hard" and
"soft" real-time subsystems, and keep the AI in the "soft" part of the
system, then you're OK; if you have hard real-time requirements that cannot
fit within the latency predictions for the AI environment, then you're sunk.

At least until the next generation of embeddable hardware is available ;-).
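For concreteness, one common shape for that hard/soft split can be
sketched in C (made-up names; this is an illustration, not the Mission
Control design): the hard acquisition task pushes samples into a
single-producer/single-consumer ring buffer in bounded time, and the soft
AI consumer drains it whenever it gets the CPU.

```c
#include <stddef.h>
#include <stdbool.h>
#include <assert.h>

/* SPSC ring buffer decoupling a "hard" acquisition task (must clear
   its buffers on a fixed deadline) from a "soft" AI consumer that may
   lag by seconds.  A power-of-2 size lets indices wrap with a mask;
   with exactly one producer and one consumer no locks are needed.
   (volatile is a simplification here; a production version would use
   C11 atomics for the index handoff.) */

#define RING_SIZE 256                   /* must be a power of 2 */
#define RING_MASK (RING_SIZE - 1)

typedef struct { double telemetry; } sample_t;

static sample_t ring[RING_SIZE];
static volatile size_t head, tail;      /* head: producer, tail: consumer */

bool ring_put(sample_t s)               /* hard side: O(1), never blocks */
{
    if (head - tail == RING_SIZE)
        return false;                   /* full: drop, but meet deadline */
    ring[head & RING_MASK] = s;
    head++;
    return true;
}

bool ring_get(sample_t *out)            /* soft side: drains when it can */
{
    if (head == tail)
        return false;
    *out = ring[tail & RING_MASK];
    tail++;
    return true;
}
```

The hard side's worst case is a handful of instructions regardless of how
far behind the AI task has fallen; sizing RING_SIZE for the maximum lag is
the only coupling between the two timing analyses.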
--
Larry Baker
NEC America C&C Software Laboratories, Irving (near Dallas), TX
·····@texas.csl.dl.nec.com  cs.utexas.edu!necssd!baker
From: Donald A. Varvel
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1334@muleshoe.cs.utexas.edu>
In article <····@telesoft.com> ···@telesoft.com (Bob Kitzberger @sation) writes:

>I missed the original post, but if the poster wants to embed LISP in a hard
>real-time application, there are valid engineering questions that need
>to be asked.

>I don't profess to be anything near a LISP expert, but I'll toss out
>a few questions you need to ask yourself.  Can you calculate guaranteed 
>worst-case times for LISP routines in the presence of interrupts?  How 
>about the dynamic memory allocation algorithms used in LISP -- are they 
>non-deterministic?   How can you protect LISP data structures in the 
>presence of concurrency?  Is reclamation of stale heap space performed?
>I don't mean to imply that LISP fails in these areas.

>Worst-case analysis is necessary for hard real-time applications.  You're
>charting new territory if you use LISP in hard real-time, IMHO.

Funny you should mention it.  Yesterday I happened to be
at TI Dallas.  Folks there mentioned with some pride a
hard real-time system being developed in LISP.  They didn't
seem to want to talk much about details, but two points
that were mentioned were incremental garbage collection
and worst-case time guarantees for functions.

Very few off-the-shelf systems of any sort are suitable
for hard real-time applications.  Hardware designers seem
particularly prone to confuse *fast* with *guaranteed worst-
case performance*.  That tends to land us with "real-time"
processors with 17-level caches.

"Robotics" is often done by AI sorts, in LISP.  "Equipment
control" is often done by real-time sorts, in assembly.
It is clear to *me*, at least, that the two must eventually
evolve together.  If the worst problem we have in finding 
a reasonable meeting ground is producing a real-time LISP, 
we should count ourselves lucky.

-- Don Varvel (······@cs.utexas.edu)
From: Rick Smith
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1991Mar27.195127.3156@sctc.com>
···@msel.unh.edu (Steven G. Chappell) writes:

>By what methods can "AI" algorithms be installed in an embedded system?

The same way as any other algorithm...

Once you know how you want the system to operate there's nothing to
prevent you from porting your ideas from LISP to a conventional
embedded programming language. It's the approach I used when I did
thesis research in robotics.

On the other hand, if you don't know what '"AI" algorithms' you
want to use, then pouring effort into a platform with an "embedded
LISP" environment will produce at best illusory progress.

When I think of '"AI" algorithms' I think of things like rule based
systems, constraint systems, heuristic search strategies, frame
systems, etc. Any or all of these can be implemented just fine
in non-LISP environments. Personally, I believe that it's better
not to use LISP since LISP can mask some resource issues (i.e.
memory usage) that you should be sure to solve if you're applying
such algorithms to an embedded system problem.
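For what it's worth, a toy illustration of the point that a rule-based
system needs no LISP runtime: a forward-chaining engine over boolean
facts, entirely in static C storage.  The facts and rules are invented
for the example (loosely echoing the "DANGER light" robot task upthread).

```c
#include <stdbool.h>
#include <stddef.h>
#include <assert.h>

/* Toy forward-chaining rule engine: facts are flags, a rule fires when
   every condition holds and asserts one new fact, and the engine loops
   to a fixed point.  Everything is statically allocated. */

enum { F_DANGER_LIGHT, F_PANEL_LOCATED, F_ARM_IN_REACH,
       F_PRESS_BUTTON, F_TASK_DONE, NUM_FACTS };

typedef struct {
    int conds[3];                       /* antecedent facts */
    int n_conds;
    int conseq;                         /* fact asserted on firing */
} rule_t;

static const rule_t rules[] = {
    { { F_DANGER_LIGHT, F_PANEL_LOCATED }, 2, F_PRESS_BUTTON },
    { { F_PRESS_BUTTON, F_ARM_IN_REACH  }, 2, F_TASK_DONE    },
};

void run_rules(bool facts[NUM_FACTS])
{
    bool changed = true;
    while (changed) {                   /* iterate to a fixed point */
        changed = false;
        for (size_t r = 0; r < sizeof rules / sizeof rules[0]; r++) {
            bool fire = !facts[rules[r].conseq];  /* skip fired rules */
            for (int c = 0; c < rules[r].n_conds; c++)
                fire = fire && facts[rules[r].conds[c]];
            if (fire) {
                facts[rules[r].conseq] = true;
                changed = true;
            }
        }
    }
}
```

A production system like CLIPS adds pattern matching, variables, and a
Rete network for efficiency, but the control structure is the same, and
with a fixed rule base the iteration count, and hence the worst-case
time, is bounded.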

A potential weakness to my arguments is that '"AI" algorithms' are
most often described using LISP, so you need to know how to see
"under" the LISP in order to implement them in a non-LISP environment.
Still, I encourage this since it forces you to understand what
you're doing... and believe me, you _can_ do AI work even if you do
understand what you're doing!

Rick.
·····@sctc.com    Arden Hills, Minnesota
From: Larry Baker
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <1991Mar30.003044.3941@csl.dl.nec.com>
There's a recent article in CACM that may be of interest here.

"Real-time Data Acquisition at Mission Control," Muratore, Heindel, Murphy,
Rasmussen and McFarland; CACM December 1990, v33 no 12.

One of the issues they faced was integrating "expert systems" (using CLIPS)
and a fairly hardcore realtime data acquisition problem.  I don't know the
details (e.g. how CLIPS compares to the other so-called "AI" tools), though.

--
Larry E. Baker, Jr.
NEC America, Inc.  C&C Software Laboratories
1525 Walnut Hill Ln., Irving, TX
(214) 518-3489
·····@texas.csl.dl.nec.com 		-or-	cs.utexas.edu!necssd!baker
From: Jack J. Woehr
Subject: Re: "Easy" way to put "AI" in realtime embedded systems?
Date: 
Message-ID: <23896@well.sf.ca.us>
···@msel.unh.edu (Steven G. Chappell) writes:

>symbolic programming environments and such environments don't mix well with
>"realtime" operation. Thus, my general information request:

>By what methods can "AI" algorithms be installed in an embedded system?

	There have been several successful LISP >> Forth ports: GE's DELTA
and somebody's (NASA?) autorouter (Symbolics >> RTX2000), to name two.
Performance vastly increased in both cases.

-- 
 <···@well.{UUCP,sf.ca.us} ><  Member, >        /// ///\\\    \\\  ///
 <········@lll-winken.arpa >< X3J14 TC >       /// ///  \\\    \\\/// 
 <JAX on GEnie             >< for ANS  > \\\  /// ///====\\\   ///\\\ 
 <SYSOP RCFB (303) 278-0364><  Forth   >  \\\/// ///      \\\ ///  \\\