From: Bruno Haible
Subject: Re: Book Review: _Object-Oriented Common LISP_, by S. Slade
Date: 
Message-ID: <61bm8g$pdq$1@nz12.rz.uni-karlsruhe.de>
Barry Margolin <······@bbnplanet.com> wrote:
> If you're referring to code like:
>
> (defun with-open-file-helper (body-fun filename &rest open-options)
>   (let ((stream nil))
>     (unwind-protect
>         (progn (setq stream (apply #'open filename open-options))
>                (funcall funarg stream))
                          s/funarg/body-fun/
>       (when stream
>         (close stream)))))
>
> (defmacro with-open-file ((var filename &rest open-options) &body body)
>   `(flet ((.with-open-file-internal. (,var) ,@body))
>      (apply #'with-open-file-helper #'.with-open-file-internal.
>             ,filename ,@open-options)))
                        s/,@/,/
>
> I happen to agree with him that this is the better way to implement macros
> when feasible.  Its main advantage is that you can make changes to the
> -HELPER functions and they'll take effect without requiring the user to
> recompile all the calling functions.  It also reduces code bloat; 

But it also has disadvantages:

  - If the user code signals an error within `body', the programmer
    will most likely be informed about an error in function
    .WITH-OPEN-FILE-INTERNAL. Similarly, when profiling, the metering
    code will display a function .WITH-OPEN-FILE-INTERNAL. (or even
    several of them) as consuming time. I therefore find it a bad
    habit to wrap user code in functions with dummy names.

  - Two function calls.

  - Variables defined outside and used inside the body will be forced
    into a closure.

These disadvantages are all avoided by the gensym approach. In this approach
you can also reduce code bloat by packing large pieces of the code into
functions.
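
For concreteness, a minimal sketch of the gensym approach (my
illustration only; the real ANSI WITH-OPEN-FILE must also handle the
:ABORT argument to CLOSE, which I omit):

;; The body is spliced directly into the expansion: errors and
;; profiling time are attributed to the calling function, there are
;; no extra function calls, and no closure is created.  The gensym
;; keeps the cleanup form safe even if `body' assigns to VAR.
(defmacro with-open-file ((var filename &rest open-options) &body body)
  (let ((stream (gensym "STREAM")))
    `(let ((,stream nil))
       (unwind-protect
           (progn (setq ,stream (open ,filename ,@open-options))
                  (let ((,var ,stream))
                    ,@body))
         (when ,stream (close ,stream))))))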

                     Bruno

From: Kent M Pitman
Subject: Re: Book Review: _Object-Oriented Common LISP_, by S. Slade
Date: 
Message-ID: <sfw201yffyw.fsf@world.std.com>
······@ma2s2.mathematik.uni-karlsruhe.de (Bruno Haible) writes:

> But it also has disadvantages:
> 
> - If the user code signals an error within `body', the programmer
> will most likely be informed about an error in function
> .WITH-OPEN-FILE-INTERNAL. Similarly, when profiling, the metering
> code will display a function .WITH-OPEN-FILE-INTERNAL. (or even
> several of them) as consuming time. I therefore find it a bad
> habit to wrap user code in functions with dummy names.

This is not portably solvable in CL right now, but note that there exist
solutions.  In Symbolics Genera, you can declare a function to be an 
"error reporter" and such forms are skipped by the debugger; however,
it is simply an implementation defect if the compiler doesn't usefully
report this context.  Many implementations will report things like
 Error in function (:INTERNAL MY-READ-FILE 0 .WITH-OPEN-FILE-INTERNAL.)
and the user can tell that it's both "inside MY-READ-FILE", in the first
of maybe several generated definitions AND inside a named internal
definition whose name is .WITH-OPEN-FILE-INTERNAL.
If the user can't locate the specific part, he's no worse off than knowing
he's in MY-READ-FILE.  So I don't really buy this as a disadvantage.

Further, a profiling tool which doesn't know how to attribute internal
functions to the containing function needs work.  Report such problems
as bugs and I bet your vendor will work on it.  Time SHOULDN'T be
accumulated under a single function name when it's an internal
function; that's an out-and-out bug.

> - Two function calls.

This is an engineering choice.  In a tight inner loop, it matters.  In most
cases where you're about to open a file for processing, the opening of the
file at all and probably of any of the character-read operations will utterly
dwarf the two function calls so I can't IMAGINE the cost of two function
calls will ever matter--though there are other situations where it might.
In the case under consideration, the value of redefinability enormously
exceeds the cost in execution speed; in other cases that's not true.
It's good to have both tools available.

> - Variables defined outside and used inside the body will be forced
> into a closure.

Actually, this third one is fixed by correcting the code Barry supplied to
be the following, which permits a clever implementation to allocate the
closure on the stack, so no consing is required:

 (defmacro with-open-file ((var filename &rest open-options) &body body)
   `(flet ((.with-open-file-internal. (,var) ,@body))
      (declare (dynamic-extent #'.with-open-file-internal.))
      (with-open-file-helper #'.with-open-file-internal.
                             ,filename ,@open-options)))

[Your correction with the ",@" => "," wasn't correct, btw.  It does want a ,@
 but it does not want to use apply.  APPLY is needed when args are
 merged at run time, not at macro-expansion time.]
[Also, personally, I always use the name CALL-WITH-xxx as the companion 
 function for WITH-xxx macros so I feel 
 comfortable documenting them. e.g., CALL-WITH-OPEN-FILE.]
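[Under that convention, and with the FUNARG typo Bruno pointed out
 fixed, Barry's helper would read, for example:

 (defun call-with-open-file (body-fun filename &rest open-options)
   (let ((stream nil))
     (unwind-protect
         (progn (setq stream (apply #'open filename open-options))
                (funcall body-fun stream))
       (when stream
         (close stream)))))

 Here APPLY *is* right, because OPEN-OPTIONS really is a list at
 run time.]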

> These disadvantages are all avoided by the gensym approach. In this approach
> you can also reduce code bloat by packing large pieces of the code into
> functions.

With the gensym approach, I find the problem comes when I redefine the
details of the expansion and have to recompile my whole system.  I hate
that.  But the point is,
again, to realize there is more than one answer.  The good thing is to know
your options and choose what's good for you.
From: Joerg Hoehle
Subject: Re: Book Review: _Object-Oriented Common LISP_, by S. Slade
Date: 
Message-ID: <61qp4f$kvb@omega.gmd.de>
Bruno Haible (······@ma2s2.mathematik.uni-karlsruhe.de) wrote:
: Barry Margolin <······@bbnplanet.com> wrote:
[Example of a macro with an auxiliary function and &body passed as a
FUNARG to that function, versus "traditional" macros using GENSYM and
expanding everything to primitives available in CL]

: But it also has disadvantages:
[...]

:   - Two function calls.
[   - inner variables may end up in closures]

This brings us to a very long history of macro use in Lisp.  My
impression is that, in the not-so-distant past, people wrote macros
where Norvig's book advertises "use functions and inline", because of
the cost of creating closures and calling them.  Thus people who
want(ed?) fast code wrote functions avoiding #'lambda, which typically
led either to many top-level defuns (avoiding closure creation at
run-time) or to an imperative/iterative style instead of a
functional/mapping one.

What do you think is faster (assuming result order doesn't matter)?
(let ((result ()))
  (dolist (e ... result)
    (push ... result)))
(loop for e in ...
      collect ...)
(mapcar #'(lambda (e) ...) ...)
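
One way to find out on your implementation (a sketch; (* E E) and the
input list merely stand in for the elided forms above):

;; Hypothetical micro-benchmark: time each idiom on the same input.
(defparameter *input* (loop for i below 100000 collect i))

(defun via-dolist (list)
  (let ((result ()))
    (dolist (e list result)
      (push (* e e) result))))

(defun via-loop (list)
  (loop for e in list collect (* e e)))

(defun via-mapcar (list)
  (mapcar #'(lambda (e) (* e e)) list))

;; (time (via-dolist *input*)), (time (via-loop *input*)), etc.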

So, to address what IMHO is the real root of the problem:

What may make function calls and lambdas slow, and how can we make
them faster?
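
Two standard-CL levers exist, sketched below (both are hints a
compiler is free to ignore):

;; DYNAMIC-EXTENT allows the closure to be stack-allocated instead
;; of heap-consed:
(defun sum-via-closure (list)
  (let ((sum 0))
    (flet ((add (e) (incf sum e)))
      (declare (dynamic-extent #'add))
      (mapc #'add list))
    sum))

;; INLINE allows calls to a small function to be open-coded, removing
;; the call overhead entirely:
(declaim (inline add1))
(defun add1 (x) (1+ x))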


: These disadvantages are all avoided by the gensym approach. In this approach
: you can also reduce code bloat by packing large pieces of the code into
: functions.
I don't follow you here.

	Jörg Höhle.
············@gmd.de		http://zeus.gmd.de/~hoehle/amiga-clisp.html


PS: For a real example of inlining (not macros), given:
;; it is so simple it should be inlined
(declaim (inline graph-nodes))
(defun graph-nodes (states successors
		    &optional (state= #'eql) old-states)
  (declare (type list states old-states))
  ;; Partially converted from recursion to iteration, to avoid APPEND
  (dolist (state states old-states)
    (unless (member state old-states :test state=)
      (setq old-states
	    (graph-nodes (funcall successors state) successors state=
			 (cons state old-states))))))
compare:
(defun get-all-subs-initial1 (partition hierarchy)
  (flet ((subs (part)
	   (partition-subs (get-partition part hierarchy))))
    (graph-nodes (subs partition) #'subs #'eq)))
with:
(defun get-all-subs-initial2 (partition hierarchy &optional accu)
  (declare (type list accu))
  (dolist (part (partition-subs (get-partition partition hierarchy)) accu)
    (unless (member part accu :test #'eq)
      (setq accu
	    (get-all-subs-initial2 part hierarchy (cons part accu))))))
If your compiler supports inlining, what's the speed (and space)
difference between the first and the second version?
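
A sketch of how one might check, with placeholder arguments
(SOME-PARTITION and SOME-HIERARCHY are not shown here):

(time (get-all-subs-initial1 some-partition some-hierarchy))
(time (get-all-subs-initial2 some-partition some-hierarchy))
;; DISASSEMBLE shows whether GRAPH-NODES really was open-coded:
(disassemble 'get-all-subs-initial1)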