From: Barry Margolin
Subject: Re: reducing number consing
Date: 
Message-ID: <barmar-E25D00.02021314122003@netnews.attbi.com>
In article <···················@twister.nyroc.rr.com>,
 Cliff Crawford <·····@cornell.edu> wrote:

> I'm trying to write a simple neural network simulation in Lisp, but
> I'm having trouble with number consing in the inner loop in the
> training function.  According to SBCL's profiler, the function
> feed-forward (see below) conses 32 bytes every time it's called, but
> I'd like it to not cons at all if possible.  Does anyone see what's
> wrong with it?  TIA...

There's probably some boxing and unboxing of floats going on.  SUM is 
initialized to 0d0, but without a type declaration the compiler may not 
be able to prove it stays a double-float across iterations.  Declaring 
the type of SUM would likely allow the computations in the inner loop 
to be open-coded.
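
A sketch of what I mean, untested: LOOP's :of-type keyword declares the 
accumulator's type inline, which should let SBCL keep SUM unboxed in the 
inner loop (the variable names are from your code above):

```lisp
;; Inner dot-product loop with the accumulator's type declared.
;; :of-type double-float tells the compiler SUM is always a
;; double-float, so the additions need not box intermediate results.
(loop :with sum :of-type double-float = 0d0
      :for j :below (length layer1) :do
      (incf sum (* (aref weights i j)
                   (aref layer1 j)))
      :finally (return sum))
```

You can check whether it worked by compiling with (speed 3) and looking 
for "doing float to pointer coercion" notes from the compiler; if they're 
gone from the inner loop, the consing there should be too.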
 
> (deftype net-vector () `(simple-array double-float))
> (deftype net-matrix () `(simple-array double-float 2))
> (deftype net-activation-function () `(member :linear :logistic :exp :tanh))
> 
> (defun feed-forward (layer1 layer2 weights biases activation-function)
>   (declare (optimize (speed 3) (safety 0))
>            (net-vector layer1 layer2 biases)
>            (net-matrix weights)
>            (net-activation-function activation-function))
>   (loop :for i :below (length layer2) :do
>     (let ((input (+ (aref biases i)
>                     (loop :with sum = 0d0
>                           :for j :below (length layer1) :do
>                           (incf sum (* (aref weights i j)
>                                        (aref layer1 j)))
>                           :finally (return sum)))))
>       (setf (aref layer2 i) (case activation-function
>                               (:linear input)
>                               (:logistic (logistic input))
>                               (:exp (exp input))
>                               (:tanh (tanh input)))))))

-- 
Barry Margolin, ······@alum.mit.edu
Woburn, MA