From: Alexey Dejneka
Subject: Re: reducing number consing
Date: 
Message-ID: <m365gjlw81.fsf@comail.ru>
Cliff Crawford <·····@cornell.edu> writes:

> I'm trying to write a simple neural network simulation in Lisp, but
> I'm having trouble with number consing in the inner loop in the
> training function.  According to SBCL's profiler,

Implementation-specific questions are better asked on its
mailing-list.

> the function
> feed-forward (see below) conses 32 bytes every time it's called, but
> I'd like it to not cons at all if possible.  Does anyone see what's
> wrong with it?  TIA...
> 
> (deftype net-vector () `(simple-array double-float))
> (deftype net-matrix () `(simple-array double-float 2))
> (deftype net-activation-function () `(member :linear :logistic :exp :tanh))
> 
> (defun feed-forward (layer1 layer2 weights biases activation-function)
>   (declare (optimize (speed 3) (safety 0))
>            (net-vector layer1 layer2 biases)
>            (net-matrix weights)
>            (net-activation-function activation-function))
>   (loop :for i :below (length layer2) :do
>     (let ((input (+ (aref biases i)
>                     (loop :with sum = 0d0
>                           :for j :below (length layer1) :do
>                           (incf sum (* (aref weights i j)
>                                        (aref layer1 j)))
>                           :finally (return sum)))))
>       (setf (aref layer2 i) (case activation-function
>                               (:linear input)
>                               (:logistic (logistic input))
>                               (:exp (exp input))
>                               (:tanh (tanh input)))))))

Put (DECLAIM (INLINE LOGISTIC)) before the definition of LOGISTIC: a
non-inlined call has to box its DOUBLE-FLOAT return value, which is
the likely source of the consing. You /may/ also need to change
FEED-FORWARD slightly:

(defun feed-forward (layer1 layer2 weights biases activation-function)
  (declare (optimize (speed 3) (safety 0))
           (net-vector layer1 layer2 biases)
           (net-matrix weights)
           #+nil (net-activation-function activation-function)          ; <---
  )
  (loop :for i :below (length layer2) :do
    (let ((input (+ (aref biases i)
                    (loop :with sum = 0d0
                          :for j :below (length layer1) :do
                          (incf sum (* (aref weights i j)
                                       (aref layer1 j)))
                          :finally (return sum)))))
      (setf (aref layer2 i) (case activation-function
                              (:linear input)
                              (:logistic (logistic input))
                              (:exp (exp input))
                              (:tanh (tanh input))
                              (t (error-not-activation-function input)) ; <---
                            )))))
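
For illustration only -- your LOGISTIC definition wasn't posted, so
this is a guess at what it might look like with the inline declaration
in place (the standard logistic function 1/(1+e^-x)):

;; Sketch: declare LOGISTIC inline *before* its definition, so SBCL
;; can open-code the call in FEED-FORWARD instead of boxing the
;; DOUBLE-FLOAT return value at a full call boundary.
(declaim (inline logistic))
(defun logistic (x)
  (declare (type double-float x)
           (optimize (speed 3) (safety 0)))
  (/ 1d0 (+ 1d0 (exp (- x)))))

Note that the DECLAIM must be compiled before any caller, or the
callers will still use the out-of-line (consing) definition.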

-- 
Regards,
Alexey Dejneka

"Alas, the spheres of truth are less transparent than those of
illusion." -- L.E.J. Brouwer