From: Paulo J. Matos aka PDestroy
Subject: Can you make it faster?
Date: 
Message-ID: <3B1B560F.17B44496@netcabo.pt>
Hi,

I have a function that runs many, many times, and after profiling the
code I found that optimizing it would be a big win. However, I can't
see a way to do it.
The idea is to create a copy of a hash table:
(defun copy-hash (estado)
  (let ((copy (make-hash-table)))
    (maphash #'(lambda (key val)
		 (setf (gethash key copy) val))
	     estado)
    copy))

Any ideas?

Best regards
-- 
+------------------Paulo J. Matos aka PDestroy--------------------+
|  ICQ # 361853                  |            ········@netcabo.pt |
|  http://iascp.sourceforge.net  |  http://mega.ist.utl.pt/~pocm  |
|  "Fixed width font LIVEZ!"     |    "Portability came to rule!" |
+-----------------------------------------------------------------+

Heroin, cocaine, drunkography, opium and pederasty were sure vehicles to
ephemeral success. The freemasonry of vice buoyed all its members...
Gala's and my strength was that we always lived a healthy life in the
midst of all the physical and moral promiscuity, taking no part in it
without smoking, without taking dope, without sleeping around.
           - Dali, Salvador

From: Kent M Pitman
Subject: Re: Can you make it faster?
Date: 
Message-ID: <sfwg0dfcusn.fsf@world.std.com>
"Paulo J. Matos aka PDestroy" <········@netcabo.pt> writes:

> I have a function that is running many, many times and after profiling
> the code I found out that I'll have an extreme gain if I optimize it.
> Anyway, I can't see a way to do it.
> The ideia is to create a copy of a hash table:
> (defun copy-hash (estado)
>   (let ((copy (make-hash-table)))
>     (maphash #'(lambda (key val)
> 		 (setf (gethash key copy) val))
> 	     estado)
>     copy))
> 
> Any ideias?

This might help a little.  The part about size could, in some
implementations, be a big savings:

(defun copy-hash (orig)
  (let ((copy (make-hash-table
                ;; Specify the size up front so that the table is not
                ;; redundantly grown every time you add one or a few
                ;; elements.
                :size (hash-table-size orig))))
    (flet ((copy-key (key val)
             (setf (gethash key copy) val)))
      ;; Declare the local function dynamic-extent so it doesn't cons
      ;; a closure in the heap.
      (declare (dynamic-extent #'copy-key))
      (maphash #'copy-key orig)
      copy)))
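
One further thought (an untested sketch): if the original tables use a
non-default test such as EQUAL, the copy should preserve it too, along
with the rehash parameters.  All of the reader functions used below are
standard Common Lisp, so this should be portable:

```lisp
(defun copy-hash-table (orig)
  ;; Carry over the original's test and rehash parameters as well as
  ;; its size, so the copy behaves like the original for non-EQL keys
  ;; and grows the same way.
  (let ((copy (make-hash-table
                :test (hash-table-test orig)
                :size (hash-table-size orig)
                :rehash-size (hash-table-rehash-size orig)
                :rehash-threshold (hash-table-rehash-threshold orig))))
    (maphash (lambda (key val)
               (setf (gethash key copy) val))
             orig)
    copy))
```
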
From: Pierre R. Mai
Subject: Re: Can you make it faster?
Date: 
Message-ID: <877kyr3w1d.fsf@orion.bln.pmsf.de>
"Paulo J. Matos aka PDestroy" <········@netcabo.pt> writes:

> I have a function that is running many, many times and after profiling
> the code I found out that I'll have an extreme gain if I optimize it.
> Anyway, I can't see a way to do it.
> The ideia is to create a copy of a hash table:
> (defun copy-hash (estado)
>   (let ((copy (make-hash-table)))
>     (maphash #'(lambda (key val)
> 		 (setf (gethash key copy) val))
> 	     estado)
>     copy))

Well, if copy-hash really is a hot-spot in your program, then it might
be the case that micro-optimizing it is not going to be that much of a
win:  Making a copy of a hash-table is going to take some time,
however much you tune it.  The much larger gain would be to not copy
hash-tables so often in the first place.  Why do you need to copy them
that often?  Maybe using association-lists which share tails and are
non-destructively modified will be much faster?

Something like:

(defstruct estado
  (contents nil))

(defun get-estado (key estado &optional default)
  (let ((entry (assoc key (estado-contents estado))))
    (if entry
        (cdr entry)
        default)))

(defun (setf get-estado) (value key estado &optional default)
  (declare (ignore default))
  (push (cons key value) (estado-contents estado))
  value)

Now copying an estado (state?) via the defstruct-generated COPY-ESTADO
is an O(1) operation, and it consumes less memory.
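
To illustrate (an untested sketch, repeating the definitions above so it
can be tried on its own): the original and the copy share the old alist
tail, and a SETF on the copy only shadows the entry in the copy:

```lisp
(defstruct estado
  (contents nil))

(defun get-estado (key estado &optional default)
  (let ((entry (assoc key (estado-contents estado))))
    (if entry (cdr entry) default)))

(defun (setf get-estado) (value key estado &optional default)
  (declare (ignore default))
  ;; Non-destructive update: push a fresh cons onto the copy's own
  ;; contents slot; any shared tail is left untouched.
  (push (cons key value) (estado-contents estado))
  value)

(let ((s (make-estado)))
  (setf (get-estado :x s) 1)
  (let ((c (copy-estado s)))          ; O(1): copies one struct slot
    (setf (get-estado :x c) 2)        ; shadows :x in the copy only
    (values (get-estado :x s)         ; => 1, original unchanged
            (get-estado :x c))))      ; => 2
```
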

Whether this approach will be an overall win depends on the number of
entries (ASSOC lookup is O(n) in the entry count, where GETHASH is
roughly O(1)), the number of accesses/modifications on estado, the
number of copies, the number of modifications in copies, etc.

There are probably other approaches, depending on the algorithms you
are using.

Regs, Pierre.

-- 
Pierre R. Mai <····@acm.org>                    http://www.pmsf.de/pmai/
 The most likely way for the world to be destroyed, most experts agree,
 is by accident. That's where we come in; we're computer professionals.
 We cause accidents.                           -- Nathaniel Borenstein