From: Peter Dudey
Subject: Free goodies coming up!
Date: 
Message-ID: <C53nFF.C6s@willamette.edu>
For my final project in my AI/Lisp class, I've written an implementation
of John Koza's Genetic Programming algorithm for designing neural
networks.  It's got nice output, and even some fairly spiffy graphics. 
(You may have to tweak those--I wrote 'em for an Oracle PC clone with SVGA.)
It's also generic.  I've gotten it to produce XOR-solving neural nets, and
I'm working on 3-bit parity.  (There are still a few bugs.)  It should be
able to handle any problem you'd use a neural net to solve, and there
are lots of parameters that are easy to tweak, twiddle and frob.
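
For the curious, the knobs will look something like this (a
Common-Lisp-style sketch; the names are made up for illustration, not
the actual variables from my code):

    ;; Hypothetical parameter defaults; illustrative only, not the
    ;; real variable names from my program.
    (setq *population-size*   200)    ; critters per generation
    (setq *max-generations*    50)    ; when to give up
    (setq *breeding-fraction* 0.33)   ; the rest get cloned
    (setq *parsimony-weight*  0.1)    ; size penalty (see findings below)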

Anyway, when I'm done, I'll write up some nice, clear documentation and
post the whole deal, XLISP code and all, to these newsgroups.  In the
meantime, I'll post a nice sample output here.

For those of you in GP land, here are some preliminary findings:

It seems to work best to have about 1/3 of each generation produced by
breeding, the rest by cloning.  The best and worst critters are always the
ones produced by breeding...
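
In code, building a new generation comes out to something like this
(again a rough Common-Lisp-style sketch; BREED, CLONE and SELECT-PARENT
stand in for whatever crossover, copying and selection you're using):

    ;; Sketch: next generation is roughly 1/3 bred, 2/3 cloned.
    (defun next-generation (population size breeding-fraction)
      (let ((n-bred (round (* breeding-fraction size)))
            (next nil))
        ;; The first N-BRED critters come from crossover of two parents...
        (dotimes (i n-bred)
          (push (breed (select-parent population)
                       (select-parent population))
                next))
        ;; ...and the rest are straight copies of selected parents.
        (dotimes (i (- size n-bred))
          (push (clone (select-parent population)) next))
        next))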

If you're doing some logical problem (like XOR) where a critter can get
half of the questions right just by guessing, subtract half of the
maximum possible fitness from every critter's score.  That way, the
completely clueless ones get wiped out in the first generation.
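
The adjustment itself is tiny; here's the idea as a sketch (made-up
names again):

    ;; Sketch: knock the random-guessing baseline out of the scores.
    ;; A critter answering at chance level (half the max) comes out
    ;; at 0 and draws no offspring.
    (defun adjusted-fitness (raw-score max-score)
      (max 0 (- raw-score (/ max-score 2))))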

Breeding for parsimony (counting huge critters as less fit) is a REALLY
NEAT trick.
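
The simplest version I know of just docks each critter a small penalty
per node; a sketch, with the weight as a made-up knob you'd tune by
hand:

    ;; Sketch: parsimony pressure.  Between two equally accurate
    ;; critters, the smaller one wins.
    (defun parsimonious-fitness (fitness critter-size weight)
      (- fitness (* weight critter-size)))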

Feel free to ask questions, and I'll have the goodies here in a few weeks.
This should be fairly easy to port to other Lisps, except for the graphics
part, and you can turn that off...
-- 
| Gummathron Palputrex XCIV, Lisp SubGuru and Salad/Crude/Massage Oil Baron
| Mad Scientists For A Better Tomorrow  "Yeah, whatever.  Lick the electrodes!"
| Please mail me plastic spaceships:  900 State St.  C-210,  Salem,  OR,  97301
| *** ENFORCE ANTI-TRUST LAWS *** TEACH SCHOOLCHILDREN CRITICAL TV WATCHING ***