From: Jordan Bortz
Subject: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <230@titn.TITN>
Hello;
	I'm looking for some good texts and/or articles on neural networks,
in English, and preferably focusing on real life algorithms/implementations
rather than obscure mathematics.

	If you know of any, please let me know by mail, and I'll summarize
to the net, as this topic seems to be generating more interest.

	I know some articles were mentioned earlier, but what about others?

		Jordan

-- 
=============================================================================
Jordan Bortz	Higher Level Software 1085 Warfield Ave  Piedmont, CA   94611
(415) 268-8948	UUCP:	(decvax|ucbvax|ihnp4)!decwrl!sun!plx!titn!jordan
=============================================================================

From: William Calvin
Subject: Re: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <4191@well.UUCP>
Best book on neural networks is THE CRUSTACEAN STOMATOGASTRIC GANGLION by
Selverston and Moulins (Springer 1987).  If you mean neural-like networks,
try the Rumelhart et al PARALLEL DISTRIBUTED PROCESSING (MIT Press 1986).
	We brain researchers sure get tired of hearing neural-like networks
referred to as "neural networks", an established subject for 25 years since
the days of Limulus lateral inhibition.  Calling silicon networks "neural" is
about like the hype in the early days when every digital computer was 
called a "brain" by the media.
		William H. Calvin
		University of Washington NJ-15, Seattle WA 98195
			·······@well.uucp   ·······@uwalocke.bitnet
From: Frederick J Dickey
Subject: Re: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <1465@ssc-vax.UUCP>
In article <····@well.UUCP>, ·······@well.UUCP (William Calvin) writes:
> 	We brain researchers sure get tired of hearing neural-like networks
> referred to as "neural networks", an established subject for 25 years since
> the days of Limulus lateral inhibition.  

I think the above says that "biological" neural nets have been studied as a
formal discipline for 25 years and that this great ancestry gives biology
prior claim to the term "neural nets". Assuming that this is a correct
interpretation, let me make the following observation. In 1943, McCulloch
and Pitts published a paper entitled "A logical calculus of the ideas
immanent in neural nets". Minsky and Papert (Perceptrons) state that this
paper presents the "prototypes of the linear threshold functions". This paper
strikes me as clearly being in the "neural net-like" tradition. Now
1987-1943 = 44. Also note that 44 > 25. Therefore, it appears that the
"neural net-like" guys have prior claim to the term "neural net". :-).
From: Stephen Smoliar
Subject: Re: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <3807@venera.isi.edu>
In article <····@ssc-vax.UUCP> ······@ssc-vax.UUCP (Frederick J Dickey) writes:
>In article <····@well.UUCP>, ·······@well.UUCP (William Calvin) writes:
>> 	We brain researchers sure get tired of hearing neural-like networks
>> referred to as "neural networks", an established subject for 25 years since
>> the days of Limulus lateral inhibition.  
>
>I think the above says that "biological" neural nets have been studied as a
>formal discipline for 25 years and that this great ancestry gives biology
>prior claim to the term "neural nets". Assuming that this is a correct
>interpretation, let me make the following observation. In 1943, McCulloch
>and Pitts published a paper entitled "A logical calculus of the ideas
>immanent in neural nets". Minsky and Papert (Perceptrons) state that this
>paper presents the "prototypes of the linear threshold functions". This paper
>strikes me as clearly being in the "neural net-like" tradition. Now
>1987-1943 = 44. Also note that 44 > 25. Therefore, it appears that the
>"neural net-like" guys have prior claim to the term "neural net". :-).


Well . . . this is all rather silly.  The PUBLISHED title of the classic
paper by McCulloch and Pitts is "A Logical Calculus of the Ideas Immanent
in Nervous Activity."  They NEVER use "neural net" as a technical term
(or in any other capacity) in the paper.  They ARE, however, concerned
with a net model based on the interconnection of elements which they call
neurons--appealing to properties of neurons which were known at the time
they wrote the paper.  Personally, I think Calvin has a point.  Investigators
who are searching the literature will probably benefit from cues which
distinguish papers about actual physiological properties from those about
computational models of those properties.
From: Frederick J Dickey
Subject: Re: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <1475@ssc-vax.UUCP>
From postnews Wed Oct 21 07:49:49 1987
> >interpretation, let me make the following observation. In 1943, McCulloch
> >and Pitts published a paper entitled "A logical calculus of the ideas
> >immanent in neural nets". Minsky and Papert (Perceptrons) state that this
> 
> Well . . . this is all rather silly.  The PUBLISHED title of the classic
> paper by McCulloch and Pitts is "A Logical Calculus of the Ideas Immanent
> in Nervous Activity."  They NEVER use "neural net" as a technical term

Well, this is very interesting. When I read Calvin's original 
posting I was struck by the claim that neural nets had been studied for
25 years. This surely seemed too small a figure to me. To check this
out without initiating a major research project, I grabbed a copy of 
Minsky and Papert's "Perceptrons" which happened to be on my desk at the
time and opened to the bibliography. M&P give the title of the McCulloch and
Pitts paper as "A logical calculus of the ideas immanent in neural nets". 
I'm looking at it right now and that's what it says. Apparently, the citation
is wrong. Well, I stand corrected. 

I might comment by the way that regardless of the merits of Calvin's claim
that artificial neural nets ought to be named something else, I think the
effort is doomed to failure. The reason is that we seem to have become
an excessively marketing-oriented society. Labels are attached to things to
stimulate desired responses in "consumers," not to clarify critical
distinctions. The practical problem one faces in much of the industrial world
is attempting to gain support for one's "research." To do this, one presents
one's proposed program to a manager, i.e., one markets it. The noise level 
in this process is so high that nothing less than hype makes it through.
My experience with managers leads me to believe that they may have heard
of neural nets. If I tried to start a "neuroid" project, they would say
"Isn't that the same thing as a neural net?" I can guarantee you that they
aren't interested in the distinctions between artificial and biological nets.
How can an aerospace company make a profit from biological nets? In other
words, to start an artificial neural net project, I have to call it a neural
net, show how it applies to some product, how it adds value to the product
(neural nets will make the product more powerful than a locomotive, faster
than a speeding bullet, and able to leap over tall buildings at a single
bound), and how all this can be done by next year at a half man-year level
of effort.

If I lived in an ivory tower (a not unpleasant domicile), I'd say that
Calvin is right on. Out here in the cinder block towers, he's out to lunch.
To summarize, I'm sympathetic to his viewpoint, but my sympathy isn't going
to make much difference.
From: David E. Leasure
Subject: Re: Neural Networks - Pointers to good texts?
Date: 
Message-ID: <1636@sabbath.rutgers.edu>
 ·······@well.UUCP (William Calvin) writes:
>	We brain researchers sure get tired of hearing neural-like networks
>referred to as "neural networks", an established subject for 25 years since
>the days of Limulus lateral inhibition.  Calling silicon networks "neural" is
>about like the hype in the early days when every digital computer was 
>called a "brain" by the media.

Maybe we could all agree on a more faithful/less ingratiating term?
Maybe connectionist processing models or neuromorphic (after
Touretzky), or fine-grained parallel processing? (even Rumelhart's
Parallel Distributed Processing?)

David E. Leasure
Rutgers/AT&T