Date: Mon, 28 May 2007 18:22:35 -0500
Reply-To: OR Stats <stats112@GMAIL.COM>
Sender: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From: OR Stats <stats112@GMAIL.COM>
Subject: Re: Seminal Paper(s) on Neural Networks
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
Hmmm, alrightie. Thank you. Perhaps another question to ask, then: is there
a good paper/book that reviews all the major concepts and debates of NN? It
would be extremely informative for those who wish to become more familiar
with its mathematics and applications. Best wishes, OrStats
On 5/28/07, firstname.lastname@example.org <email@example.com> wrote:
> On May 23, 10:39 pm, davidlcass...@MSN.COM (David L Cassell) wrote:
> > stats...@GMAIL.COM wrote:
> > >Dear All:
> > >For those who research/practice in Neural Networks, what is(are) the
> > >seminal
> > >paper(s) on the modelling?
> > >Cheers,
> > >ORstats
> > I would say that the *seminal* paper on neural networks is "A Logical
> > Calculus of the Ideas Immanent in Nervous Activity" by McCulloch and
> > Pitts.
> Yes, in regard to parts I and II of that paper. But part III does not
> appear to be quite right.
> A decade later, S. C. Kleene developed what turned out to be a better
> theory -- namely the idea of "regular expressions." (That same idea was
> also developed around the same time by Mason and Zimmerman.)
> > But that won't get you very far in terms of modern modeling of neural
> > networks. "Perceptrons", by Minsky and Papert, lies in between it and
> > current computational processes in neural networks. And "Perceptrons"
> > is almost the anti-seminal paper in the field.
> What's your basis for that remark? Do you believe the rumor that back-
> propagation can enable a loop-free multi-layer net to effectively
> recognize topological features of things? That rumor was spread by
> people who did not actually understand our book. Of course 'parity'
> can be recognized with logarithmic complexity, but connectedness still
> seems to require loop-free networks whose size grows exponentially.
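The "parity with logarithmic complexity" remark above can be illustrated with a small sketch (not from the thread, and `parity_tree` is a hypothetical helper, not anything from the book): a balanced XOR tree is a loop-free network whose depth grows only as O(log n) in the number of inputs.

```python
def parity_tree(bits):
    """Compute the parity of a list of bits with a balanced XOR tree.

    Each layer XORs adjacent pairs, halving the number of values, so a
    loop-free network needs only O(log n) layers -- the logarithmic
    complexity mentioned in the post. Returns (parity, depth).
    """
    layer = list(bits)
    depth = 0
    while len(layer) > 1:
        # XOR adjacent pairs; an unpaired last element passes through.
        nxt = [layer[i] ^ layer[i + 1] for i in range(0, len(layer) - 1, 2)]
        if len(layer) % 2:
            nxt.append(layer[-1])
        layer = nxt
        depth += 1
    return layer[0], depth

print(parity_tree([1, 0, 1, 1, 0, 1, 0, 1]))  # 8 inputs -> (1, 3)
```

Connectedness, by contrast, is the kind of global topological property that Minsky and Papert argued cannot be decomposed this cheaply by such loop-free networks.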