1991 | John Hertz, Anders Krogh, Richard G. Palmer
This is a lecture note volume on the theory of neural computation, authored by John Hertz, Anders Krogh, and Richard Palmer. It is part of the Santa Fe Institute Studies in the Sciences of Complexity series. The book provides an introduction to the theory of neural computation, covering the Hopfield model, optimization problems, simple perceptrons, multi-layer networks, recurrent networks, unsupervised learning, and the formal statistical mechanics of neural networks.

The book is structured into ten chapters. The first chapter introduces the field, drawing inspiration from neuroscience and outlining the main issues. The second chapter discusses the Hopfield model, an associative memory model, and its statistical mechanics. The third chapter explores extensions of the Hopfield model, including variations, correlated patterns, and continuous-valued units. The fourth chapter covers optimization problems, including the weighted matching problem, the traveling salesman problem, and graph bipartitioning. The fifth chapter discusses simple perceptrons, including feed-forward networks, threshold units, and the convergence of the perceptron learning rule.

The sixth chapter covers multi-layer networks, including back-propagation, its variations, and applications. The seventh chapter discusses recurrent networks, including Boltzmann machines and reinforcement learning. The eighth chapter covers unsupervised Hebbian learning, including principal component analysis and self-organizing feature extraction. The ninth chapter discusses unsupervised competitive learning, including adaptive resonance theory and feature mapping. The tenth chapter provides a formal statistical mechanics treatment of neural networks, including the Hopfield model and Gardner theory. The book closes with an appendix on statistical mechanics and a bibliography.
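As an illustration of the kind of model the book analyzes in its early chapters, here is a minimal sketch of a Hopfield-style associative memory with Hebbian outer-product storage and asynchronous threshold updates. It is written in Python with NumPy for concreteness; it is not code from the book, and the pattern values, network size, and step limit are arbitrary choices made for the example.

```python
import numpy as np

# Illustrative sketch only: a small Hopfield-style associative memory,
# not code taken from Hertz, Krogh, and Palmer's book.

def store(patterns):
    """Hebbian outer-product storage of +/-1 patterns (zero self-connections)."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0.0)
    return w / n

def recall(w, state, steps=10):
    """Asynchronous updates of threshold units for a fixed number of sweeps."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
w = store(patterns)
noisy = np.array([1, -1, 1, -1, 1, 1])   # corrupted copy of the first pattern
print(recall(w, noisy))                  # typically recovers [1, -1, 1, -1, 1, -1]
```

With few stored patterns relative to the number of units, the corrupted input usually settles into the nearest stored pattern, which is the associative-memory behavior the book's second chapter studies with statistical mechanics.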