1991 | John Hertz, Anders Krogh, Richard G. Palmer
This section provides an overview of *Introduction to the Theory of Neural Computation* by John Hertz, Anders Krogh, and Richard G. Palmer, published by Addison-Wesley as part of the Santa Fe Institute Studies in the Sciences of Complexity series. The book is organized into chapters covering the following aspects of neural computation:
1. **Inspiration from Neuroscience**: Discusses the historical context and key issues in the field.
2. **The Hopfield Model**: Introduces the model for associative memory, its statistical mechanics, and stochastic networks (a minimal code sketch follows this overview).
3. **Extensions of the Hopfield Model**: Explores variations, correlated patterns, continuous-valued units, hardware implementations, and temporal sequences.
4. **Optimization Problems**: Covers weighted matching, the traveling salesman problem, graph bipartitioning, and optimization in image processing.
5. **Simple Perceptrons**: Focuses on feed-forward networks, threshold units, perceptron learning rules, linear and nonlinear units, stochastic units, and network capacity (see the perceptron-rule sketch below).
6. **Multi-Layer Networks**: Discusses back-propagation, variations, examples, performance, generalization, and optimal network architectures (see the back-propagation sketch below).
7. **Recurrent Networks**: Covers Boltzmann machines, recurrent back-propagation, learning time sequences, and reinforcement learning.
8. **Unsupervised Hebbian Learning**: Explains unsupervised learning, linear units, principal component analysis, and self-organizing feature extraction (see the Oja's-rule sketch below).
9. **Unsupervised Competitive Learning**: Introduces simple competitive learning, examples, adaptive resonance theory, feature mapping, and hybrid learning schemes (see the winner-take-all sketch below).
10. **Formal Statistical Mechanics of Neural Networks**: Provides a theoretical framework using the Hopfield model and Gardner's theory of connections.
The book also includes appendices on statistical mechanics and a bibliography, along with subject and author indices.
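To make the Hopfield model of Chapter 2 concrete, here is a minimal sketch of its two ingredients: the Hebbian outer-product storage rule and asynchronous sign-threshold updates, which (for symmetric weights with zero diagonal) monotonically decrease the energy $E = -\tfrac{1}{2}\sum_{i \ne j} w_{ij} s_i s_j$. The network size, pattern count, and function names are illustrative assumptions, not the book's code.

```python
# A minimal Hopfield-network sketch (illustrative only; sizes and names are
# my own choices, not taken from the book).
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    """Hebbian outer-product rule: w_ij = (1/N) * sum_mu xi_i^mu xi_j^mu."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

def recall(w, state, steps=10):
    """Asynchronous updates: s_i <- sign(sum_j w_ij s_j)."""
    s = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

# Store two random +/-1 patterns and recover one from a corrupted cue.
patterns = rng.choice([-1, 1], size=(2, 100))
w = train_hopfield(patterns)
cue = patterns[0].copy()
cue[:15] *= -1                                      # flip 15 of 100 bits
print(np.array_equal(recall(w, cue), patterns[0]))  # usually True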
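The perceptron learning rule of Chapter 5 admits an equally short sketch: a single threshold unit whose weights are adjusted only on misclassified examples. The OR dataset, learning rate, and names below are my own illustrative choices.

```python
# A minimal perceptron-learning-rule sketch for one threshold unit.
import numpy as np

def perceptron_train(x, y, epochs=100, eta=0.1):
    """Classic rule: on a mistake, w <- w + eta * y * x (bias folded into w)."""
    x = np.hstack([x, np.ones((len(x), 1))])   # append constant bias input
    w = np.zeros(x.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(x, y):
            if yi * (w @ xi) <= 0:             # misclassified (or on boundary)
                w += eta * yi * xi
                errors += 1
        if errors == 0:                        # converged on separable data
            break
    return w

# Learn logical OR with targets in {-1, +1}.
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, 1])
w = perceptron_train(x, y)
print(np.sign(np.hstack([x, np.ones((4, 1))]) @ w))  # [-1.  1.  1.  1.]
```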
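Chapter 6's back-propagation can be illustrated with a one-hidden-layer sigmoid network trained by full-batch gradient descent on XOR. The layer sizes, learning rate, epoch count, and random seed are assumptions; with an unlucky initialization the network may settle in a poor local minimum, in which case a different seed helps.

```python
# A minimal back-propagation sketch: squared error, sigmoid units, XOR task.
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

w1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
w2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
eta = 0.5

for _ in range(10000):
    # Forward pass.
    h = sigmoid(x @ w1 + b1)
    y = sigmoid(h @ w2 + b2)
    # Backward pass: error deltas for sigmoid units under squared error.
    d2 = (y - t) * y * (1 - y)
    d1 = (d2 @ w2.T) * h * (1 - h)
    # Gradient-descent weight updates.
    w2 -= eta * h.T @ d2; b2 -= eta * d2.sum(axis=0)
    w1 -= eta * x.T @ d1; b1 -= eta * d1.sum(axis=0)

y = sigmoid(sigmoid(x @ w1 + b1) @ w2 + b2)
print(np.round(y.ravel(), 2))  # close to [0, 1, 1, 0]
```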
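For the unsupervised Hebbian learning of Chapter 8, the sketch below implements Oja's rule, a stabilized Hebbian update whose weight vector converges to the first principal component of zero-mean data. The synthetic dataset and learning rate are illustrative assumptions.

```python
# A minimal Oja's-rule sketch: a single linear unit extracting the first
# principal component of synthetic zero-mean data.
import numpy as np

rng = np.random.default_rng(1)

def oja(x, eta=0.005, epochs=50):
    """dw = eta * y * (x - y * w), with output y = w . x."""
    w = rng.normal(size=x.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for xi in x:
            y = w @ xi
            w += eta * y * (xi - y * w)  # Hebbian term minus decay term
    return w / np.linalg.norm(w)

# Zero-mean 2-D data stretched along the first axis.
x = rng.normal(size=(500, 2)) * np.array([2.0, 0.5])
w = oja(x)
# Compare with the leading eigenvector of the covariance matrix.
eigvals, eigvecs = np.linalg.eigh(np.cov(x.T))
print(abs(w @ eigvecs[:, -1]))  # close to 1.0
```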
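Finally, Chapter 9's simple competitive learning reduces to a winner-take-all update in which only the unit closest to each input moves its weight vector toward that input. The cluster centers, unit count, and learning rate below are illustrative, not taken from the book.

```python
# A minimal winner-take-all competitive-learning sketch on three clusters.
import numpy as np

rng = np.random.default_rng(2)

def competitive_learning(x, n_units=3, eta=0.1, epochs=20):
    """Only the winning unit learns: dw_winner = eta * (x - w_winner)."""
    w = x[rng.choice(len(x), n_units, replace=False)].copy()  # init from data
    for _ in range(epochs):
        for xi in x:
            winner = np.argmin(np.linalg.norm(w - xi, axis=1))
            w[winner] += eta * (xi - w[winner])
    return w

# Three well-separated Gaussian clusters in the plane.
centers = np.array([[0, 0], [5, 5], [0, 5]], dtype=float)
x = np.vstack([c + rng.normal(scale=0.3, size=(50, 2)) for c in centers])
rng.shuffle(x)
print(np.round(competitive_learning(x), 1))  # approximately the three centers
```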