Nonlinear Neural Networks: Principles, Mechanisms, and Architectures

1988 | STEPHEN GROSSBERG
Stephen Grossberg's article discusses the historical development of interdisciplinary studies in physics and psychobiology during the 19th century, which eventually led to the separation of these fields into distinct scientific disciplines in the 20th century. He emphasizes the nonlinear, nonlocal, and nonstationary nature of behavioral and brain data, and outlines three main sources of contemporary neural network research: binary, linear, and continuous-nonlinear models. The article explores various continuous-nonlinear models, including content-addressable memory models, and describes a Liapunov functional method for proving global limit or oscillation theorems for nonlinear competitive systems. It also discusses the properties of shunting competitive feedback networks, their role in pattern transformation and memory storage, and their connections to models of competitive learning and categorical perception. Grossberg compares adaptive resonance theory (ART) models with off-line models, highlighting the importance of stability and capacity in these models. The article also discusses the role of top-down expectations and attention in learning and information processing, and contrasts models regulated by internal or external signals. Examples from sensory-motor control and adaptive vector encoders illustrate these models. The article concludes with a discussion of the importance of interdisciplinary research in understanding the complexities of mind and brain.
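
As a rough illustration of the shunting competitive feedback dynamics the article reviews, the sketch below numerically integrates a small on-center off-surround shunting network. The parameter values, the input pattern, and the faster-than-linear signal function f(x) = x^2 are illustrative assumptions, not values taken from the paper; they are chosen only to show the contrast-enhancement and short-term memory storage behavior the summary mentions.

```python
import numpy as np

# Minimal sketch (illustrative parameters, not from the paper) of a shunting
# on-center off-surround competitive feedback network:
#   dx_i/dt = -A*x_i + (B - x_i)*(I_i + f(x_i)) - x_i * sum_{k != i} (I_k + f(x_k))
# With a faster-than-linear signal f, the network contrast-enhances the input
# pattern and can store the winning activity after the input is removed.

A, B = 1.0, 1.0                 # passive decay rate and activity ceiling
dt, T_on, T_total = 0.001, 2.0, 6.0

def f(x):
    return x ** 2               # faster-than-linear feedback signal (assumed form)

I = np.array([0.9, 1.0, 0.8, 0.5])   # input pattern, presented only until T_on
x = np.zeros_like(I)

for step in range(int(T_total / dt)):
    inp = I if step * dt < T_on else np.zeros_like(I)  # input switched off at T_on
    s = inp + f(x)                                     # excitatory signal to each cell
    total = s.sum()
    # on-center excitation gated by (B - x), off-surround inhibition gated by x
    dx = -A * x + (B - x) * s - x * (total - s)
    x = x + dt * dx             # simple Euler integration step

print(np.round(x, 3))  # the largest initial input tends to dominate the stored pattern
```

Under these assumptions, the activities remain bounded in [0, B] because excitation is multiplied by (B - x) and inhibition by x; after the input is withdrawn, the quadratic feedback signal quenches the smaller activities, which is one of the pattern-transformation and memory-storage properties the summary attributes to shunting competitive networks.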