The Capacity of the Hopfield Associative Memory

IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. IT-33, NO. 4, JULY 1987 | ROBERT J. McELIECE, FELLOW, IEEE, EDWARD C. POSNER, FELLOW, IEEE, EUGENE R. RODEMICH, AND SANTOSH S. VENKATESH, STUDENT MEMBER, IEEE
The paper by McEliece, Posner, Rodemich, and Venkatesh rigorously analyzes the capacity of the Hopfield associative memory, a neural network model for storing and retrieving information. The memory stores $n$-tuples of $\pm 1$ values, and each neuron updates its state to a hard-limited (sign) version of a linear combination of the states of the other neurons. The authors focus on the sum-of-outer-products construction, in which the connection matrix $T$ is the sum of the outer products of $m$ fundamental memories. They derive the maximum asymptotic value of $m$ for which most of the $m$ original memories can be recovered exactly, which is $n/(2 \log n)$ when no restrictions are placed on the fundamental memories. If every fundamental memory must be recoverable exactly, the maximum value drops to $n/(4 \log n)$. The paper also discusses extensions, such as quantizing the outer-product connection matrix, and relates this to the capacity of the quantized Gaussian channel. The authors give a heuristic derivation of the capacity together with rigorous proofs for certain cases, showing that the capacity is asymptotically $n/(2 \log n)$ or $n/(4 \log n)$ depending on the recovery requirement.
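To make the construction concrete, here is a minimal NumPy sketch of the sum-of-outer-products rule and hard-limited retrieval described above. It is not the paper's analysis: the parameter values `n` and `m`, the synchronous update schedule, and the noisy-probe experiment are illustrative assumptions chosen to sit comfortably inside the $n/(4 \log n)$ regime.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200   # number of neurons (length of each +/-1 memory vector)
m = 5     # number of fundamental memories; illustrative, below n / (4 log n) ~ 9

# m fundamental memories: random +/-1 n-vectors (rows of X)
X = rng.choice([-1, 1], size=(m, n))

# Sum-of-outer-products connection matrix T = sum_mu x^(mu) x^(mu)^T, zero diagonal
T = X.T @ X
np.fill_diagonal(T, 0)

def retrieve(probe, T, max_iters=50):
    """Repeatedly set each neuron to the sign of a linear combination of the
    other neurons (synchronous updates here, for brevity) until a fixed point."""
    x = probe.copy()
    for _ in range(max_iters):
        x_new = np.where(T @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# How many fundamental memories are exact fixed points of the dynamics?
fixed = sum(np.array_equal(retrieve(X[mu], T), X[mu]) for mu in range(m))
print(f"{fixed}/{m} fundamental memories recovered exactly")

# A probe with a few flipped components should converge back to memory 0.
noisy = X[0].copy()
flip = rng.choice(n, size=n // 20, replace=False)
noisy[flip] *= -1
print("noisy probe recovered memory 0:", np.array_equal(retrieve(noisy, T), X[0]))
```

With $m$ this small relative to $n/(4 \log n)$, all stored memories are fixed points with high probability; pushing $m$ toward $n/(2 \log n)$ would, per the paper's results, still let most (but not all) memories be recovered exactly.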