The computational foundations of dynamic coding in working memory

July 2024 | Jake P. Stroud, John Duncan, and Máté Lengyel
Working memory (WM) involves maintaining and manipulating information during delay periods. The prefrontal cortex (PFC) plays a key role in WM, and recent evidence shows that neural activity in PFC varies strongly over time during WM maintenance, a phenomenon called dynamic coding. This dynamic coding is not an epiphenomenon but a fundamental computational feature: neural networks optimized for WM tasks also exhibit it. Two aspects of a network determine whether it exhibits dynamic coding: its connectivity and the inputs it receives. Dynamic coding thus results from an optimality principle that enhances task performance across a broad range of task constraints and network architectures.

Dynamic coding has been observed in PFC neurons during both simple and complex WM tasks. Neural recordings show that different sets of neurons are active during different task periods, and that activity can be strongly dynamic even during the delay period. Population-level analyses sharpen this picture, revealing low correlation between cue-period and late-delay selectivities but high positive correlation between mid- and late-delay selectivities. Cross-temporal decoding, in which a decoder trained on activity from one time point is tested on activity from another, distinguishes the two regimes: dynamic coding yields low cross-temporal decoding performance, whereas stable coding yields high performance.

Classical models of WM rely on attractor network dynamics to maintain stimuli during delay periods, which produces stable coding. Task-optimized networks, in contrast, exhibit dynamic coding because their non-normal connectivity, together with optimally aligned inputs, exploits effective feedforward connections. These networks show sequential activity during the cue and early delay periods, followed by more persistent activity late in the delay. Dynamic coding is more prevalent in complex tasks and scales with task complexity.
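The logic of cross-temporal decoding can be illustrated with a minimal synthetic sketch (not from the paper itself; the data, the rotating coding axis, and the difference-of-means decoder are all illustrative assumptions). A decoder is trained on population activity at one time point and tested at every other: when the coding axis rotates over time (dynamic coding), only the diagonal of the accuracy matrix is high; when the axis is fixed (stable coding), the whole matrix is high.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_trials, n_time = 50, 200, 6

def simulate(dynamic):
    # Coding axis either changes every time bin (dynamic) or stays fixed (stable).
    base = rng.standard_normal(n_neurons)
    axes = [rng.standard_normal(n_neurons) if dynamic else base
            for _ in range(n_time)]
    axes = np.array([a / np.linalg.norm(a) for a in axes])
    labels = rng.integers(0, 2, n_trials) * 2 - 1  # cue identity, +/-1
    # data[t, trial, neuron] = label * coding axis at time t + noise
    data = (labels[None, :, None] * axes[:, None, :]
            + 0.5 * rng.standard_normal((n_time, n_trials, n_neurons)))
    return data, labels

def cross_temporal_accuracy(data, labels):
    acc = np.zeros((n_time, n_time))
    for t_train in range(n_time):
        # Difference-of-means linear decoder fit at t_train
        w = (data[t_train][labels == 1].mean(0)
             - data[t_train][labels == -1].mean(0))
        for t_test in range(n_time):
            pred = np.sign(data[t_test] @ w)
            acc[t_train, t_test] = (pred == labels).mean()
    return acc

dyn_data, dyn_labels = simulate(dynamic=True)
acc_dyn = cross_temporal_accuracy(dyn_data, dyn_labels)

stable_data, stable_labels = simulate(dynamic=False)
acc_stable = cross_temporal_accuracy(stable_data, stable_labels)
```

For brevity the decoder is evaluated on its own training trials; with a rotating axis, `acc_dyn` is high on its diagonal and near chance off it, while `acc_stable` is high everywhere.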
The generality of dynamic coding is supported by the fact that many neural circuits, real or artificial, contain effective feedforward connections. The degree to which a network exhibits dynamic coding depends on the functional objective for which it is optimized: networks optimized for long and variable retention intervals develop attractor dynamics and thus more stable coding, whereas those optimized for short retention intervals can maintain information through purely dynamic coding. In sensory processing, where immediate decoding is required, dynamic coding is less prevalent.

In conclusion, WM is a cognitive function implemented by the concerted dynamics of large neural populations. The study of other cognitive functions, such as motor control, has already benefited from such a dynamical systems perspective. Dynamic coding results from an optimality principle that enhances task performance, and its strength is shaped by task complexity and network architecture.
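The contrast between attractor dynamics (stable coding) and effective feedforward connectivity (dynamic coding) can be sketched with two hand-built toy networks (illustrative assumptions only; the paper's task-optimized networks are trained recurrent networks, not these fixed matrices). A symmetric Hopfield-style weight matrix pulls activity onto a fixed stored pattern, whereas a non-normal feedforward chain hands activity from neuron to neuron, so the active subpopulation changes over time.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Attractor model: symmetric (normal) connectivity, stable coding ---
n = 100
pattern = rng.choice([-1.0, 1.0], size=n)        # the stored memory
W_attr = np.outer(pattern, pattern) / n          # Hopfield-style weights
np.fill_diagonal(W_attr, 0.0)
x = np.where(rng.random(n) < 0.9, pattern, -pattern)  # noisy cue, ~10% flipped
for _ in range(10):                              # delay-period dynamics
    x = np.sign(W_attr @ x)
# x settles onto `pattern` and stays there: the same neurons code the
# memory throughout the delay.

# --- Effective feedforward chain: non-normal connectivity, dynamic coding ---
m = 5
W_ff = np.diag(np.full(m - 1, 0.9), k=-1)        # neuron i drives neuron i+1
y = np.zeros(m)
y[0] = 1.0                                       # cue excites the first neuron
trace = [y.copy()]
for _ in range(m - 1):
    y = W_ff @ y
    trace.append(y.copy())
trace = np.array(trace)
# Activity is handed along the chain: the most active neuron at step t is
# neuron t, so the population pattern carrying the memory changes over time.
```

In the attractor case a decoder trained at any time generalizes to any other time; in the chain, the identity of the active neuron itself marks elapsed time, the signature picked up by cross-temporal decoding.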