This article by Karl Friston explores the nature of evoked brain responses and the principles underlying their generation. It argues that the sensory brain evolved to infer the causes of changes in its sensory inputs, and that the statistical fundamentals of inference place constraints on any neuronal implementation. By recasting Helmholtz's ideas on perception in terms of modern statistical theory, the article develops a model of perceptual inference and learning that can explain a wide range of neurobiological phenomena.
The article highlights that both perceptual inference and learning can be resolved using the same principle: minimizing the brain's free energy, as defined in statistical physics. Under this principle, cortical responses can be seen as the brain's transient attempt to minimize the free energy induced by a stimulus, thereby encoding the most likely cause of that stimulus. Learning emerges from changes in synaptic efficacy that minimize free energy averaged over all stimuli encountered. The underlying scheme uses empirical Bayes and hierarchical models to construct prior expectations dynamically and contextually.
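To make the distinction between inference and learning concrete, the sketch below implements a toy two-level Gaussian model in which free energy reduces to a sum of precision-weighted prediction errors. The variable names (theta, v_prior, sigma_u, sigma_v) and the simple gradient-descent scheme are illustrative assumptions, not the article's own equations: perceptual inference corresponds to the fast loop that settles on the estimated cause v of the current stimulus, while learning corresponds to the slow update of the "synaptic" parameter theta across stimuli.

```python
import numpy as np

# A minimal sketch of free-energy minimisation under Gaussian assumptions
# (a toy two-level predictive-coding model; all names and values are
# illustrative, not taken from the article).

rng = np.random.default_rng(0)

theta_true = 2.0                      # generative parameter in the world
theta = 0.5                           # brain's initial estimate (synaptic efficacy)
sigma_u, sigma_v = 1.0, 1.0           # variances of sensory noise and the prior
v_prior = 1.0                         # prior expectation of the cause

def free_energy(u, v, theta):
    """Free energy = sum of precision-weighted prediction errors (+ constants)."""
    eps_u = u - theta * v             # sensory prediction error
    eps_v = v - v_prior               # prior prediction error
    return 0.5 * (eps_u**2 / sigma_u + eps_v**2 / sigma_v)

lr_v, lr_theta = 0.1, 0.01
for trial in range(200):
    # The world generates a stimulus u from a hidden cause v_true
    v_true = v_prior + rng.normal(scale=np.sqrt(sigma_v))
    u = theta_true * v_true + rng.normal(scale=np.sqrt(sigma_u))

    # Perceptual inference: fast descent on F with respect to the estimated cause v
    v = 0.0
    for _ in range(50):
        eps_u = u - theta * v
        eps_v = v - v_prior
        v += lr_v * (theta * eps_u / sigma_u - eps_v / sigma_v)   # -dF/dv

    # Learning: slow change in efficacy that lowers F averaged over stimuli
    theta += lr_theta * (u - theta * v) * v / sigma_u             # -dF/dtheta

print(f"learned theta ~ {theta:.2f} (true value {theta_true})")
```

Run over many trials, the slow update pulls theta toward the true generative parameter, illustrating how the same quantity (free energy) drives both the within-stimulus dynamics and the across-stimulus plasticity described in the article.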
The theoretical framework predicts hierarchical cortical architectures, reciprocal connections, and functional asymmetries between forward and backward connections. It also accounts for associative plasticity, spike-timing-dependent plasticity, classical and extra-classical receptive field effects, long-latency responses, and phenomena like repetition suppression, mismatch negativity (MMN), and the P300 in electroencephalography. The article concludes by discussing the implications of these findings for understanding cortical organization and responses, and the measurement of perceptual learning using the MMN.