26 May 2018 | Douglas Bates, Reinhold Kliegl, Shravan Vasishth, R. Harald Baayen
The article discusses challenges in fitting mixed-effects models to data from psychological and linguistic experiments and offers recommendations. It highlights the importance of selecting an appropriate random-effects structure to avoid overparameterization, which can lead to non-convergence and uninterpretable models. The authors propose an iterative reduction method for simplifying models: principal component analysis (PCA) to determine the dimensionality of the random-effects structure, constraining correlation parameters to zero initially, and removing non-significant variance components. They demonstrate this method with two examples: a psycholinguistic experiment on pragmatic comprehension and a visual-attention experiment. The article also compares frequentist and Bayesian approaches, showing that both yield similar results. The authors conclude that maximal models are not necessary to protect against anti-conservative conclusions and that parsimony is a virtue in statistical modeling.
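To make the reduction steps concrete, here is a minimal sketch in R with lme4, the toolchain the authors work in. The data frame `dat`, response `rt`, covariate `cond`, and grouping factors `subj` and `item` are hypothetical placeholders; `rePCA()` is the random-effects PCA utility associated with this paper (originally in the authors' RePsychLing package and available in recent lme4 releases), and the double-bar `||` formula syntax suppresses correlation parameters.

```r
# A minimal sketch of the iterative reduction workflow using lme4.
# `dat`, `rt`, `cond`, `subj`, and `item` are hypothetical placeholders;
# `cond` is assumed to be a centered numeric covariate.
library(lme4)

# 1. Fit the maximal model justified by the design.
m_max <- lmer(rt ~ cond + (1 + cond | subj) + (1 + cond | item), data = dat)

# 2. PCA of the random-effects covariance matrices: components with
#    near-zero variance signal an overparameterized structure.
summary(rePCA(m_max))

# 3. Refit with correlation parameters constrained to zero
#    (the double-bar syntax splits terms for numeric covariates).
m_zcp <- lmer(rt ~ cond + (1 + cond || subj) + (1 + cond || item), data = dat)

# 4. Drop variance components that do not contribute, testing the
#    reduction with a likelihood-ratio test on nested models.
m_red <- lmer(rt ~ cond + (1 | subj) + (1 + cond || item), data = dat)
anova(m_red, m_zcp)
```

If the likelihood-ratio test is non-significant, the reduced model is retained, and the process repeats until only variance components supported by the data remain.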