This paper by Hirotugu Akaike presents an extension of the maximum likelihood principle based on information theory. The author shows that the classical maximum likelihood principle can be viewed as an asymptotic method for achieving an optimal estimate under a general information-theoretic criterion. This extension provides a framework for addressing many practical problems in statistical model fitting.
The paper introduces an extended maximum likelihood principle in which the final estimate is chosen to maximize the expected log-likelihood. This is equivalent to minimizing the Kullback-Leibler divergence, an information-theoretic quantity that measures the discrepancy of a fitted distribution from the true distribution. This divergence serves as the loss function for parameter estimation.
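To make the equivalence concrete, the Kullback-Leibler divergence of a fitted density from the true density splits into a term that does not depend on the parameter and the negative expected log-likelihood. The sketch below uses generic symbols f, g, and θ rather than the paper's exact notation:

```latex
% KL divergence of the fitted density g(.;\theta) from the true density f
I(f;g) = \int f(x)\,\log\frac{f(x)}{g(x;\theta)}\,dx
       = \underbrace{\int f(x)\log f(x)\,dx}_{\text{constant in }\theta}
         \;-\; \underbrace{\int f(x)\log g(x;\theta)\,dx}_{\text{expected log-likelihood}}
% Minimizing I(f;g) over \theta is therefore the same as
% maximizing the expected log-likelihood E_f[\log g(X;\theta)].
```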
The paper argues that the log-likelihood, rather than the likelihood itself, is the natural quantity on which to base the maximum likelihood principle. The extended principle is then applied to decision-making among finite-parameter models, where maximum likelihood estimates are obtained under several different model restrictions. Log-likelihood ratio statistics are used to select the final estimate, revealing the statistical nature of the information-theoretic quantities involved.
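As an illustration of this selection step, the following sketch compares nested candidate models by their maximized log-likelihoods corrected for the number of fitted parameters, in the spirit of Akaike's criterion. The data, the polynomial candidates, and the helper names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gaussian_log_likelihood(residuals):
    """Maximized Gaussian log-likelihood given least-squares residuals."""
    n = len(residuals)
    sigma2 = np.mean(residuals ** 2)  # MLE of the error variance
    return -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)

def aic(log_likelihood, k):
    """Akaike's criterion: -2 log L + 2 * (number of fitted parameters)."""
    return -2 * log_likelihood + 2 * k

# Hypothetical data: a quadratic trend with noise.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0, 0.1, size=x.size)

# Nested candidates: polynomials of increasing degree.
for degree in range(5):
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    ll = gaussian_log_likelihood(residuals)
    k = degree + 2  # polynomial coefficients plus the error variance
    print(f"degree={degree}  logL={ll:8.2f}  AIC={aic(ll, k):8.2f}")
```

The penalty term makes the comparison fair across restrictions with different numbers of free parameters, which is what lets a single criterion replace a sequence of likelihood-ratio tests.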
The extended principle resolves various practical problems that have traditionally been treated as hypothesis-testing issues: determining the number of factors in factor analysis, the significant factors in an analysis of variance, the number of independent variables in multiple regression, and the order of an autoregressive model. Numerical examples show that decisions made under this approach can differ from those of conventional statistical tests, and the author suggests it may eventually replace many traditional statistical procedures.
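One of these applications, choosing the order of an autoregressive model, can be sketched directly: fit AR models of increasing order and select the order minimizing the criterion. The simulated series and the least-squares fitting routine below are illustrative assumptions, not the paper's numerical examples:

```python
import numpy as np

def fit_ar_least_squares(x, p):
    """Fit an AR(p) model by least squares; return coefficients and residuals."""
    # Design matrix of lagged values: row t holds x[t-1], ..., x[t-p].
    X = np.column_stack([x[p - j - 1 : len(x) - j - 1] for j in range(p)])
    y = x[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs, y - X @ coeffs

# Hypothetical series generated from a known AR(2) process.
rng = np.random.default_rng(1)
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

best_order, best_crit = None, np.inf
for p in range(1, 8):
    _, res = fit_ar_least_squares(x, p)
    m = len(res)
    sigma2 = np.mean(res ** 2)             # MLE of the innovation variance
    log_lik = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1)
    crit = -2 * log_lik + 2 * (p + 1)      # p coefficients plus the variance
    if crit < best_crit:
        best_order, best_crit = p, crit
    print(f"order={p}  AIC={crit:9.2f}")
print("selected order:", best_order)
```

On data like this, the criterion typically flags order 2, matching the generating process, whereas a chain of significance tests would require a choice of test level at every step.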