This paper by Hirotugu Akaike of the Institute of Statistical Mathematics explores an extension of the classical maximum likelihood principle. The author proposes that the maximum likelihood principle can be viewed as an asymptotic realization of an optimum estimate with respect to a very general information-theoretic criterion. This extension aims to provide solutions to practical problems of statistical model fitting.
The key idea is to maximize the expected log-likelihood, which is equivalent to minimizing the Kullback-Leibler divergence between the true probability distribution and the fitted one. This criterion is argued to be a more natural and reasonable basis for a unified asymptotic theory of estimation. The paper discusses the definition of information and highlights the advantages of working with the log-likelihood, rather than the likelihood itself, in the maximum likelihood principle.
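To make the equivalence explicit, write g for the true density and f(· | θ) for the model (this notation is assumed here for illustration, not taken from the summary above). The standard identity, stated for continuous distributions, is

\[
D_{\mathrm{KL}}\bigl(g \,\|\, f(\cdot \mid \theta)\bigr)
  = \int g(x)\,\log g(x)\,dx \;-\; \int g(x)\,\log f(x \mid \theta)\,dx
  = \mathrm{const} - \mathbb{E}_{g}\bigl[\log f(X \mid \theta)\bigr].
\]

The first term does not involve θ, so the parameter (or model) that maximizes the expected log-likelihood is exactly the one that minimizes the divergence from the true distribution; maximum likelihood realizes this asymptotically because the sample mean of the log-likelihood converges to its expectation.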
The extended maximum likelihood principle is particularly useful for deciding on a final model when several alternative finite-parameter models, each fitted by maximum likelihood, are under consideration. It applies to a range of practical problems, such as determining the number of factors in factor analysis, identifying significant factors in ANOVA, choosing the number of independent variables in multiple regression, and selecting the order of an autoregressive model in time series analysis (an order-selection sketch is given below). Numerical examples illustrate the effectiveness of the new approach compared with conventional statistical testing procedures.
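The criterion that grew out of this principle is the now-familiar AIC, of the form −2 log(maximized likelihood) + 2 (number of estimated parameters). The Python code below is a minimal, self-contained sketch of that decision rule applied to autoregressive order selection; it is not the paper's own numerical example. The simulated AR(2) series, the helper name ar_aic, and the candidate order range are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t  (illustrative data)
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

def ar_aic(x, p, max_p):
    """Fit an AR(p) model by least squares and return an AIC-type score.

    Every candidate order is scored on the same effective sample
    (the first max_p values are dropped) so the criteria are comparable.
    """
    y = x[max_p:]
    if p == 0:
        resid = y - y.mean()
        k = 1                                   # intercept only
    else:
        X = np.column_stack([x[max_p - i:-i] for i in range(1, p + 1)])
        X = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        k = p + 1                               # intercept + p AR coefficients
    sigma2 = np.mean(resid ** 2)
    # Gaussian log-likelihood at the ML variance estimate
    loglik = -0.5 * len(y) * (np.log(2 * np.pi * sigma2) + 1)
    return -2 * loglik + 2 * (k + 1)            # +1 for the noise variance

max_p = 8
aics = {p: ar_aic(x, p, max_p) for p in range(max_p + 1)}
best = min(aics, key=aics.get)
print("AIC by order:", {p: round(v, 1) for p, v in aics.items()})
print("selected order:", best)

The point of the sketch is the decision rule itself: instead of a sequence of significance tests on successive coefficients, a single information criterion is minimized over all candidate orders, which is how the extended principle replaces conventional testing in the applications listed above.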