NONLINEAR STATISTICAL MODELS

1982 | A. Ronald Gallant
This chapter, authored by A. Ronald Gallant, develops a unified asymptotic theory for nonlinear statistical models. The author begins by noting that many discussions in the nonlinear models literature follow a classical treatment of maximum likelihood estimation, with some technical issues arising from the need to condition on independent variables and to handle Pitman drift. The chapter unifies these approaches by treating the objective function of an optimization problem as if it were a log-likelihood, which allows the derivation of test statistics such as the Wald, likelihood ratio, and Rao's efficient score statistics.

The chapter introduces two main classes of estimators: least mean distance estimators and method of moments estimators. Least mean distance estimators, such as multivariate nonlinear least squares, minimize an objective function that depends on the sample and possibly on preliminary estimates of nuisance parameters. Method of moments estimators, such as two-stage nonlinear least squares, minimize a distance between moment equations and their empirical counterparts. Both classes of estimators are shown to be asymptotically normally distributed with a limiting variance-covariance matrix.

The chapter also addresses misspecification, where the true data generating model differs from the model used for inference, and gives conditions under which the asymptotic theory applies even when the model is misspecified. These conditions include the assumption that the data follow a multivariate implicit model, where the parameter space may be infinite-dimensional. Under these conditions the chapter derives a Uniform Strong Law of Large Numbers and a Central Limit Theorem, providing a rigorous foundation for asymptotic inference in nonlinear models.

Finally, the chapter discusses the construction of Pitman drift, which is needed to handle the case where the true model is not the same as the model used for estimation.
This is achieved by confining the independent variables to a compact set and expanding the true model in a polynomial series. The chapter concludes with a discussion of the compactness of the estimation space and how it can be relaxed without affecting the asymptotic results.
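The least mean distance idea summarized above can be illustrated with a toy example. The sketch below is not code from the chapter: the regression function y = θ₁(1 − exp(−θ₂x)) + e, the simulated data, and the mean-squared-residual objective are hypothetical choices, used only to show the shape of the optimization problem that defines such an estimator.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative least mean distance estimator: scalar nonlinear least squares.
# The model y = theta1 * (1 - exp(-theta2 * x)) + e and all data below are
# hypothetical, chosen only to show the structure of the estimator.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.1, 5.0, n)
theta_true = np.array([2.0, 0.7])
y = theta_true[0] * (1.0 - np.exp(-theta_true[1] * x)) + rng.normal(0.0, 0.1, n)

def s_n(theta):
    """Sample objective: mean squared residual, a least mean distance criterion."""
    resid = y - theta[0] * (1.0 - np.exp(-theta[1] * x))
    return np.mean(resid ** 2)

# The estimator is the minimizer of s_n over the parameter space.
result = minimize(s_n, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
theta_hat = result.x
print(theta_hat)  # should lie near theta_true for this sample size
```

The same pattern extends to the multivariate case and to objectives that depend on preliminary estimates of nuisance parameters; only the definition of `s_n` changes.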
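The method of moments class can be sketched in the same spirit. This is an illustrative toy, not the chapter's two-stage nonlinear least squares: the exponential regression, the instruments (1, x), and the identity weighting matrix are all hypothetical choices, but the estimator has the defining structure of the class — it minimizes a distance between moment equations and their empirical counterparts.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative method of moments estimator for a hypothetical model
# y = exp(theta * x) + e, with instruments (1, x) and identity weighting.
rng = np.random.default_rng(1)
n = 300
x = rng.uniform(0.0, 2.0, n)
theta_true = 0.5
y = np.exp(theta_true * x) + rng.normal(0.0, 0.1, n)

def m_n(theta):
    """Empirical moments: (1/n) * sum of instrument times residual."""
    resid = y - np.exp(theta * x)
    return np.array([resid.mean(), (x * resid).mean()])

def q_n(theta):
    # Squared Euclidean distance of the empirical moments from zero
    # (i.e., an identity weighting matrix).
    m = m_n(theta)
    return float(m @ m)

# The estimator drives the empirical moments as close to zero as possible.
result = minimize_scalar(q_n, bounds=(0.0, 1.0), method="bounded")
theta_hat = result.x
```

With two moment conditions and one parameter the system is overidentified, so the moments cannot generally be set exactly to zero; minimizing the distance `q_n` is what makes this a method of moments estimator in the sense used above.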