Information and the Accuracy Attainable in the Estimation of Statistical Parameters

1992 | C. Radhakrishna Rao
The chapter introduces methods of estimating statistical parameters, beginning with Markoff's method of least squares, which seeks the linear function of the observations that has minimum variance among unbiased linear estimates. Fisher's contributions are then highlighted: the criteria of consistency, efficiency, and sufficiency for estimating functions, and the advocacy of the method of maximum likelihood, which selects as estimate the value of the parameter that maximizes the probability density of the observations and often attains minimum variance even when the distribution is not normal. Aitken developed this line further by seeking functions that minimize the integral of the squared difference between the function and the parameter, subject to certain conditions.

The paper derives inequality relations connecting the elements of the Information Matrix with the variances and covariances of estimating functions, with attention to the distributions that admit estimation with minimum variance. The concept of distance between populations is also discussed, defined by a quadratic differential metric. The chapter concludes with a discussion of estimating parameters by minimizing variance, emphasizing the trade-off between unbiasedness and variance in selecting the best estimate.
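For reference, the central inequality in the single-parameter case and the quadratic differential metric can be written as follows; this is a standard statement of these results in the notation of a density phi(x; theta), not a verbatim quotation from the chapter.

% Information inequality (Cramer-Rao bound): for an unbiased estimate t of \theta,
\operatorname{Var}(t) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = E\!\left[\left(\frac{\partial \log \phi}{\partial \theta}\right)^{2}\right].

% Quadratic differential metric defining distance between populations,
% with g_{ij} the elements of the Information Matrix:
ds^{2} = \sum_{i}\sum_{j} g_{ij}\, d\theta_{i}\, d\theta_{j},
\qquad
g_{ij} = E\!\left[\frac{\partial \log \phi}{\partial \theta_{i}}\,
                  \frac{\partial \log \phi}{\partial \theta_{j}}\right].

In words: no unbiased estimate can have variance smaller than the reciprocal of the information, and the same information quantities serve as the coefficients of the metric that measures distance between nearby populations.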