This paper discusses the accuracy attainable in estimating statistical parameters. The method of least squares, introduced by Markoff, is the earliest method for estimating parameters: it seeks a linear function of the observations whose expectation is a linear function of the unknown parameters and whose variance is a minimum. Fisher made significant advances by introducing the concepts of consistency, efficiency, and sufficiency, and by advocating the method of maximum likelihood, which estimates parameters by maximizing the probability density of the observed sample regarded as a function of the parameters. The validity of this method rests on the fact that, among unbiased estimating functions, the one obtained by maximizing the probability density has the least variance. Aitken (1941) developed a method for estimating parameters by minimizing the variance of an estimating function; this is possible only for a class of distribution functions that admit sufficient statistics.

The paper aims to derive inequality relations between the elements of the information matrix and the variances and covariances of estimating functions. It characterizes a class of distribution functions that allow parameter estimation with the minimum possible variance, develops a concept of distance between populations by means of a quadratic differential metric, and discusses estimation by minimizing variance. The probability density function of a sample of n observations contains a parameter θ that is to be estimated by a function t = f(x₁, ..., xₙ); the best estimate is one that satisfies certain probability inequalities. The sketches below illustrate these ideas.
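To make the maximum likelihood idea concrete, here is a minimal sketch in Python, assuming an exponential model with unknown rate θ; the true rate, sample size, and search grid are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup for illustration: 500 draws from an exponential
# distribution with unknown rate theta (the true value 2.0 is used
# only to generate data).
theta_true = 2.0
x = rng.exponential(scale=1.0 / theta_true, size=500)

# Log-likelihood of the sample: n*log(theta) - theta * sum(x).
grid = np.linspace(0.01, 10.0, 100_000)          # candidate rates
loglik = x.size * np.log(grid) - grid * x.sum()  # vectorized over the grid

theta_hat = grid[np.argmax(loglik)]
print(f"grid MLE: {theta_hat:.4f}  closed form 1/mean(x): {1.0 / x.mean():.4f}")
```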
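The inequality between the information matrix and the variance of an estimating function, now known as the Cramér-Rao bound, can be checked by simulation. A minimal sketch, assuming a normal population with known variance σ², where the Fisher information per observation is 1/σ² and the sample mean attains the bound σ²/n:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: repeated samples of size n from N(theta, sigma^2),
# with sigma known; all values are illustrative.
theta, sigma, n, reps = 1.5, 2.0, 50, 20_000

# Fisher information per observation is 1 / sigma^2, so any unbiased
# estimator based on n observations has variance >= sigma^2 / n.
bound = sigma**2 / n

samples = rng.normal(theta, sigma, size=(reps, n))
t = samples.mean(axis=1)  # the sample mean is unbiased for theta

print(f"simulated variance of the sample mean: {t.var():.5f}")
print(f"information (Cramér-Rao) lower bound : {bound:.5f}")
```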
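The quadratic differential metric ds² = I(θ) dθ² reduces the distance between two populations of a one-parameter family to a line integral. A sketch under the assumption of a Poisson family (an illustrative choice, not an example quoted from the paper), where I(θ) = 1/θ and the distance has the closed form 2(√θ₂ − √θ₁):

```python
import numpy as np

# Assumed one-parameter family: Poisson with mean theta, whose Fisher
# information is I(theta) = 1 / theta. The distance between populations
# with means t1 and t2 is the integral of sqrt(I(theta)) from t1 to t2.
t1, t2 = 1.0, 4.0

thetas = np.linspace(t1, t2, 100_001)
speed = 1.0 / np.sqrt(thetas)  # sqrt(I(theta)) along the path

# Trapezoid rule for the line integral; closed form is 2*(sqrt(t2)-sqrt(t1)).
distance = np.sum(0.5 * (speed[:-1] + speed[1:]) * np.diff(thetas))
print(f"numeric: {distance:.4f}  closed form: {2 * (np.sqrt(t2) - np.sqrt(t1)):.4f}")
```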
If that condition is replaced by a less stringent one, it leads to the requirement that the expected value of (t − θ)² be less than or equal to that of (t′ − θ)² for any alternative estimate t′. The paper emphasizes that unbiasedness and minimum variance are necessary but not sufficient conditions for good estimation, and it concludes that while unbiasedness is important, a biased estimate with a smaller mean squared error may be preferable, as the sketch below illustrates.
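A small simulation, with assumed sample size and true variance, makes this concrete: under the criterion E[(t − θ)²], the variance estimator that divides the sum of squared deviations by n + 1 is biased yet beats the unbiased divisor n − 1:

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed setup: normal samples with true variance sigma^2 = 4 and n = 10.
sigma2, n, reps = 4.0, 10, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
ss = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

# Divisor n-1 gives the unbiased estimate; n+1 minimizes E[(t - sigma^2)^2]
# for a normal sample, at the cost of a downward bias.
for divisor, label in [(n - 1, "unbiased (n - 1)"), (n + 1, "biased (n + 1)")]:
    t = ss / divisor
    print(f"{label}: bias = {t.mean() - sigma2:+.4f}, "
          f"E[(t - sigma^2)^2] = {np.mean((t - sigma2) ** 2):.4f}")
```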