The paper by Herbert Robbins presents an empirical Bayes approach to statistical estimation, focusing on the estimation of an unknown parameter \(\Lambda\) based on observed data \(X\). The author assumes that \(X\) is a discrete random variable with a probability distribution that depends on \(\Lambda\), which itself is a random variable with a prior distribution \(G(\lambda)\). The unconditional probability distribution of \(X\) is given by \(p_{G}(x)\), and the expected squared deviation of an estimator \(\varphi(X)\) of \(\Lambda\) is minimized when \(\varphi(x)\) is defined as the value that minimizes the integral \(I(x)\).
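The minimization behind this Bayes estimator can be spelled out with the standard squared-error argument (a reconstruction in the notation above, writing \(p_G\) for the marginal of \(X\)):

\[
I(x) = \int \bigl(\lambda - \varphi(x)\bigr)^2 \, p_\lambda(x)\, dG(\lambda),
\]

and differentiating with respect to \(\varphi(x)\) and setting the derivative to zero yields

\[
\varphi_G(x) = \frac{\int \lambda\, p_\lambda(x)\, dG(\lambda)}{\int p_\lambda(x)\, dG(\lambda)}
= \frac{\int \lambda\, p_\lambda(x)\, dG(\lambda)}{p_G(x)},
\qquad
p_G(x) = \int p_\lambda(x)\, dG(\lambda).
\]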
Robbins discusses the challenges of computing the Bayes estimator \(\varphi_{G}(X)\) when the prior distribution \(G\) is unknown. He proposes using the empirical distribution function \(G_{n-1}(\lambda)\) of the previously observed values \(\Lambda_1, \ldots, \Lambda_{n-1}\) to estimate \(\Lambda_n\). The empirical Bayes estimator \(\psi_n(X_n)\) is defined in the same way as \(\varphi_{G}(x)\), but with the empirical distribution function \(G_{n-1}(\lambda)\) in place of \(G\).
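A minimal sketch of this plug-in idea, assuming a Poisson kernel \(p_\lambda(x)\) for concreteness (the kernel choice and function names here are illustrative, not taken from the paper). Substituting the empirical distribution \(G_{n-1}\) into the Bayes formula turns the integrals into sums over the past \(\lambda_i\):

```python
import math

def poisson_pmf(lam, x):
    """Kernel p_lambda(x) = exp(-lam) * lam^x / x! (Poisson example)."""
    return math.exp(-lam) * lam**x / math.factorial(x)

def psi_n(x, past_lambdas, kernel=poisson_pmf):
    """Empirical Bayes estimate of Lambda_n given X_n = x.

    Plugs the empirical distribution of past_lambdas = [lambda_1, ..., lambda_{n-1}]
    into the Bayes formula: integrals over dG become averages over past values.
    """
    num = sum(lam * kernel(lam, x) for lam in past_lambdas)
    den = sum(kernel(lam, x) for lam in past_lambdas)
    return num / den
```

If all past \(\lambda_i\) are equal, the estimate returns that common value for every \(x\), as the Bayes formula requires for a degenerate prior.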
The paper also explores the problem of approximating the unknown distribution function \(G\) from the empirical distribution function \(F_n(x)\) of the observed \(X\)'s. It suggests using a smoothing technique to approximate the Bayes function \(\varphi_G(x)\) and discusses the general problem of approximating other functionals of \(G\), such as \(G\) itself. The author provides several examples, including the Poisson, geometric, binomial, and Laplacian kernels, to illustrate the application of the empirical Bayes approach.
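The Poisson case illustrates why the Bayes function can be approximated from the \(X\)'s alone: substituting \(p_\lambda(x)=e^{-\lambda}\lambda^x/x!\) into the Bayes formula gives \(\varphi_G(x)=(x+1)\,p_G(x+1)/p_G(x)\), and the marginal probabilities \(p_G\) can be replaced by empirical frequencies. A sketch under that substitution (the function name is ours):

```python
from collections import Counter

def poisson_eb_estimate(x, observed_xs):
    """Empirical Bayes estimate for the Poisson kernel:

    phi(x) ~ (x + 1) * #{i : X_i = x + 1} / #{i : X_i = x},

    i.e. the Bayes formula with the marginal p_G replaced by
    empirical frequencies of the observed X's.
    """
    counts = Counter(observed_xs)
    if counts[x] == 0:
        raise ValueError("no observations equal to x")
    return (x + 1) * counts[x + 1] / counts[x]
```

Note that no \(\lambda\)'s are needed here, only the observed \(X\)'s, which is what makes the approach usable when the prior \(G\) is never observed directly.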
Finally, Robbins emphasizes the importance of this approach in practical situations where the prior distribution is unknown and the traditional methods of maximum likelihood or minimum variance unbiased estimation may not be feasible. He concludes by highlighting the need for further research to develop satisfactory solutions to the problem of approximating the unknown distribution function \(G\) from empirical data.