The article presents empirical likelihood ratio confidence regions for vector-valued statistical functionals, offering a nonparametric analogue of Wilks' theorem and a multivariate extension of Owen's earlier work. It shows that empirical likelihood intervals for a one-dimensional mean are less affected by skewness than intervals based on the Student's t statistic. An effective method is introduced for computing empirical profile likelihoods for the mean of a vector random variable, reducing the problem to an unconstrained minimization of a convex function over a low-dimensional domain. Algorithms exist for finding the unique global minimum at a superlinear rate of convergence, and a noncombinatorial algorithm is provided for determining whether a point lies within the convex hull of a finite set of points.
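The convex minimization described above can be sketched as follows. This is an illustrative implementation, not the article's own code: the function name `el_log_ratio`, the damped Newton scheme with backtracking, and the use of NumPy are all choices made here. It computes -2 log R(mu) for a candidate mean mu by minimizing the dual objective f(lam) = -sum_i log(1 + lam'(X_i - mu)), a smooth convex function of the low-dimensional multiplier lam.

```python
import numpy as np

def el_log_ratio(X, mu, iters=100, tol=1e-9):
    """-2 log R(mu): the empirical log-likelihood ratio statistic for
    the mean of a vector sample X (rows = observations).

    Solves the dual problem -- minimize the smooth convex function
        f(lam) = -sum_i log(1 + lam'(X_i - mu))
    over lam in R^p -- by damped Newton iteration; the optimal weights
    are w_i = 1 / (n * (1 + lam'(X_i - mu))).  mu must lie inside the
    convex hull of the rows of X, otherwise R(mu) = 0.
    """
    Z = np.asarray(X, float) - mu          # centered data, shape (n, p)
    n, p = Z.shape
    lam = np.zeros(p)

    def f(l):
        w = 1.0 + Z @ l                    # log arguments must stay positive
        return -np.sum(np.log(w)) if np.all(w > 0) else np.inf

    for _ in range(iters):
        w = 1.0 + Z @ lam
        grad = -(Z / w[:, None]).sum(axis=0)
        if np.linalg.norm(grad) < tol:     # stationary point of convex f
            break
        hess = (Z / w[:, None]).T @ (Z / w[:, None])   # sum z z' / w^2
        step = np.linalg.solve(hess, -grad)
        t = 1.0                            # backtrack: stay feasible, decrease f
        while f(lam + t * step) >= f(lam) and t > 1e-12:
            t *= 0.5
        lam = lam + t * step
    return -2.0 * f(lam)                   # = 2 * sum_i log(1 + lam'(X_i - mu))
```

At mu equal to the sample mean the multiplier lam = 0 already solves the dual, so the statistic is zero; it grows as mu moves away from the sample mean.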
The multivariate empirical likelihood regions are justified for functions of several means, such as variances, correlations, and regression parameters, and for statistics with linear estimating equations. An algorithm is given for computing profile empirical likelihoods for these statistics.
The central result concerns the mean of X: the empirical log-likelihood ratio statistic converges in distribution to a chi-squared random variable as the sample size increases, so regions formed by thresholding it at a chi-squared quantile have asymptotically correct coverage. The article also compares empirical likelihood with the bootstrap and other methods, and provides examples of empirical likelihood inference for the standard deviation, the correlation coefficient, and regression coefficients. It concludes with a discussion of the theoretical and practical implications of the method, including its comparison with the t-test and its potential for better performance in the presence of skewness.
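For a scalar mean, the chi-squared calibration above yields a confidence interval by collecting those mu for which -2 log R(mu) falls below the chi-squared quantile. The sketch below is a simplification for illustration, not the article's algorithm: the helpers `el_stat` and `el_interval` and the grid scan are hypothetical, the scalar dual is solved by bisection rather than a superlinear method, and 3.84 is the familiar 0.95 quantile of chi-squared with one degree of freedom.

```python
import numpy as np

CHI2_1_95 = 3.84                # 0.95 quantile of chi-squared with 1 df

def el_stat(x, mu, bisect_iters=80):
    """-2 log R(mu) for the mean of a scalar sample.  mu must lie strictly
    inside the sample range (the one-dimensional convex hull condition)."""
    z = np.asarray(x, float) - mu
    if z.min() >= 0 or z.max() <= 0:
        return np.inf           # mu outside the data range: R(mu) = 0
    lo, hi = -1.0 / z.max(), -1.0 / z.min()   # feasible interval for lam
    for _ in range(bisect_iters):
        lam = 0.5 * (lo + hi)   # sum z/(1 + lam z) is decreasing in lam
        if np.sum(z / (1.0 + lam * z)) > 0:
            lo = lam
        else:
            hi = lam
    return 2.0 * np.sum(np.log(1.0 + lam * z))

def el_interval(x, grid=2000):
    """Approximate 95% empirical likelihood interval by scanning mu values."""
    mus = np.linspace(np.min(x), np.max(x), grid)[1:-1]
    keep = [m for m in mus if el_stat(x, m) <= CHI2_1_95]
    return keep[0], keep[-1]
```

On skewed data (e.g. exponential samples) such intervals are asymmetric about the sample mean, reflecting the skewness advantage over t intervals discussed above.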