KERNEL ESTIMATION OF REGRESSION FUNCTIONS

Theo Gasser, Hans-Georg Müller
The paper introduces a new kernel estimate for nonparametric regression functions with a one-dimensional design parameter, which is shown to outperform the method proposed by Priestley and Chao (1972). The method applies to a broad class of kernels satisfying certain moment conditions, and it provides an asymptotically valid solution to the boundary problem in non-circular models; this in turn allows the asymptotic integrated mean square error to be derived. The paper also discusses rates of convergence for splines and tabulates higher-order kernels under two optimality criteria: minimum variance and minimum mean square error. The authors highlight the practical importance of nonparametric regression in data analysis, particularly in longitudinal studies and EEG analysis, where sample heterogeneity may make parametric models inadequate.
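
For illustration, below is a minimal Python sketch of a convolution-type kernel estimate in the spirit of the method summarized above, assuming the midpoint-cell weighting m(t) ≈ sum_i y_i * integral over [s_{i-1}, s_i] of (1/b) K((t - u)/b) du, with cell boundaries s_0 = 0, s_i = (t_i + t_{i+1})/2, s_n = 1. The Epanechnikov kernel, the function names, and the toy data are choices made here for demonstration, not details taken from the paper, and no boundary correction is applied: evaluation points are kept away from the endpoints, where the paper's boundary-adapted kernels would be needed.

    import numpy as np

    def epanechnikov_cdf(v):
        # CDF of the Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on [-1, 1].
        v = np.clip(v, -1.0, 1.0)
        return 0.75 * (v - v**3 / 3.0) + 0.5

    def kernel_regression(t_eval, t, y, bandwidth):
        # Convolution-type kernel estimate on a fixed design in [0, 1]
        # (a sketch; names and interface are hypothetical).
        # t: sorted design points; y: responses; t_eval: evaluation points.
        t = np.asarray(t, dtype=float)
        y = np.asarray(y, dtype=float)
        t_eval = np.atleast_1d(np.asarray(t_eval, dtype=float))
        # Cell boundaries s_0 = 0, s_i = (t_i + t_{i+1}) / 2, s_n = 1.
        s = np.concatenate(([0.0], (t[:-1] + t[1:]) / 2.0, [1.0]))
        est = np.empty_like(t_eval)
        for j, x in enumerate(t_eval):
            # Weight of observation i: integral of the scaled kernel
            # over its cell, computed exactly via the kernel CDF.
            w = (epanechnikov_cdf((x - s[:-1]) / bandwidth)
                 - epanechnikov_cdf((x - s[1:]) / bandwidth))
            est[j] = np.dot(w, y)
        return est

    # Toy usage: noisy sine on an equidistant design.
    rng = np.random.default_rng(0)
    n = 200
    t = (np.arange(1, n + 1) - 0.5) / n
    y = np.sin(2 * np.pi * t) + 0.2 * rng.standard_normal(n)
    grid = np.linspace(0.1, 0.9, 9)   # interior points only; no boundary kernel
    print(kernel_regression(grid, t, y, bandwidth=0.1))

In the interior (more than one bandwidth from either endpoint), the weights sum exactly to one; near the endpoints they do not, which is precisely the boundary effect the paper's modified kernels are designed to correct.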