KERNEL ESTIMATION OF REGRESSION FUNCTIONS

1978 | Theo Gasser, Hans-Georg Müller
Kernel estimation of regression functions, as introduced by Theo Gasser and Hans-Georg Müller, provides a new method for the nonparametric estimation of regression functions with a one-dimensional design parameter. This method is shown to be superior to the kernel estimate introduced by Priestley and Chao (1972). The results are not limited to positive kernels but apply to classes of kernels satisfying certain moment conditions. An asymptotically valid solution for the boundary problem in non-circular models is derived, allowing the calculation of the asymptotic integrated mean square error. As a special case, the method achieves the same convergence rates as splines. For two optimality criteria, minimum variance and minimum mean square error, higher-order kernels are explicitly tabulated.

Key words: nonparametric regression, kernel estimation, curve smoothing.

The nonparametric estimation of regression functions is an important data-analytic tool, particularly when data are collected over time or space. The first author's practical experience stems from the analysis of the Zurich longitudinal growth study and from EEG analysis. A parametric approach often rests on a clever guess rather than on prior knowledge in the field of application, and a single model may not suffice when the sample is heterogeneous. Bias problems can become qualitative, as in the study of somatic growth, where an inadequate model led to an evidently erroneous conclusion.
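To make the estimator concrete, here is a minimal sketch of Gasser–Müller-style kernel regression for a sorted one-dimensional design: the design interval is split at the midpoints between consecutive design points, and each observation is weighted by the kernel mass falling into its subinterval. The Epanechnikov kernel and all function names are illustrative choices, not part of the original paper's notation, and the sketch ignores the boundary correction the paper derives.

```python
import numpy as np

def epanechnikov_cdf(u):
    """CDF of the Epanechnikov kernel K(u) = 0.75 * (1 - u^2) on [-1, 1]."""
    u = np.clip(u, -1.0, 1.0)
    return 0.25 * (2.0 + 3.0 * u - u**3)

def gasser_mueller(t, x, y, h):
    """Kernel regression estimate at points t (Gasser-Mueller-type weights).

    x must be sorted and one-dimensional; h is the bandwidth.
    Each y_i receives the kernel mass over (s_{i-1}, s_i], where the
    s_i are the midpoints between consecutive design points.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Interval boundaries s_0 <= s_1 <= ... <= s_n around the design points.
    s = np.concatenate(([x[0]], 0.5 * (x[1:] + x[:-1]), [x[-1]]))
    t = np.atleast_1d(np.asarray(t, dtype=float))
    # weight_i(t) = integral of K((t - u)/h)/h over (s_{i-1}, s_i],
    # evaluated in closed form via the kernel CDF.
    upper = epanechnikov_cdf((t[:, None] - s[None, :-1]) / h)
    lower = epanechnikov_cdf((t[:, None] - s[None, 1:]) / h)
    return (upper - lower) @ y
```

Because the weights telescope to the total kernel mass, the estimate reproduces a constant exactly at interior points; with a symmetric second-order kernel it also has vanishing bias for linear trends, consistent with the moment conditions the abstract mentions.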