Smoothing Noisy Data with Spline Functions

1975 | Grace Wahba
This paper discusses the optimal selection of the smoothing parameter when a periodic spline of degree $2m-1$ is used to reconstruct a smooth periodic curve from noisy data. The noise is assumed to be white, and the true curve $g$ is assumed to lie in the Sobolev space $W_2^{(m)}$ of periodic functions with absolutely continuous derivatives up to order $m-1$ and square-integrable $m$-th derivative. The goal is to minimize the expected squared error, averaged over the data points.

The paper introduces two functionals: $J(f)$, which measures the smoothness of the function $f$, and $R(f)$, which measures the fit of $f$ to the data. Minimizing $R(f) + \lambda J(f)$ is closely related to minimizing $J(f)$ subject to a constraint on $R(f)$. The parameter $\lambda$ is convenient for computing the solution, while $S$, which is related to the residual sum of squares, is more intuitive to work with. For errors that are random variables with zero mean and constant variance $\sigma^2$, it is suggested that $S$ be chosen smaller than $\sigma^2$, with a "fudge factor" that depends on $n$, $\sigma^2$, and a higher derivative of the true function $g$.

The analysis focuses on periodic smoothing splines, with periodic boundary conditions imposed on the solution $g_{n,\lambda}$. In this setting the matrices involved are circulant, so their eigenvectors and eigenvalues can be computed explicitly. The main result is that the optimal smoothing parameter is given by $\lambda^* = \left\{a_{m}(\sigma^{2}/\|g^{(2m)}\|^{2})^{2m/(4m+1)}\right\}[1+o(1)]$, where $o(1) \rightarrow 0$ as $n \rightarrow \infty$.
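To illustrate the circulant structure (this is a sketch, not code from the paper): on a uniform grid, a periodic penalized problem of the form $R(f) + \lambda J(f)$ is diagonalized by the discrete Fourier transform, so the minimizer reduces to frequency-wise shrinkage of the data. The function name, the choice of the trigonometric penalty eigenvalues $(2\pi k)^{2m}$, and the reduction to the DFT domain are assumptions of this sketch.

```python
import numpy as np

def periodic_spline_smooth(y, lam, m=2):
    """Smooth noisy samples of a periodic function on a uniform grid
    by minimizing a discrete analogue of R(f) + lam * J(f), where J
    penalizes the m-th derivative. Periodicity makes the smoother
    matrix circulant, hence diagonal in the DFT basis."""
    n = len(y)
    yhat = np.fft.rfft(y)
    # Angular frequencies of the trigonometric basis on [0, 1).
    omega = 2 * np.pi * np.arange(len(yhat))
    # Eigenvalues of the m-th derivative penalty: omega^(2m).
    # Each Fourier coefficient is shrunk toward zero accordingly.
    shrink = 1.0 / (1.0 + lam * omega ** (2 * m))
    return np.fft.irfft(yhat * shrink, n=n)
```

Low frequencies (where the smooth curve lives) pass nearly unchanged, while high frequencies (mostly noise) are suppressed; the parameter `lam` trades fidelity against smoothness exactly as $\lambda$ does in the text.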
The expected squared error $ ET(\lambda) $ is minimized by this value of $ \lambda $, and the optimal choice of $ S $ satisfies $ E S^{*} \leqq \sigma^2 \{1 - k[1 + o(1)]\} $.
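A small Monte Carlo sketch (illustrative, not from the paper) of the relationship between $ET(\lambda)$ and $S$: estimate both over a grid of $\lambda$ values and check that the averaged residual sum of squares at the $T$-minimizing $\lambda$ falls below $\sigma^2$, consistent with choosing $S$ smaller than $\sigma^2$. The test function, grid, and all numerical settings are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, sigma, reps = 256, 2, 0.3, 50
t = np.arange(n) / n
g = np.sin(2 * np.pi * t) + 0.5 * np.cos(6 * np.pi * t)  # assumed test curve

def smooth(y, lam):
    # Periodic smoothing via frequency-wise shrinkage (circulant case).
    yhat = np.fft.rfft(y)
    omega = 2 * np.pi * np.arange(len(yhat))
    return np.fft.irfft(yhat / (1 + lam * omega ** (2 * m)), n=n)

lams = np.logspace(-10, -2, 40)
T = np.zeros(len(lams))  # squared error at the data points, averaged
S = np.zeros(len(lams))  # residual mean square, averaged
for _ in range(reps):
    y = g + sigma * rng.standard_normal(n)
    for i, lam in enumerate(lams):
        f = smooth(y, lam)
        T[i] += np.mean((f - g) ** 2)
        S[i] += np.mean((f - y) ** 2)
T /= reps
S /= reps
i_star = int(np.argmin(T))  # empirical minimizer of ET(lambda)
```

In this experiment the minimizer of the estimated $ET(\lambda)$ sits in the interior of the grid, and the residual mean square there stays strictly below $\sigma^2$, mirroring the bound on $ES^*$.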