THE APPROXIMATION POWER OF MOVING LEAST-SQUARES

Mathematics of Computation, Volume 67, Number 224, October 1998 | David Levin
The paper presents a general method for achieving near-best approximations to functionals on \(\mathbb{R}^d\) from scattered data, based on the moving least-squares (MLS) method. Presented through the Backus-Gilbert approach, the MLS method is shown to be effective for interpolation, smoothing, and derivative approximation, and it applies to both univariate and multivariate problems. Key points include:

1. **Error Analysis**: The approximation error is bounded in terms of the error of a local best polynomial approximation, making the method "near-best."
2. **Interpolation**: The method provides a \(C^\infty\) function as an interpolation solution, and an approximation order result is proven for quasi-uniform sets of data points.
3. **Numerical Examples**: The paper includes numerical demonstrations of the method's performance in univariate and multivariate interpolation, smoothing, and derivative approximation. The examples show that the method achieves local, near-best approximations, with coefficients that decay exponentially away from the data points.
4. **Data-Dependent Approximants**: The paper suggests ways to introduce data-dependent approximants by using directional penalties, which can improve the approximation by giving more weight to directions of smaller function variation.

The MLS method is highlighted for its simplicity and effectiveness, providing a robust approach to approximating functionals from scattered data.
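To make the MLS idea concrete, the following is a minimal univariate sketch: at each evaluation point \(x\), fit a low-degree polynomial to the scattered data by weighted least squares, with weights that decay away from \(x\), and take the fitted polynomial's value at \(x\). The Gaussian weight function, the scale `h`, and the polynomial degree are illustrative choices, not the paper's specific parameters.

```python
import numpy as np

def mls_eval(x, data_x, data_f, h=0.5, degree=1):
    """Evaluate a moving least-squares approximant at x from scattered data.

    At each x, a degree-`degree` polynomial is fitted to (data_x, data_f)
    by weighted least squares, with weights concentrated near x; the
    approximant's value is the fitted polynomial evaluated at x.
    """
    # Gaussian weights: data points near x dominate, far points contribute little.
    w = np.exp(-((x - data_x) / h) ** 2)
    # Vandermonde matrix for the local polynomial basis 1, t, t^2, ...
    V = np.vander(data_x, degree + 1, increasing=True)
    # Solve the weighted least-squares problem for the local coefficients.
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(V * sw[:, None], data_f * sw, rcond=None)
    # np.polyval expects the highest-degree coefficient first, so reverse.
    return np.polyval(coef[::-1], x)

# Scattered samples of sin(t), evaluated between the data points.
xs = np.array([0.0, 0.3, 0.9, 1.4, 2.0, 2.7, 3.1])
fs = np.sin(xs)
approx = mls_eval(1.0, xs, fs)  # close to sin(1.0)
```

Because the local fit is a true least-squares polynomial fit, the approximant reproduces polynomials up to the chosen degree exactly, which is the mechanism behind the approximation-order results. With a smoothing weight like the Gaussian above the approximant smooths the data; weights that blow up at the data points instead force interpolation, as in the paper's interpolatory setting.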