January 5, 2024 | DAVID KRIEG, KATERINA POZHARSKA, MARIO ULLRICH, TINO ULLRICH
This paper constructs sampling projections onto arbitrary n-dimensional subspaces of the space of complex-valued bounded functions on a set D, using at most 2n samples and with norm of order √n. This improves upon Auerbach's lemma and provides a more explicit form of the Kadets-Snobar theorem for the uniform norm, with implications for optimal recovery in L_p spaces. The main result, Theorem 1, shows that there exist 2n points and 2n functions in the subspace such that the associated sampling projection has norm bounded by C√n for an absolute constant C. The proof relies on a discretization of the uniform norm, which also yields error bounds for least-squares approximation in the uniform norm. The results extend to L_p spaces and show that the linear sampling numbers are bounded by a factor of √n times the Kolmogorov numbers. The paper also discusses the implications of these results for optimal sampling recovery and the sharpness of the bounds.
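In formula form, the main result described above can be sketched as follows (notation assumed, not taken verbatim from the paper: B(D) denotes the bounded functions on D, V_n the given n-dimensional subspace, and C an absolute constant):

```latex
% Sketch of Theorem 1 as summarized in the abstract: there exist points
% t_1,\dots,t_{2n} \in D and functions \varphi_1,\dots,\varphi_{2n} \in V_n
% such that
\[
  Pf \;=\; \sum_{i=1}^{2n} f(t_i)\,\varphi_i
\]
% is a projection onto $V_n$ with
\[
  \lVert P \colon B(D) \to B(D) \rVert \;\le\; C\sqrt{n}.
\]
```

The stated consequence for recovery, again up to constants and possible index shifts, is that the linear sampling numbers g_n are dominated by √n times the Kolmogorov numbers d_n.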
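To make the least-squares mechanism concrete, here is a minimal numerical illustration, not the paper's construction: the uniform norm on D = [0, 1] is discretized at m = 2n sample points, and a function is approximated by least squares in the n-dimensional subspace of polynomials of degree below n. All concrete choices (the domain, equispaced points, the monomial basis, the test function) are illustrative assumptions.

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def least_squares_approx(f, n, m):
    """Least-squares fit of f at m equispaced points by a polynomial of degree < n."""
    ts = [i / (m - 1) for i in range(m)]
    # Normal equations (B^T B) coef = B^T f with B[i][j] = t_i^j.
    B = [[t ** j for j in range(n)] for t in ts]
    G = [[sum(B[i][a] * B[i][b] for i in range(m)) for b in range(n)]
         for a in range(n)]
    rhs = [sum(B[i][a] * f(ts[i]) for i in range(m)) for a in range(n)]
    coef = solve(G, rhs)
    return lambda x: sum(c * x ** j for j, c in enumerate(coef))

f = lambda x: math.exp(math.sin(2 * math.pi * x))  # illustrative target function
p = least_squares_approx(f, n=8, m=16)             # m = 2n samples, matching the paper's count
grid = [i / 2000 for i in range(2001)]
err = max(abs(f(x) - p(x)) for x in grid)          # discretized uniform-norm error
```

The point of the sketch is only the structure of the argument: a finite point set stands in for the uniform norm, and the least-squares operator built on those points is the object whose norm the paper controls.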