Universal Kernels for Multi-Task Learning


Andrea Caponnetto, Charles A. Micchelli, Massimiliano Pontil, Yiming Ying
This paper studies reproducing kernel Hilbert spaces (RKHS) of vector-valued functions, which are central to multi-task learning. The authors derive conditions ensuring that a kernel is universal, meaning that every continuous function with values in the output space can be uniformly approximated by sections of the kernel on every compact subset of the input space. They provide several characterizations of universal kernels and illustrate them with practical examples. The analysis relies on tools from functional analysis, in particular vector measures, and establishes that the density problem for RKHS of vector-valued functions is more subtle than in the scalar case. The paper is organized into sections covering the basics of operator-valued kernels, feature maps, and characterizations of universality, with attention to both theoretical and practical aspects.
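To make the approximation property precise, it can be stated as follows (the notation below is a standard rendering, not quoted from the paper): let $\mathcal{X}$ be the input space, $\mathcal{Y}$ a Hilbert output space, and $K : \mathcal{X} \times \mathcal{X} \to \mathcal{L}(\mathcal{Y})$ an operator-valued kernel whose sections are the functions $K(\cdot, x)y$ for $x \in \mathcal{X}$, $y \in \mathcal{Y}$. Then

\[
K \text{ is universal} \iff
\overline{\operatorname{span}}\,\bigl\{\, K(\cdot, x)y : x \in Z,\ y \in \mathcal{Y} \,\bigr\}
= C(Z, \mathcal{Y})
\quad \text{for every compact } Z \subseteq \mathcal{X},
\]

where $C(Z, \mathcal{Y})$ denotes the continuous $\mathcal{Y}$-valued functions on $Z$ and the closure is taken in the uniform norm $\|f\|_{\infty} = \sup_{x \in Z} \|f(x)\|_{\mathcal{Y}}$.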
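As a concrete illustration of an operator-valued kernel, below is a minimal sketch (not the authors' code; the function names k_rbf and gram and all parameters are illustrative) of the separable construction K(x, t) = k(x, t) B, a common multi-task choice in which a scalar kernel k couples inputs and a positive semidefinite matrix B couples tasks. Whether such a kernel is universal depends on the choice of k and B, which is the kind of question the paper's characterizations address.

import numpy as np

def k_rbf(x, t, gamma=1.0):
    """Scalar Gaussian (RBF) kernel between two input vectors."""
    return np.exp(-gamma * np.sum((x - t) ** 2))

def gram(X, B, gamma=1.0):
    """Block Gram matrix of K(x, t) = k(x, t) B for inputs X of shape (n, d)
    and a PSD task-coupling matrix B of shape (m, m); result is (n*m, n*m)."""
    n = X.shape[0]
    scalar_gram = np.array([[k_rbf(X[i], X[j], gamma) for j in range(n)]
                            for i in range(n)])
    # Each (i, j) block of the result is k(x_i, x_j) * B.
    return np.kron(scalar_gram, B)

# Example: 3 inputs in R^2, 2 tasks coupled by B.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0, 0.5], [0.5, 1.0]])  # PSD matrix coupling the two tasks
G = gram(X, B)
print(G.shape)  # (6, 6)

The Kronecker-product structure reflects the separability of this particular kernel; the paper's setting is more general, allowing kernels whose operator part varies with the inputs.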