This paper introduces a family of graph kernels based on regularization operators, generalizing regularization theory and Green's functions from continuous spaces to graphs. The proposed operators are invariant under vertex permutations, and diffusion kernels arise as a special case of the framework.
The paper begins by motivating kernel-based learning algorithms for discrete input spaces such as graphs. It then introduces the graph Laplacian as the discrete analogue of the Laplace operator on real-valued functions, defines an extended class of regularization operators, and shows that such operators must be functions of the Laplacian. Drawing an analogy between graph regularization and Green's functions on continuous spaces, it presents efficient methods for computing the resulting kernels.
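To make the construction concrete, the following is a minimal sketch (not the paper's own code; the adjacency matrix and the exact form of the spectral function r are illustrative) of building a kernel K = r(L)^(-1) from a function r of the Laplacian. Choosing r(lam) = exp(beta * lam) recovers the diffusion kernel K = exp(-beta * L), matching the paper's claim that diffusion kernels are a special case.

```python
import numpy as np

def graph_laplacian(W):
    """Unnormalized graph Laplacian L = D - W for a symmetric adjacency matrix W."""
    D = np.diag(W.sum(axis=1))
    return D - W

def regularization_kernel(W, r):
    """Kernel K = r(L)^(-1), built by applying a spectral function r to the Laplacian.

    Each eigenvalue lam of L is mapped to 1 / r(lam), so eigenvectors that r
    penalizes heavily (non-smooth, high-frequency ones) contribute little to K.
    """
    L = graph_laplacian(W)
    lam, V = np.linalg.eigh(L)  # L is symmetric, so eigh applies
    return V @ np.diag(1.0 / r(lam)) @ V.T

# Diffusion kernel as the special case r(lam) = exp(beta * lam), i.e. K = exp(-beta * L).
beta = 0.5
W = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # a 3-vertex path graph (illustrative)
K_diffusion = regularization_kernel(W, lambda lam: np.exp(beta * lam))
```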
The paper then examines the spectra of the graph Laplacian and its normalized version, presenting two theorems. The first states that the normalized Laplacian is symmetric and positive semidefinite, with all eigenvalues lying in the interval [0, 2]. The second characterizes the Laplacian and normalized Laplacian of regular graphs; in particular, for a d-regular graph the two matrices differ only by the scaling factor d, so they share eigenvectors and their eigenvalues agree up to that factor.
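As a quick numerical illustration of the first theorem (a sketch; the random graph below is arbitrary and the construction assumes no isolated vertices), one can verify that the eigenvalues of the normalized Laplacian fall within [0, 2]:

```python
import numpy as np

rng = np.random.default_rng(0)

# Random symmetric 0/1 adjacency matrix with no self-loops (illustrative only).
n = 50
A = rng.integers(0, 2, size=(n, n))
W = np.triu(A, k=1).astype(float)
W = W + W.T

d = W.sum(axis=1)
L = np.diag(d) - W                      # unnormalized Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # assumes every vertex has degree > 0
L_norm = D_inv_sqrt @ L @ D_inv_sqrt    # normalized Laplacian

lam = np.linalg.eigvalsh(L_norm)
assert np.all(lam >= -1e-10) and np.all(lam <= 2 + 1e-10)
print(f"eigenvalue range: [{lam.min():.4f}, {lam.max():.4f}]")
```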
The paper concludes by discussing the relationship between the graph Laplacian and the Laplace operator on continuous spaces, noting that both quantify how much a function varies locally, or how smooth it is over its domain. It closes by emphasizing the importance of regularization theory in understanding and learning from graph-structured data.
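The smoothness interpretation can be made concrete with the standard identity f^T L f = (1/2) * sum_{i,j} w_ij * (f_i - f_j)^2, which holds for the unnormalized Laplacian: the quadratic form is zero for constant functions and grows as the function varies across edges. A small sketch (the triangle graph and test functions are illustrative):

```python
import numpy as np

# f^T L f measures the "Dirichlet energy" of f on the graph:
# it sums squared differences of f across edges, the discrete
# analogue of how the Laplace operator quantifies local variation.
W = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # triangle graph
L = np.diag(W.sum(axis=1)) - W

f_smooth = np.array([1.0, 1.0, 1.0])   # constant function -> zero energy
f_rough  = np.array([1.0, -1.0, 1.0])  # sign flips across edges -> high energy

print(f_smooth @ L @ f_smooth)  # 0.0
print(f_rough  @ L @ f_rough)   # 8.0
```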