2008 | S.V.N. Vishwanathan, Karsten M. Borgwardt, Imre Risi Kondor, Nicol N. Schraudolph
This paper presents a unified framework for graph kernels that subsumes several special cases, including random walk, marginalized, and geometric graph kernels. The framework rests on extending linear algebra to Reproducing Kernel Hilbert Spaces (RKHS) and on a reduction to a Sylvester equation, yielding an algorithm that cuts the time complexity of kernel computation from O(n^6) to O(n^3). For sparse graphs, conjugate gradient solvers or fixed-point iterations bring the computation into the sub-cubic domain. Experiments on graphs from bioinformatics and other domains show that the algorithm is often more than a thousand times faster than previous approaches.
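The kernel being accelerated here is the geometric random walk kernel k(G, G') = q^T (I - lam*W)^{-1} p, where W = A1 (x) A2 is the weight matrix of the direct product graph. A minimal sketch of the dense baseline computation, assuming small unlabeled undirected graphs and uniform start/stop distributions (the specific graphs and decay factor below are illustrative, not from the paper):

```python
import numpy as np

# Two small undirected graphs as symmetric adjacency matrices (assumed inputs).
A1 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]], dtype=float)  # triangle
A2 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]], dtype=float)  # path on 3 nodes

n1, n2 = A1.shape[0], A2.shape[0]
lam = 0.1  # decay factor; needs lam * spectral_radius(W) < 1 for the series to converge

# Weight matrix of the direct product graph: one step on W corresponds to
# simultaneous steps of a walk on A1 and a walk on A2.
W = np.kron(A1, A2)

# Uniform start distribution p and stopping weights q on the product graph.
p = np.full(n1 * n2, 1.0 / (n1 * n2))
q = np.ones(n1 * n2)

# k(G1, G2) = q^T (I - lam*W)^{-1} p: solving this n^2 x n^2 linear system
# naively is the O(n^6) baseline that the paper's methods accelerate.
k = q @ np.linalg.solve(np.eye(n1 * n2) - lam * W, p)
print(k)
```

Inverting I - lam*W sums the geometric series over walks of all lengths, which is why lam must damp the spectral radius of W.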
The paper explores connections between diffusion kernels, regularization on graphs, and graph kernels, and proposes new graph kernels based on these connections. It also shows that rational kernels, when specialized to graphs, reduce to the random walk graph kernel.
The paper discusses the computation of graph kernels using three efficient methods: reduction to a Sylvester equation, conjugate gradient solvers, and fixed-point iterations. These methods are shown to significantly reduce the computational complexity of kernel computation, especially for large and sparse graphs.
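The fixed-point method can be sketched as follows: using the identity vec(A2 M A1) = (A1 (x) A2) vec(M) for symmetric A1, the product-graph linear system is iterated entirely with n-by-n matrix products, at O(n^3) per iteration (less when the adjacency matrices are sparse), without ever forming the Kronecker product. The graphs, decay factor, and uniform distributions are illustrative assumptions carried over from the sketch above:

```python
import numpy as np

# Illustrative symmetric adjacency matrices (assumed inputs).
A1 = np.array([[0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]], dtype=float)  # triangle
A2 = np.array([[0, 1, 0],
               [1, 0, 1],
               [0, 1, 0]], dtype=float)  # path on 3 nodes
n1, n2 = A1.shape[0], A2.shape[0]
lam = 0.1  # decay factor, small enough for the iteration to converge

# Start probabilities reshaped to an n2 x n1 matrix, so that the
# column-major vec(P) is the start vector on the product graph.
P = np.full((n2, n1), 1.0 / (n1 * n2))

# Fixed-point iteration: vec(M) = vec(P) + lam * (A1 kron A2) vec(M),
# rewritten as M = P + lam * A2 @ M @ A1 (valid since A1 is symmetric).
M = P.copy()
for _ in range(1000):
    M_next = P + lam * (A2 @ M @ A1)
    if np.max(np.abs(M_next - M)) < 1e-12:  # stop once the iterate stabilizes
        M = M_next
        break
    M = M_next

k = M.sum()  # q^T vec(M) with uniform stopping weights q = 1
print(k)
```

The same vec trick underlies the Sylvester-equation route; the fixed-point variant simply trades a direct solve for cheap repeated matrix products.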
Experiments on real-world datasets, including molecular compounds and protein interaction networks, demonstrate the effectiveness of the proposed methods. The results show that the proposed methods are significantly faster than the direct approach and achieve high prediction accuracy on benchmark datasets. The paper also introduces a composite graph kernel that combines the original graph kernel with its complement, leading to improved performance in comparing co-integrated gene expression and protein interaction networks.
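The composite-kernel idea can be sketched as adding the base kernel evaluated on the complement graphs to the base kernel on the originals, so that absent edges also carry signal (useful for sparse interaction networks, where non-interactions are informative). The `rw_kernel` helper below is a small dense random walk kernel used purely for illustration, not the paper's production implementation:

```python
import numpy as np

def rw_kernel(A1, A2, lam=0.05):
    """Geometric random walk kernel computed directly on the product graph.

    Dense variant, adequate only for small illustrative graphs.
    """
    n1, n2 = A1.shape[0], A2.shape[0]
    W = np.kron(A1, A2)                     # direct product graph
    p = np.full(n1 * n2, 1.0 / (n1 * n2))   # uniform start distribution
    q = np.ones(n1 * n2)                    # uniform stopping weights
    return q @ np.linalg.solve(np.eye(n1 * n2) - lam * W, p)

def complement(A):
    """Adjacency matrix of the complement graph (same nodes, no self-loops)."""
    n = A.shape[0]
    return np.ones((n, n)) - np.eye(n) - A

def composite_kernel(A1, A2, lam=0.05):
    # k_comp(G, G') = k(G, G') + k(G-bar, G'-bar): edges missing from both
    # graphs, which a plain walk kernel ignores, contribute via the
    # complement term.
    return rw_kernel(A1, A2, lam) + rw_kernel(complement(A1), complement(A2), lam)

A1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)  # triangle
A2 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path on 3 nodes
print(composite_kernel(A1, A2))
```

Since both terms are positive semidefinite kernels, their sum remains a valid kernel.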