Choosing Multiple Parameters for Support Vector Machines

2002 | OLIVIER CHAPELLE, VLADIMIR VAPNIK, OLIVIER BOUSQUET, SAYAN MUKHERJEE
The paper addresses the problem of automatically tuning multiple parameters for Support Vector Machines (SVMs) to improve their generalization performance. The authors propose a minimax approach that maximizes the margin over the hyperplane coefficients and minimizes an estimate of the generalization error over the set of kernel parameters using gradient descent. They evaluate several error estimates, including validation error, leave-one-out error, and bounds derived from theoretical analysis. The method is demonstrated on various databases, showing significant improvements in computational efficiency and generalization performance compared to exhaustive search methods. The paper also discusses the application of the method to feature selection and the optimization of scaling factors in kernel functions.
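To make the minimax idea concrete, below is a minimal sketch of gradient descent on a kernel parameter that minimizes the radius-margin bound T = R²‖w‖² (one of the error estimates the paper evaluates). This is not the authors' implementation: it assumes an RBF kernel with a single width parameter gamma, uses scikit-learn's SVC, substitutes a finite-difference gradient for the paper's analytic derivatives, and replaces the small QP the paper solves for the enclosing-sphere radius R² with a crude centroid-based estimate.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

def rbf_kernel(X, gamma):
    """Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def radius_margin_bound(X, y, log_gamma, C=10.0):
    """T = R^2 * ||w||^2, the radius-margin bound on the LOO error
    (up to a 1/n factor). Inner maximization of the margin is done
    by the SVM solver; the outer variable is log(gamma)."""
    gamma = np.exp(log_gamma)
    K = rbf_kernel(X, gamma)
    svm = SVC(C=C, kernel="precomputed").fit(K, y)
    # ||w||^2 = sum_ij alpha_i alpha_j y_i y_j K_ij;
    # sklearn's dual_coef_ already stores y_i * alpha_i.
    a = np.zeros(len(y))
    a[svm.support_] = svm.dual_coef_.ravel()
    w2 = a @ K @ a
    # Crude stand-in for R^2: mean squared distance to the centroid
    # in feature space (the paper solves a QP for the exact radius).
    R2 = np.mean(np.diag(K)) - np.mean(K)
    return R2 * w2

# Outer loop: gradient descent on log(gamma), with a central
# finite-difference gradient standing in for the analytic one.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
log_gamma, lr, eps = np.log(0.1), 0.05, 1e-4
for step in range(30):
    g = (radius_margin_bound(X, y, log_gamma + eps)
         - radius_margin_bound(X, y, log_gamma - eps)) / (2 * eps)
    log_gamma -= lr * g
print("tuned gamma:", np.exp(log_gamma))
```

Optimizing in log(gamma) keeps the kernel width positive; with many scaling factors (one per feature, as in the paper's feature-selection experiments), the same loop applies with a gradient vector instead of a scalar, which is exactly where gradient descent beats exhaustive grid search.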