This paper presents a new approach for multi-class pattern recognition using support vector machines (SVMs) and linear programming machines. The authors propose a formulation that allows multi-class problems to be solved in a single optimization, rather than using multiple binary classifiers. This approach is shown to reduce the number of support vectors and kernel calculations needed compared to traditional methods like one-vs-rest and one-vs-one.
The paper describes a new k-class SVM formulation that directly optimizes for multi-class classification, generalizing the binary SVM optimization problem to handle multiple classes in a single training procedure. The decision function assigns each input to the class whose linear function attains the maximum value, and the optimization problem is solved via its dual variables. The authors also present an analogous generalization of linear programming machines, which similarly reduces the number of parameters needed.
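The decision rule described above, assigning an input to the class whose linear function scores highest, can be sketched as follows. This is a minimal illustration of the argmax rule only; the weight vectors and biases here are hypothetical stand-ins, not the output of the paper's optimization:

```python
import numpy as np

def multiclass_decision(x, W, b):
    """Assign x to the class m with the largest score w_m . x + b_m.

    W : (k, d) array, one weight vector per class
    b : (k,) array, one bias per class
    x : (d,) input vector
    """
    scores = W @ x + b          # one score per class
    return int(np.argmax(scores))

# Toy example with k = 3 classes in d = 2 dimensions (illustrative values).
W = np.array([[ 1.0,  0.0],
              [-1.0,  0.0],
              [ 0.0,  1.0]])
b = np.zeros(3)

print(multiclass_decision(np.array([2.0, 0.5]), W, b))  # prints 0
```

In contrast, a one-vs-rest scheme would train the k classifiers independently; the single-optimization formulation couples them, which is what allows the reduction in support vectors the paper reports.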
The paper compares the performance of the new methods with traditional binary SVM approaches on benchmark datasets. The results show that the new methods achieve lower error rates and reduce the number of support vectors and kernel calculations. However, the paper also notes that the new methods may not always outperform traditional methods in terms of generalization ability.
The authors conclude that their new methods provide a more efficient way to solve multi-class pattern recognition problems. They also note that the methods can be used to construct examples where traditional methods fail, but this has not been reflected in the error rates on the datasets used. The paper acknowledges that the methods were independently derived by V. Vapnik and V. Blanz.