This paper introduces a Support Vector Machine (SVM) approach to optimizing multivariate non-linear performance measures, such as the $F_1$-score, Precision/Recall Breakeven Point (PRBEP), Precision at $k$ (Prec@$k$), and ROCArea. The conventional classification SVM is a special case of this method when using error rate as the performance measure. The proposed method formulates the learning problem as a multivariate prediction task, allowing for the optimization of a wide range of performance measures that can be computed from the contingency table. The training problem is solved in polynomial time using a sparse approximation algorithm adapted from structural SVMs. The method is evaluated on various datasets, showing improved performance, particularly for text classification tasks with highly unbalanced classes. The experiments demonstrate that the proposed approach outperforms or matches the performance of a classification SVM with a linear cost model in most cases.
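To make the class of measures concrete, the following minimal sketch (illustrative only, not code from the paper) computes a binary contingency table and two of the measures mentioned above, $F_1$ and Prec@$k$. The function names and the $\{-1,+1\}$ label convention are assumptions introduced for this example.

```python
import numpy as np

def contingency_table(y_true, y_pred):
    """Count TP, FP, FN, TN for binary labels in {-1, +1}."""
    tp = int(np.sum((y_pred == 1) & (y_true == 1)))
    fp = int(np.sum((y_pred == 1) & (y_true == -1)))
    fn = int(np.sum((y_pred == -1) & (y_true == 1)))
    tn = int(np.sum((y_pred == -1) & (y_true == -1)))
    return tp, fp, fn, tn

def f1_score(tp, fp, fn, tn):
    """F1 = 2*TP / (2*TP + FP + FN), a function of the contingency table alone."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom > 0 else 0.0

def precision_at_k(y_true, scores, k):
    """Fraction of true positives among the k highest-scoring examples."""
    top_k = np.argsort(-scores)[:k]
    return float(np.mean(y_true[top_k] == 1))
```

Because $F_1$ and Prec@$k$ depend on the whole set of predictions rather than summing per-example errors, they are non-linear in the individual decisions, which is why the paper treats the full label vector as a single multivariate prediction.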