On Fast SVM Algorithms Used For Pattern Recognition

Title: On Fast SVM Algorithms Used For Pattern Recognition

Authors: Bastos, Felipe A. C. de; Campos, Marcello L. R. de

Abstract: This tutorial on fast Support Vector Machines (SVM) presents mathematical formulations and pseudo-code implementations of three algorithms used for fast SVM training. Traditional SVM training is a quadratic-programming (QP) minimization problem that can be solved, e.g., using the Sequential Minimal Optimization (SMO) algorithm. SMO analytically solves a small QP subproblem in each iteration, drastically reducing the training time needed by conventional QP optimizers. It is important to note that traditional SVM can be of two types, L1SVM and L2SVM, depending on how the training error is characterized in the SVM mathematical formulation. The SMO implementation presented in this tutorial applies only to the L1SVM, but it can be adapted to the L2SVM case. The Proximal SVM (PSVM) algorithm was also introduced as a fast alternative to traditional SVM classifiers, which usually require a large amount of computation time for training. Unfortunately, the PSVM algorithm may present poor performance due to biased optimal hyperplanes. The Unbiased Proximal SVM (UPSVM) algorithm uses a slightly different approach to circumvent this problem, such that an unbiased optimal hyperplane is always obtained. The results obtained show that the UPSVM algorithm outperforms the SMO algorithm with respect to training time, with similar or better probability of correct pattern classification. The UPSVM algorithm also outperforms the PSVM algorithm with respect to probability of correct pattern classification (especially for low values of the regularization parameter), training time, and number of floating-point operations.
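For concreteness, the distinction between the two SVM types mentioned in the abstract can be sketched with the standard primal formulations from the SVM literature; the notation (weights w, bias b, slacks xi_i, regularization parameter C) is assumed here, not taken from the paper:

\begin{aligned}
\text{L1SVM:}\quad & \min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{N}\xi_i, \\
\text{L2SVM:}\quad & \min_{w,\,b,\,\xi}\ \tfrac{1}{2}\|w\|^2 + \tfrac{C}{2}\sum_{i=1}^{N}\xi_i^2, \\
\text{subject to}\quad & y_i\left(w^{\top}x_i + b\right) \ge 1 - \xi_i,\qquad \xi_i \ge 0,\qquad i = 1,\dots,N.
\end{aligned}

The L1SVM penalizes the training errors (slack variables) linearly, while the L2SVM penalizes their squares; this changes the dual problem, which is why the SMO implementation must be adapted between the two cases.

As an illustration of why proximal methods are fast, below is a minimal sketch of a linear PSVM trainer, assuming the standard proximal formulation in which the classifier follows from a single linear system rather than a QP; the function name and variable names are illustrative, not from the paper:

import numpy as np

def psvm_train(X, y, nu):
    """Linear PSVM sketch: solve (I/nu + E^T E) u = E^T y, with E = [X, -1].

    X:  (n, d) matrix of training patterns.
    y:  (n,) vector of labels in {-1, +1}.
    nu: regularization parameter (assumed notation).
    """
    n, d = X.shape
    E = np.hstack([X, -np.ones((n, 1))])   # augment patterns with a bias column
    A = np.eye(d + 1) / nu + E.T @ E       # small (d+1) x (d+1) system matrix
    u = np.linalg.solve(A, E.T @ y)        # one linear solve instead of a QP
    w, gamma = u[:-1], u[-1]
    return w, gamma                        # classify with sign(X @ w - gamma)

Note that in this formulation the bias term gamma is regularized together with w, which is consistent with the biased optimal hyperplanes that the abstract attributes to PSVM and that the UPSVM variant is designed to avoid.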

Keywords: Support Vector Machines (SVM); fast algorithms; pattern recognition

Pages: 16

DOI: 10.21528/lmln-vol4-no2-art1

PDF article: vol4-no2-art1.pdf

BibTeX file: vol4-no2-art1.bib