##### Department of Mathematics,

University of California San Diego

****************************

### Center for Computational Mathematics Seminar

## Olvi Mangasarian

#### UCSD

## Proximal Support Vector Machine Classification

##### Abstract:

Instead of a standard support vector machine (SVM) that classifies points by assigning them to one of two disjoint halfspaces, points are classified by assigning them to the closest of two parallel planes (in input or feature space) that are pushed apart as far as possible. This formulation, which can also be interpreted as regularized least squares, leads to an extremely fast and simple algorithm for generating a linear or nonlinear classifier that merely requires the solution of a single system of linear equations. In contrast, standard SVMs solve a quadratic or linear program that requires considerably longer computational time. Computational results on publicly available datasets indicate that the proposed proximal SVM classifier has test-set correctness comparable to that of standard SVM classifiers, with computational time that can be an order of magnitude faster. The linear proximal SVM can easily handle large datasets, as indicated by the classification of a 2-million-point, 10-attribute dataset in 20.8 seconds. All computational results are based on six lines of MATLAB code.
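The single-linear-system formulation the abstract describes can be sketched as follows. This is an illustrative reconstruction in NumPy rather than the speaker's MATLAB code; the parameter name `nu` and the synthetic data are assumptions. For a data matrix A with labels d in {+1, -1}, the linear proximal SVM solves (I/ν + E'E) z = E'De with E = [A, -e] and D = diag(d), giving z = [w; γ], and classifies a point x by sign(x·w − γ).

```python
import numpy as np

def proximal_svm(A, d, nu=1.0):
    """Linear proximal SVM sketch (one linear solve).

    A  : (m, n) data matrix, one point per row
    d  : (m,) labels in {+1, -1}
    nu : fit-vs-regularization trade-off (illustrative name)
    """
    m, n = A.shape
    e = np.ones(m)
    E = np.hstack([A, -e[:, None]])      # E = [A  -e]
    rhs = E.T @ d                        # E' D e  (D e = d)
    M = np.eye(n + 1) / nu + E.T @ E     # I/nu + E'E
    z = np.linalg.solve(M, rhs)          # the single linear system
    w, gamma = z[:n], z[n]
    return w, gamma

# Usage on well-separated synthetic two-class data.
rng = np.random.default_rng(0)
A = np.vstack([rng.normal(2.0, 0.5, (50, 2)),
               rng.normal(-2.0, 0.5, (50, 2))])
d = np.hstack([np.ones(50), -np.ones(50)])
w, gamma = proximal_svm(A, d)
accuracy = np.mean(np.sign(A @ w - gamma) == d)
```

Because the fit reduces to solving an (n+1)×(n+1) system, its cost is dominated by forming E'E, which is linear in the number of points m; this is what makes very large datasets tractable.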

### January 26, 2010

### 10:00 AM

### AP&M 2402

****************************