MSVMpack: a Multi-class Support Vector Machine package

by F. Lauer, Y. Guermeur and E. Didiot, members of the ABC research team at LORIA

MSVMpack is an open source package dedicated to multi-class support vector machines: SVMs which can handle classification problems with more than two classes without relying on decomposition methods. The aim is to provide a unified framework and implementation for all the different M-SVM models in a single package.
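All of these M-SVM models share the same multi-class decision rule: the machine computes one score per class and predicts the class with the highest score. A minimal Python sketch of this rule, using hypothetical linear per-class scores (illustrative only, not MSVMpack's API):

```python
import numpy as np

def msvm_predict(X, W, b):
    """Multi-class SVM decision rule: one score per class,
    predict the class with the highest score (argmax).
    X: (n, d) data, W: (Q, d) per-class weights, b: (Q,) biases.
    All names here are illustrative, not part of the MSVMpack API."""
    scores = X @ W.T + b          # (n, Q) matrix of class scores
    return np.argmax(scores, axis=1)

# Toy example with Q = 3 classes in d = 2 dimensions
W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
b = np.zeros(3)
X = np.array([[2.0, 0.1], [0.1, 2.0], [-2.0, -2.0]])
print(msvm_predict(X, W, b))  # -> [0 1 2]
```

Unlike one-vs-all or one-vs-one decomposition, the Q score functions of an M-SVM are trained jointly in a single optimization problem.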

MSVMpack is available for Linux, Mac OS X and Windows as a set of command-line tools for training and testing M-SVMs together with a C API and a Matlab interface.

A web server is also included to offer a platform-independent GUI through a set of simple web pages. This means that training and testing can also be easily controlled from any computer (or even tablet or smartphone) with network access to a Linux host running MSVMpack.



Documentation (last update: July 3, 2014)

This software is freely available for non-commercial use under the terms of the GPL. Please use the following reference to cite MSVMpack:

F. Lauer and Y. Guermeur, MSVMpack: a Multi-Class Support Vector Machine Package, Journal of Machine Learning Research, 12:2269-2272, 2011.

BibTeX entry:

@article{Lauer2011,
	author = {F. Lauer and Y. Guermeur},
	title = {{MSVMpack}: a Multi-Class Support Vector Machine Package},
	journal = {Journal of Machine Learning Research},
	volume = {12},
	pages = {2269--2272},
	year = {2011},
	note = {\url{}}
}



Install and setup

See the README file for a description of the files included in the package.
Full documentation is available in the doc/ sub-directory.

Basic installation on Linux and Mac OS X

cd MSVMpack1.5
make
sudo make install (optional)

Basic installation on Windows

Extract the archive in a directory, say DIR
Add the directory DIR\MSVMpack1.5\Windows\bin to your PATH.
To do this, you can set the PATH environment variable from an MS-DOS command-line window (for instance with the setx command).
Then, close the MS-DOS window and start a new one for the changes to take effect.

Setup of the web server (Linux only)

Start the server with

msvmserver start

Access the web page served by msvmserver in a browser.

Follow the link 'setup the server from the admin page' and log in as

admin (password: admin)

Click the 'Save settings and start using MSVMpack server' button and go to the home page where you can start training and testing models.

Quick start

With the command line

Train an M-SVM with

trainmsvm myTrainingData

Classify a dataset with

predmsvm myTestData

With the web interface

Start the server (see above) and open the MSVMpack web page in a browser.

Choose a training data file (e.g. iris.train) and a kernel function (e.g. Gaussian RBF)

Click the "Start training" button

Click on the "WATCH" link to follow training in real time.

With Matlab

Add the MSVMpack toolbox directory to Matlab's path (for instance with addpath), then run the demo script included in the toolbox.

Optional parameters

For trainmsvm

	trainmsvm training_file [file.model] [params]
 or	trainmsvm [file.model]

 where 'params' is a list of parameters specified as e.g.: 
	-c 2.5 -k 1 

Optional parameters: 
 -m	: model type (WW,CS,LLW,MSVM2) (default is MSVM2)
 -k	: nature of the kernel function (default is 1)
 	  1 -> linear kernel       k(x,z) = x^T z
 	  2 -> Gaussian RBF kernel k(x,z) = exp(-||x - z||^2 / 2*par^2)
 	  3 -> homogeneous polynomial kernel k(x,z) = (x^T z)^par 
 	  4 -> non-homo. polynomial kernel k(x,z) = (x^T z + 1)^par 
 	  5 -> custom kernel 1
 	  6 -> custom kernel 2
 	  7 -> custom kernel 3
 -p	: kernel parameter par (default is sqrt(5*dim(x)) (RBF) or 2 (POLY))
 -P     : list of space-separated kernel parameters, starting with the number of parameters
 -c	: soft-margin (or trade-off) hyperparameter (default C = 10.0)
 -C	: list of space-separated hyperparameters C_k (default C_k = C)
 -cv k	: perform k-fold cross validation
 -n	: use normalized data for training (mean=0 and std=1 for all features)
 -u	: use unnormalized data for training (bypass the normalization test) 
 -x	: maximal cache memory size in MB (default is max available)
 -t	: number of working threads (default is the number of CPUs)
 -f	: faster computations with single-precision floating-point data format
 -I	: faster computations with integer data format
 -i	: faster computations with short integer data format
 -B	: faster computations with byte data format
 -b	: faster computations with bit data format
 -o	: optimization method: 0 -> Frank-Wolfe (default), 1 -> Rosen
 -w	: size of the chunk (default is 10 for WW and 4 for others)
 -a	: optimization accuracy within [0,1] (default is 0.98)
 -r	: retrain model (resume training)
 -s	: save model using sparse format
 -S	: convert existing model to sparse format without training
 -q	: be quiet
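
As an illustration, the kernel functions listed above (types 1 to 4, with the default parameters of the -p option) can be sketched in Python; this is not MSVMpack's internal implementation:

```python
import numpy as np

def kernel(x, z, k=1, par=None):
    """Kernel functions as defined in the trainmsvm help above (k = 1..4);
    an illustrative sketch, not MSVMpack's internal C implementation."""
    if par is None:
        # defaults of the -p option: sqrt(5*dim(x)) for RBF, 2 for polynomial
        par = np.sqrt(5 * len(x)) if k == 2 else 2
    if k == 1:   # linear:                  k(x,z) = x^T z
        return x @ z
    if k == 2:   # Gaussian RBF:            k(x,z) = exp(-||x-z||^2 / (2 par^2))
        return np.exp(-np.sum((x - z) ** 2) / (2 * par ** 2))
    if k == 3:   # homogeneous polynomial:  k(x,z) = (x^T z)^par
        return (x @ z) ** par
    if k == 4:   # non-homo. polynomial:    k(x,z) = (x^T z + 1)^par
        return (x @ z + 1) ** par
    raise ValueError("custom kernels (5-7) are user-defined")

x = np.array([1.0, 2.0])
z = np.array([3.0, 4.0])
print(kernel(x, z, k=1))  # 11.0
print(kernel(x, z, k=4))  # (11 + 1)^2 = 144.0
```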

These optional files can be specified on the command line:
 file.test	: Test data file
 file.outputs	: Computed outputs on the test set
 file.alpha	: Backup file for alpha
 file.init	: Initialization file for alpha 
 file.alpha	: Save alpha in matrix format
 file.log	: Log optimization information
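
The normalization applied by the -n option above (mean 0 and standard deviation 1 for every feature) amounts to the following sketch, assuming each feature is standardized independently:

```python
import numpy as np

def normalize(X):
    """Feature-wise standardization as described for the -n option:
    zero mean and unit standard deviation for every feature.
    (Illustrative sketch; MSVMpack performs this internally.)"""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0            # leave constant features unchanged
    return (X - mean) / std

X = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
Xn = normalize(X)
print(Xn.mean(axis=0))  # ~[0, 0]
print(Xn.std(axis=0))   # ~[1, 1]
```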

For predmsvm

	predmsvm data_file [MSVM.model] [pred.outputs] [-t N]

computes the output of an M-SVM on the given data_file
using N processors (default N = number of CPUs on this computer).

	predmsvm -i [MSVM.model] 

prints model information.
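
The -t N option can be pictured as splitting the test set into N chunks scored in parallel; a hypothetical Python sketch (score_chunk is a stand-in for the per-example M-SVM output computation, not MSVMpack code):

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def score_chunk(chunk):
    # Stand-in for the real per-example M-SVM output computation
    return [float(x.sum()) for x in chunk]

def parallel_predict(X, n_workers=2):
    """Split the data into n_workers chunks and score them in
    parallel, preserving the original example order."""
    chunks = np.array_split(X, n_workers)
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        parts = ex.map(score_chunk, chunks)   # results come back in order
    return [s for part in parts for s in part]

X = np.arange(8, dtype=float).reshape(4, 2)
print(parallel_predict(X, n_workers=2))  # [1.0, 5.0, 9.0, 13.0]
```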


Acknowledgments

Thanks go to Fabienne Thomarat, Rémi Bonidal, Nicolas Wicker and Gertjan van den Burg for their helpful comments and bug reports.


References

F. Lauer and Y. Guermeur,
MSVMpack: a Multi-Class Support Vector Machine Package,
Journal of Machine Learning Research, 12:2269-2272, 2011.

J. Weston and C. Watkins,
Multi-class support vector machines,
Technical Report CSD-TR-98-04, Royal Holloway, University of London, 1998.

K. Crammer and Y. Singer,
On the Algorithmic Implementation of Multiclass Kernel-based Vector Machines,
Journal of Machine Learning Research, 2:265-292, 2001.

Y. Lee, Y. Lin, and G. Wahba,
Multicategory support vector machines: Theory and application to the classification of microarray data and satellite radiance data,
Journal of the American Statistical Association, 99(465):67-81, 2004.

Y. Guermeur and E. Monfrini,
A quadratic loss multi-class SVM for which a radius-margin bound applies,
Informatica, 22(1):73-96, 2011.