Hello (kernel) Learning!

Let’s start with a very simple example: a classification task solved with a linear version of the Passive Aggressive algorithm. The full code of this example can be found in the GitHub repository kelp-full, in particular in the source file HelloLearning.java.

The datasets are the ones distributed as examples with SVMlight, modified so that they can be read by KeLP. In KeLP, each row must indicate what kind of vectors you are using, sparse or dense. The SVMlight datasets contain sparse vectors, so if you open the train.dat and test.dat files you will notice that each vector is enclosed in BeginVector (|BV|) and EndVector (|EV|) tags.
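For instance, a single training row might look like the following (the feature indices and values here are purely illustrative):

```
+1 |BV| 1:0.43 3:0.12 784:0.2 |EV|
```

The leading +1 is the class label, while the index:value pairs enclosed in the |BV| and |EV| tags form the sparse vector, exactly as in the SVMlight format.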

To run the following example, it is sufficient to add the online-large-margin Maven dependency to your project.
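As a sketch, the dependency declaration should look like the following; the groupId shown here is the one used by the KeLP project, while the version is a placeholder to replace with the release you are actually using:

```xml
<dependency>
	<groupId>it.uniroma2.sag.kelp</groupId>
	<artifactId>online-large-margin</artifactId>
	<version>X.Y.Z</version> <!-- replace with the KeLP release you are using -->
</dependency>
```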

This example will consider a dataset composed of:

  • Training set (2000 examples, 1000 of class “+1” (positive), and 1000 of class “-1” (negative))
  • Test set (600 examples, 300 of class “+1” (positive), and 300 of class “-1” (negative))

Let’s start writing some Java code.

First of all, we need to load the datasets into memory and define the positive class of the classification problem.
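A sketch of this step uses KeLP’s SimpleDataset and StringLabel classes; the file paths are illustrative, and the snippet is assumed to live in a main method declaring throws Exception, since populate can throw:

```java
import it.uniroma2.sag.kelp.data.dataset.SimpleDataset;
import it.uniroma2.sag.kelp.data.label.StringLabel;

// Load the training set from file
SimpleDataset trainingSet = new SimpleDataset();
trainingSet.populate("train.dat");

// Load the test set from file
SimpleDataset testSet = new SimpleDataset();
testSet.populate("test.dat");

// Define the positive class of the binary classification problem
StringLabel positiveClass = new StringLabel("+1");
```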

If you want, you can print some statistics about the datasets through some useful built-in methods.
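For instance, SimpleDataset exposes counting methods along the following lines (method names may differ slightly across KeLP versions):

```java
// Print some basic statistics about the loaded training set
System.out.println("Training set examples: " + trainingSet.getNumberOfExamples());
System.out.println("Positive examples: " + trainingSet.getNumberOfPositiveExamples(positiveClass));
System.out.println("Negative examples: " + trainingSet.getNumberOfNegativeExamples(positiveClass));
```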

Then, instantiate a new Passive Aggressive algorithm and set some parameters on it.
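A possible instantiation is sketched below; the representation identifier "0" and the value of the aggressiveness parameter are illustrative assumptions, and the package path should be double-checked against your KeLP version:

```java
import it.uniroma2.sag.kelp.learningalgorithm.classification.passiveaggressive.LinearPassiveAggressiveClassification;

// Instantiate the linear Passive Aggressive algorithm
LinearPassiveAggressiveClassification passiveAggressive = new LinearPassiveAggressiveClassification();
// Tell the algorithm which vector representation it must operate on
passiveAggressive.setRepresentation("0");
// Set the label to be learned
passiveAggressive.setLabel(positiveClass);
// Set the aggressiveness (regularization) parameter
passiveAggressive.setC(1f);
```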

Learn a model on the trainingSet, obtaining a Classifier.
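In KeLP the learning step and the retrieval of the learned prediction function can be sketched as follows (the Classifier import path is an assumption to verify against your KeLP version):

```java
import it.uniroma2.sag.kelp.predictionfunction.classifier.Classifier;

// Learn a model from the training set
passiveAggressive.learn(trainingSet);
// Retrieve the learned prediction function
Classifier classifier = passiveAggressive.getPredictionFunction();
```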

Finally, we classify each example in the test set and compute some performance measures.
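The evaluation loop can be sketched with KeLP’s BinaryClassificationEvaluator; the exact import paths and evaluator method names may vary slightly across KeLP versions:

```java
import it.uniroma2.sag.kelp.data.example.Example;
import it.uniroma2.sag.kelp.predictionfunction.classifier.ClassificationOutput;
import it.uniroma2.sag.kelp.utils.evaluation.BinaryClassificationEvaluator;

// Evaluate the learned classifier on the test set
BinaryClassificationEvaluator evaluator = new BinaryClassificationEvaluator(positiveClass);
for (Example e : testSet.getExamples()) {
	// Classify the current test example
	ClassificationOutput prediction = classifier.predict(e);
	// Accumulate the prediction into the evaluator
	evaluator.addCount(e, prediction);
}
System.out.println("Accuracy: " + evaluator.getAccuracy());
System.out.println("F1: " + evaluator.getF1());
```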

Kernel-based Learning

Using kernel functions within KeLP is very simple. It is sufficient to declare a kernel function, set which representation it will operate on, and tell the algorithm that it must use a kernel function to compute similarity scores.

In the previous example, if we want to use a Polynomial kernel on top of a linear kernel, it is sufficient to do the following:
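A sketch of the kernel composition is shown below; the degree value 2 is an illustrative choice, and the class and package names reflect the KeLP codebase but should be verified against the version you depend on:

```java
import it.uniroma2.sag.kelp.kernel.Kernel;
import it.uniroma2.sag.kelp.kernel.standard.PolynomialKernel;
import it.uniroma2.sag.kelp.kernel.vector.LinearKernel;
import it.uniroma2.sag.kelp.learningalgorithm.classification.passiveaggressive.KernelizedPassiveAggressiveClassification;

// Linear kernel operating on the vector representation "0"
Kernel linearKernel = new LinearKernel("0");
// Polynomial kernel (degree 2) built on top of the linear kernel
Kernel polynomialKernel = new PolynomialKernel(2f, linearKernel);

// Kernelized Passive Aggressive algorithm using the composed kernel
KernelizedPassiveAggressiveClassification kernelPA = new KernelizedPassiveAggressiveClassification();
kernelPA.setKernel(polynomialKernel);
kernelPA.setLabel(positiveClass);
```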

For a complete example with kernels, you can download the HelloKernelLearning.java file.