Support Vector Machine
Support Vector Machines (SVMs) are a set of related supervised learning methods, applicable to both classification and regression.
When used for classification, the SVM algorithm constructs a hyperplane that separates the data into two classes with the maximum margin. Given training examples labeled either "yes" or "no", the maximum-margin hyperplane splits the "yes" and "no" examples such that the distance from the hyperplane to the closest examples (the margin) is maximized.
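Once a separating hyperplane w·x + b = 0 has been found, classification reduces to checking which side of it a point lies on. The following is a minimal sketch of that decision step; the weight vector and bias below are illustrative values chosen by hand, not parameters learned by any training algorithm.

```java
// Sketch: classifying a 2-D point with a separating hyperplane w·x + b = 0.
// The weights and bias are assumed values for illustration, not learned ones.
public class HyperplaneDemo {
    // Decision function: the sign of w·x + b says which side of the hyperplane x lies on.
    static String classify(double[] w, double b, double[] x) {
        double score = b;
        for (int i = 0; i < w.length; i++) {
            score += w[i] * x[i];
        }
        return score >= 0 ? "yes" : "no";
    }

    public static void main(String[] args) {
        double[] w = {1.0, -1.0}; // hypothetical weight vector
        double b = 0.0;           // hypothetical bias
        // Points on opposite sides of the line x1 = x2 get opposite labels.
        System.out.println(classify(w, b, new double[]{2.0, 1.0})); // prints "yes"
        System.out.println(classify(w, b, new double[]{1.0, 2.0})); // prints "no"
    }
}
```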
In the Java applet below, the Kernel-Adatron algorithm is used with a polynomial kernel.
The applet was written by Tobias Denninger.
The theory is similar to that of the Optimal Perceptron algorithm, except that a kernel function replaces the vector dot product.
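To make the kernel substitution concrete, here is a sketch of the Kernel-Adatron idea (the details are assumptions for illustration, not the applet's actual code): a perceptron-style update is run on per-example multipliers alpha_i, and every dot product is replaced by a polynomial kernel K(x, z) = (x·z + 1)^d.

```java
// Sketch of Kernel-Adatron with a polynomial kernel (assumed details, not the
// applet's exact implementation).
public class KernelAdatronDemo {
    // Polynomial kernel: K(x, z) = (x·z + 1)^degree replaces the plain dot product.
    static double polyKernel(double[] x, double[] z, int degree) {
        double dot = 0;
        for (int i = 0; i < x.length; i++) dot += x[i] * z[i];
        return Math.pow(dot + 1.0, degree);
    }

    // Kernelized decision function: f(x) = sum_i alpha_i * y_i * K(x_i, x).
    static double decide(double[][] X, int[] y, double[] alpha, int degree, double[] x) {
        double f = 0;
        for (int i = 0; i < X.length; i++) {
            f += alpha[i] * y[i] * polyKernel(X[i], x, degree);
        }
        return f;
    }

    // Training loop: alpha_i += eta * (1 - y_i * f(x_i)), clipped at zero.
    static double[] train(double[][] X, int[] y, int degree, double eta, int epochs) {
        double[] alpha = new double[X.length];
        for (int epoch = 0; epoch < epochs; epoch++) {
            for (int i = 0; i < X.length; i++) {
                double margin = y[i] * decide(X, y, alpha, degree, X[i]);
                alpha[i] = Math.max(0.0, alpha[i] + eta * (1.0 - margin));
            }
        }
        return alpha;
    }

    public static void main(String[] args) {
        // XOR-style data: separable with a degree-2 kernel, but not linearly.
        double[][] X = {{-1, -1}, {1, 1}, {-1, 1}, {1, -1}};
        int[] y = {1, 1, -1, -1};
        double[] alpha = train(X, y, 2, 0.1, 500);
        // After training, every training point lands on the correct side.
        for (int i = 0; i < X.length; i++) {
            System.out.println(Math.signum(decide(X, y, alpha, 2, X[i])) == y[i]);
        }
    }
}
```

Note how the update never touches the input coordinates directly: all the geometry happens through kernel evaluations, which is exactly the substitution described above.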
In this tutorial, you will classify 2-dimensional data points into two categories using an SVM.
Click here to see the instructions. You may find it helpful to open a separate browser window for the instructions, so you can view them at the same time as the applet window.