Plot the decision function of a weighted dataset, where the size of each point is proportional to its weight. The sample weighting rescales the C parameter, which means the classifier puts more emphasis on getting the heavily weighted points right.
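As a minimal sketch of per-sample weighting (assuming scikit-learn; the cluster centers and the weight value of 10 are illustrative choices, not from the original example):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# Two Gaussian blobs, one per class.
X = np.r_[rng.randn(10, 2) + [1, 1], rng.randn(10, 2) - [1, 1]]
y = np.array([1] * 10 + [-1] * 10)

# Up-weight the first five positive points; internally this rescales
# C for those samples, so misclassifying them costs more.
sample_weight = np.ones(len(X))
sample_weight[:5] = 10.0

clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y, sample_weight=sample_weight)
```

The decision boundary shifts to accommodate the up-weighted points, compared with a fit using uniform weights.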
Simple usage of Support Vector Machines to classify a sample. It will plot the decision surface and the support vectors.
Find the optimal separating hyperplane using an SVC for classes that are unbalanced. We first find the separating plane with a plain SVC, then plot (dashed) the separating hyperplane with automatic correction for the unbalanced classes.
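A minimal sketch of the two fits being compared (assuming scikit-learn; the dataset shape and the `class_weight={1: 10}` value are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# An unbalanced dataset: 100 points in one class, 10 in the other.
X = np.r_[rng.randn(100, 2) - [2, 2], rng.randn(10, 2) + [2, 2]]
y = np.array([0] * 100 + [1] * 10)

# Plain SVC: the hyperplane is pulled toward the minority class.
plain = SVC(kernel="linear", C=1.0).fit(X, y)

# class_weight multiplies C for the minority class, pushing the
# hyperplane back toward the majority class.
weighted = SVC(kernel="linear", C=1.0, class_weight={1: 10}).fit(X, y)
```

`class_weight="balanced"` is an alternative that sets the weights automatically from the class frequencies.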
An example using a one-class SVM for novelty detection.
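A minimal sketch of one-class novelty detection (assuming scikit-learn; the `nu` and `gamma` values and the synthetic inlier cloud are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(42)
# Train only on "normal" data: a tight cloud around the origin.
X_train = 0.3 * rng.randn(100, 2)

clf = OneClassSVM(nu=0.1, kernel="rbf", gamma=0.1)
clf.fit(X_train)

# predict returns +1 for points inside the learned frontier
# and -1 for novelties outside it.
preds = clf.predict([[0.0, 0.0], [4.0, 4.0]])
```

Unlike the two-class examples, no labels are used at training time; the frontier is learned from the inliers alone.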
Plot the maximum margin separating hyperplane within a two-class separable dataset using a Support Vector Machine classifier with linear kernel.
This example shows how to perform univariate feature selection before running an SVC (support vector classifier) to improve the classification scores.
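A minimal sketch of chaining univariate selection with an SVC (assuming scikit-learn; the use of the iris data, `k=2`, and the added `StandardScaler` step are illustrative assumptions):

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Univariate selection (ANOVA F-test) keeps the k best features
# before the data reaches the SVC.
clf = Pipeline([
    ("anova", SelectKBest(f_classif, k=2)),
    ("scaler", StandardScaler()),
    ("svc", SVC(kernel="linear")),
])
clf.fit(X, y)
score = clf.score(X, y)
```

Wrapping the steps in a `Pipeline` keeps the selection inside any cross-validation loop, avoiding leakage from the test folds.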
Comparison of different linear SVM classifiers on a 2D projection of the iris dataset. We only consider the first 2 features of this dataset: sepal length and sepal width.
Three different types of SVM kernels are displayed below. The polynomial and RBF kernels are especially useful when the data points are not linearly separable.
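A minimal sketch comparing the three kernels on data where a straight line cannot separate the classes (assuming scikit-learn; the `make_circles` dataset is an illustrative stand-in for the example's data):

```python
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Non-linearly separable data: one class forms a ring around the other.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

scores = {}
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X, y)
    scores[kernel] = clf.score(X, y)
```

On this data the linear kernel performs near chance level, while the RBF kernel separates the ring cleanly.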
Perform binary classification using a non-linear SVC with an RBF kernel. The target to predict is the XOR of the inputs. The color map illustrates the decision function learned by the SVC.
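A minimal sketch of fitting the XOR target (assuming scikit-learn; the sample count and the `gamma` and `C` values are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(300, 2)
# XOR target: True when the two coordinates have opposite signs.
y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0)

# A linear kernel cannot represent XOR; the RBF kernel can.
clf = SVC(kernel="rbf", gamma=1.0, C=1.0)
clf.fit(X, y)
train_score = clf.score(X, y)
```

The fitted decision function is positive in two opposite quadrants and negative in the other two, matching the XOR pattern.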
The plots below illustrate the effect the parameter C has on the separation line. A large value of C basically tells our model that we do not have much faith in the data's distribution, and to consider only points close to the line of separation.
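A minimal sketch of the comparison (assuming scikit-learn; the two C values and the synthetic blobs are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = np.r_[rng.randn(20, 2) + [2, 2], rng.randn(20, 2) - [2, 2]]
y = [1] * 20 + [-1] * 20

# Small C tolerates margin violations: a wider margin and more
# support vectors. Large C tries to classify every point correctly,
# so only points near the boundary remain support vectors.
small_c = SVC(kernel="linear", C=0.05).fit(X, y)
large_c = SVC(kernel="linear", C=1000.0).fit(X, y)
```

Comparing `support_vectors_` for the two fits shows the margin shrinking as C grows.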