Plot decision function of a weighted dataset, where the size of points is proportional to its weight. The sample weighting rescales the C parameter, which means that the classifier puts more emphasis on getting these points right.
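A minimal sketch of per-sample weighting with an SVC, using synthetic data and an arbitrary up-weighted subset rather than the example's own dataset:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(100, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Up-weight the first 10 points; sample weights effectively rescale C per sample.
sample_weight = np.ones(len(X))
sample_weight[:10] *= 10

clf = SVC(kernel="rbf", C=1.0)
clf.fit(X, y, sample_weight=sample_weight)
print(clf.decision_function(X[:5]))
```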
Simple usage of Support Vector Machines to classify a sample. It will plot the decision surface and the support vectors.
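A minimal sketch of the basic workflow, on a tiny hand-made dataset (the example itself plots the decision surface; here we only fit and inspect the support vectors):

```python
import numpy as np
from sklearn.svm import SVC

X = np.array([[0, 0], [1, 1], [1, 0], [0, 1], [2, 2], [2, 3], [3, 2], [3, 3]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0).fit(X, y)
print("support vectors:\n", clf.support_vectors_)
print("prediction for [1.5, 1.5]:", clf.predict([[1.5, 1.5]]))
```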
An example using a one-class SVM for novelty detection.
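A minimal sketch of one-class SVM novelty detection on synthetic data; the nu and gamma values here are illustrative, not taken from the example:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(42)
X_train = 0.3 * rng.randn(200, 2)                           # "regular" training observations
X_new = np.r_[0.3 * rng.randn(10, 2),                       # points similar to the training set
              rng.uniform(low=-4, high=4, size=(10, 2))]    # scattered novelties

clf = OneClassSVM(kernel="rbf", nu=0.1, gamma=0.1).fit(X_train)
print(clf.predict(X_new))  # +1 for inliers, -1 for novelties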
Find the optimal separating hyperplane using an SVC for classes that are unbalanced. We first find the separating plane with a plain SVC and then plot (dashed) the separating hyperplane with an automatic correction for the unbalanced classes.
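A minimal sketch of the comparison, assuming a synthetic imbalanced dataset: a plain linear SVC versus one that reweights C per class via class_weight="balanced":

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = np.r_[1.5 * rng.randn(1000, 2), 0.5 * rng.randn(100, 2) + [2, 2]]
y = np.r_[np.zeros(1000), np.ones(100)]  # heavily imbalanced classes

plain = SVC(kernel="linear", C=1.0).fit(X, y)
weighted = SVC(kernel="linear", C=1.0, class_weight="balanced").fit(X, y)
print("plain coef:   ", plain.coef_)
print("weighted coef:", weighted.coef_)
```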
This example shows how to perform univariate feature selection before running a SVC (support vector classifier) to improve the classification scores.
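A minimal sketch of the idea, chaining ANOVA-based feature selection with an SVC in a Pipeline; the percentile and dataset are illustrative choices, not the example's own:

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectPercentile, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

clf = Pipeline([
    ("anova", SelectPercentile(f_classif, percentile=50)),  # keep the top 50% of features
    ("scaler", StandardScaler()),
    ("svc", SVC(gamma="auto")),
])
print(cross_val_score(clf, X, y, cv=5).mean())
```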
Three different types of SVM-Kernels are displayed below. The polynomial and RBF kernels are especially useful when the data points are not linearly separable.
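A minimal sketch fitting the same non-linearly-separable synthetic data with linear, polynomial, and RBF kernels (gamma is an illustrative value):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1).astype(int)  # a ring: not linearly separable

for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel, gamma=2).fit(X, y)
    print(kernel, "training accuracy:", clf.score(X, y))
```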
Comparison of different linear SVM classifiers on a 2D projection of the iris dataset. We only consider the first two features of this dataset: sepal length and sepal width.
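A minimal sketch of such a comparison on the first two iris features, assuming two common linear variants (SVC with a linear kernel and LinearSVC) and default-ish settings:

```python
from sklearn.datasets import load_iris
from sklearn.svm import SVC, LinearSVC

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # sepal length and sepal width only

models = {
    "SVC with linear kernel": SVC(kernel="linear", C=1.0),
    "LinearSVC": LinearSVC(C=1.0, max_iter=10000),
}
for name, clf in models.items():
    print(name, clf.fit(X, y).score(X, y))
```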
Plot the maximum margin separating hyperplane within a two-class separable dataset using a Support Vector Machine classifier with linear kernel.
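A minimal sketch of fitting the maximum-margin hyperplane on a separable two-class dataset; the large C approximates a hard margin, and the blob data stands in for the example's own:

```python
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(n_samples=40, centers=2, random_state=6)

clf = SVC(kernel="linear", C=1000).fit(X, y)  # large C ~ hard margin
w, b = clf.coef_[0], clf.intercept_[0]
print("hyperplane: %.2f * x0 + %.2f * x1 + %.2f = 0" % (w[0], w[1], b))
print("support vectors:\n", clf.support_vectors_)
```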
Perform binary classification using non-linear SVC with RBF kernel. The target to predict is an XOR of the inputs. The color map illustrates the decision function learned by the SVC.
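A minimal sketch of the setup, with the XOR target built from synthetic inputs (the example itself visualizes the decision function; here we just check the learned pattern):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
X = rng.randn(300, 2)
y = np.logical_xor(X[:, 0] > 0, X[:, 1] > 0)  # XOR of the two input signs

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([[1, 1], [-1, 1], [1, -1], [-1, -1]]))  # expected: False, True, True, False
```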
The following example illustrates the effect of scaling the regularization parameter when using Support Vector Machines for classification.
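A minimal sketch of the idea on a synthetic dataset: the same nominal C behaves differently once it is rescaled by the number of training samples (the dataset and C grid here are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
n_samples = X.shape[0]

for C in (0.01, 0.1, 1.0, 10.0):
    raw = LinearSVC(C=C, max_iter=10000)
    scaled = LinearSVC(C=C / n_samples, max_iter=10000)  # C rescaled by dataset size
    print("C=%g  raw: %.3f  scaled: %.3f"
          % (C, cross_val_score(raw, X, y, cv=3).mean(),
             cross_val_score(scaled, X, y, cv=3).mean()))
```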