A simple graphical frontend for Libsvm, mainly intended for didactic purposes. You can create data points by pointing and clicking and visualize the decision region induced by different kernels and parameter settings.
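A minimal, non-interactive sketch of the underlying idea (using a synthetic blob dataset and a few kernel choices picked for illustration, not the GUI code itself):

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import make_blobs
    from sklearn.svm import SVC

    # Small 2-D dataset standing in for the points one would click in the GUI.
    X, y = make_blobs(n_samples=60, centers=2, random_state=0)
    xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
                         np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))

    for kernel in ("linear", "rbf", "poly"):
        clf = SVC(kernel=kernel, gamma="scale").fit(X, y)
        Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
        plt.figure()
        plt.contourf(xx, yy, Z, alpha=0.3)          # decision region
        plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")
        plt.title("SVC decision region, kernel=%s" % kernel)
    plt.show()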
This example compares 2 dimensionality reduction strategies: univariate feature selection with ANOVA, and feature agglomeration with Ward hierarchical clustering.
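A hedged sketch of such a comparison, assuming SelectKBest with f_regression for the ANOVA side and FeatureAgglomeration for the clustering side, on synthetic regression data (all choices here are illustrative):

    from sklearn.datasets import make_regression
    from sklearn.feature_selection import SelectKBest, f_regression
    from sklearn.cluster import FeatureAgglomeration
    from sklearn.linear_model import BayesianRidge
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import cross_val_score

    X, y = make_regression(n_samples=200, n_features=100, n_informative=10,
                           noise=10, random_state=0)

    # Two pipelines sharing the same downstream estimator.
    anova_pipe = make_pipeline(SelectKBest(f_regression, k=10), BayesianRidge())
    agglo_pipe = make_pipeline(FeatureAgglomeration(n_clusters=10), BayesianRidge())

    for name, pipe in [("anova selection", anova_pipe), ("agglomeration", agglo_pipe)]:
        score = cross_val_score(pipe, X, y, cv=5).mean()
        print(name, "mean CV R^2:", round(score, 3))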
Simple usage of a Pipeline that successively runs univariate feature selection with ANOVA and then trains a C-SVM on the selected features.
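A minimal sketch of such a pipeline on synthetic classification data (the dataset and the number of selected features are illustrative choices):

    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.svm import SVC
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=200, n_features=20, n_informative=3,
                               random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # ANOVA-based univariate selection followed by a C-support-vector classifier.
    anova_svm = make_pipeline(SelectKBest(f_classif, k=3), SVC(C=1.0))
    anova_svm.fit(X_train, y_train)
    print("test accuracy:", anova_svm.score(X_test, y_test))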
Sample usage of Nearest Neighbors classification. It will plot the decision boundaries for each class.
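A short sketch of that kind of plot, using two iris features and an arbitrary choice of k:

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_iris
    from sklearn.neighbors import KNeighborsClassifier

    iris = load_iris()
    X = iris.data[:, :2]    # keep two features so the boundary can be plotted
    y = iris.target

    clf = KNeighborsClassifier(n_neighbors=15).fit(X, y)

    xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
                         np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

    plt.contourf(xx, yy, Z, alpha=0.3)              # one region per class
    plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors="k")
    plt.title("3-class KNN decision boundaries (k=15)")
    plt.show()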
Using orthogonal matching pursuit to recover a sparse signal from a noisy measurement encoded with a dictionary.
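One way such a recovery might be sketched, with a random Gaussian dictionary and a hand-picked sparsity level (both illustrative assumptions):

    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.RandomState(0)
    n_features, n_components, n_nonzero = 100, 512, 17

    D = rng.randn(n_features, n_components)      # dictionary, columns are atoms
    D /= np.linalg.norm(D, axis=0)               # normalize atoms
    w = np.zeros(n_components)
    idx = rng.choice(n_components, n_nonzero, replace=False)
    w[idx] = rng.randn(n_nonzero)                # sparse code
    y = D @ w + 0.05 * rng.randn(n_features)     # noisy measurement

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
    omp.fit(D, y)
    print("recovered support size:", int(np.sum(omp.coef_ != 0)))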
Illustration of how the performance of an estimator on unseen data (test data) is not the same as the performance on training data. As the regularization increases, the performance on the training set decreases, while the performance on the test set is optimal within a range of values of the regularization parameter.
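A compact sketch of the idea, here using an Elastic-Net model and a small grid of regularization strengths chosen for illustration:

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNet
    from sklearn.model_selection import train_test_split

    X, y = make_regression(n_samples=200, n_features=50, noise=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Compare train and test scores as the regularization strength alpha grows.
    for alpha in np.logspace(-3, 1, 5):
        model = ElasticNet(alpha=alpha, max_iter=10000).fit(X_train, y_train)
        print("alpha=%.3g  train R^2=%.3f  test R^2=%.3f"
              % (alpha, model.score(X_train, y_train), model.score(X_test, y_test)))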
In this example we see how to robustly fit a linear model to faulty data using the RANSAC algorithm.
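A minimal sketch with synthetic data and deliberately corrupted points (the corruption scheme is an illustrative choice):

    import numpy as np
    from sklearn.linear_model import LinearRegression, RANSACRegressor

    rng = np.random.RandomState(0)
    X = np.linspace(0, 10, 100)[:, None]
    y = 3 * X.ravel() + rng.randn(100)
    y[::10] += 30 * rng.rand(10)                 # corrupt every 10th point

    ols = LinearRegression().fit(X, y)
    ransac = RANSACRegressor(random_state=0).fit(X, y)   # linear base model by default

    # The RANSAC slope should stay close to 3 despite the outliers.
    print("OLS slope:   ", ols.coef_[0])
    print("RANSAC slope:", ransac.estimator_.coef_[0])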
The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
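A sketch of that construction, using the digits dataset and an illustrative grid of component counts:

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline
    from sklearn.model_selection import GridSearchCV

    X, y = load_digits(return_X_y=True)

    # Chain PCA and logistic regression, then search over the PCA dimensionality.
    pipe = Pipeline([("pca", PCA()),
                     ("logistic", LogisticRegression(max_iter=2000))])
    param_grid = {"pca__n_components": [5, 15, 30, 45, 64]}

    search = GridSearchCV(pipe, param_grid, cv=5)
    search.fit(X, y)
    print("best n_components:", search.best_params_["pca__n_components"])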
This example demonstrates the behavior of Gaussian mixture models fit on data that was not sampled from a mixture of Gaussian random variables. The dataset consists of points loosely spaced along a noisy sine curve.
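A small sketch along those lines, sampling points around a sine curve and fitting a GaussianMixture with an arbitrary number of components:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.RandomState(0)
    t = 4 * np.pi * rng.rand(100)
    X = np.column_stack([t, np.sin(t) + 0.2 * rng.randn(100)])   # noisy sine curve

    # The data are not a Gaussian mixture; the components end up tiling the curve.
    gmm = GaussianMixture(n_components=5, covariance_type="full",
                          random_state=0).fit(X)
    print("component means:\n", gmm.means_)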
This example is based on Figure 10.2 from Hastie et al. 2009 [1] and illustrates the difference in performance between the discrete SAMME [2] boosting algorithm and the real SAMME.R boosting algorithm.
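A sketch of the comparison on the make_hastie_10_2 dataset, assuming a scikit-learn version that still accepts algorithm="SAMME.R" (that option has since been deprecated):

    from sklearn.datasets import make_hastie_10_2
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_hastie_10_2(n_samples=4000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Boost decision stumps with the discrete and the real variant of AdaBoost.
    stump = DecisionTreeClassifier(max_depth=1)
    for algorithm in ("SAMME", "SAMME.R"):   # SAMME.R requires an older sklearn release
        clf = AdaBoostClassifier(stump, n_estimators=200, algorithm=algorithm)
        clf.fit(X_train, y_train)
        print(algorithm, "test accuracy:", round(clf.score(X_test, y_test), 3))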