This example shows the use of forests of trees to evaluate the importance of features on an artificial classification task. The red bars are the feature importances of the forest, along with their inter-trees variability.
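A minimal sketch of the technique, not the gallery script itself: fit a forest on a synthetic task (dataset sizes here are arbitrary choices) and read the impurity-based importances.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Artificial task: 10 features, only 3 of them informative
X, y = make_classification(n_samples=500, n_features=10,
                           n_informative=3, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

importances = forest.feature_importances_  # one score per feature, sums to 1
ranking = importances.argsort()[::-1]      # indices, most important first
```

The per-tree importances can also be inspected via `forest.estimators_` to measure inter-trees variability.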
Out-of-bag (OOB) estimates can be a useful heuristic to estimate the "optimal" number of boosting iterations. OOB estimates are almost identical to cross-validation estimates, but they can be computed on-the-fly without the need for repeated model fitting.
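A minimal sketch, assuming a `GradientBoostingClassifier` with `subsample < 1.0` (which is what makes OOB samples available); the cumulative sum of `oob_improvement_` peaks near the best number of iterations.

```python
import numpy as np

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=500, random_state=0)

# subsample < 1.0 means each stage is fit on a fraction of the data,
# leaving out-of-bag samples to score the improvement of that stage
clf = GradientBoostingClassifier(n_estimators=100, subsample=0.5,
                                 random_state=0)
clf.fit(X, y)

cum_oob = np.cumsum(clf.oob_improvement_)
best_n = int(np.argmax(cum_oob)) + 1  # heuristic "optimal" iteration count
```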
Using orthogonal matching pursuit for recovering a sparse signal from a noisy measurement encoded with a dictionary.
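A minimal sketch of the recovery setup (dictionary size, sparsity level, and noise scale are illustrative choices): build a random unit-norm dictionary, encode a sparse coefficient vector, add noise, and recover the support with `OrthogonalMatchingPursuit`.

```python
import numpy as np

from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.RandomState(0)
n_features, n_atoms, n_nonzero = 50, 100, 5

D = rng.randn(n_features, n_atoms)   # dictionary of random atoms
D /= np.linalg.norm(D, axis=0)       # normalize atoms to unit norm

w_true = np.zeros(n_atoms)           # sparse ground-truth coefficients
support = rng.choice(n_atoms, n_nonzero, replace=False)
w_true[support] = rng.randn(n_nonzero)

y = D @ w_true + 0.01 * rng.randn(n_features)  # noisy measurement

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
omp.fit(D, y)
recovered = np.flatnonzero(omp.coef_)  # indices of recovered atoms
```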
Simple usage of Support Vector Machines to classify a sample. It will plot the decision surface and the support vectors.
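A minimal sketch of the fit behind such a plot (the plotting itself is omitted): train an `SVC` on two features so the decision surface is two-dimensional, then read the support vectors off the fitted model.

```python
from sklearn import datasets, svm

iris = datasets.load_iris()
X, y = iris.data[:, :2], iris.target  # two features -> 2-D decision surface

clf = svm.SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# Support vectors are the training points lying on or inside the margin;
# only these determine the decision surface.
sv = clf.support_vectors_
```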
This example plots the ellipsoids obtained from a toy dataset (mixture of three Gaussians) fitted by the BayesianGaussianMixture class.
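A minimal sketch of the fitting step (blob locations and component count are illustrative; the ellipsoid plotting is omitted): fit a `BayesianGaussianMixture` with more components than true clusters and observe that the prior drives the weights of unused components toward zero.

```python
import numpy as np

from sklearn.mixture import BayesianGaussianMixture

rng = np.random.RandomState(0)
# Toy dataset: three well-separated Gaussian blobs in 2-D
X = np.vstack([rng.randn(100, 2) + mu
               for mu in ([0, 0], [5, 5], [-5, 5])])

# Deliberately over-specify the number of components; the Bayesian
# prior shrinks the weights of superfluous ones toward zero.
bgm = BayesianGaussianMixture(n_components=5, max_iter=200, random_state=0)
bgm.fit(X)

weights = bgm.weights_  # mixture weights, sum to 1
```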
A simple graphical frontend for libsvm, mainly intended for didactic purposes. You can create data points by point-and-click and visualize the decision region induced by different kernels and parameter settings.
The RandomForestClassifier is trained using bootstrap aggregation, where each new tree is fit from a bootstrap sample of the training observations.
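A minimal sketch of bootstrap aggregation in practice: with `bootstrap=True` each tree sees a resampled copy of the training set, and `oob_score=True` additionally scores each observation using only the trees that did not see it.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, random_state=0)

clf = RandomForestClassifier(n_estimators=100,
                             bootstrap=True,   # each tree fit on a bootstrap sample
                             oob_score=True,   # score on the left-out observations
                             random_state=0)
clf.fit(X, y)

oob = clf.oob_score_  # accuracy estimated from out-of-bag samples
```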
An example using a one-class SVM for novelty detection.
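A minimal sketch of the novelty-detection setup (data generation and the `nu`/`gamma` values are illustrative choices): fit the one-class SVM on regular observations only, then ask it to label new points, where -1 flags a novelty and +1 an inlier.

```python
import numpy as np

from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = 0.3 * rng.randn(100, 2)                       # regular observations
X_outliers = rng.uniform(low=-4, high=4, size=(20, 2))  # abnormal observations

clf = OneClassSVM(nu=0.1, kernel="rbf", gamma=0.1)
clf.fit(X_train)  # trained on regular data only

pred_out = clf.predict(X_outliers)  # -1 = novelty, +1 = inlier
```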
Simple usage of Pipeline that runs successively a univariate feature selection with ANOVA and then a C-SVM on the selected features.
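A minimal sketch of such a pipeline, using `SelectKBest` with the ANOVA F-test (`f_classif`) followed by a linear `SVC`; the `k=3` choice and the synthetic dataset are illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=20,
                           n_informative=3, random_state=0)

anova_svm = Pipeline([
    ("anova", SelectKBest(f_classif, k=3)),  # keep the 3 best features by F-test
    ("svc", SVC(kernel="linear", C=1.0)),    # classify on the selected features
])
anova_svm.fit(X, y)

score = anova_svm.score(X, y)  # training accuracy of the whole pipeline
```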
This example compares 2 dimensionality reduction strategies: univariate feature selection with ANOVA, and feature agglomeration with Ward hierarchical clustering.
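A minimal sketch of the two strategies side by side (target dimensionality of 5 is an arbitrary choice): selection keeps a subset of the original features, while agglomeration merges similar features into cluster averages.

```python
from sklearn.cluster import FeatureAgglomeration
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=200, n_features=20, random_state=0)

# Strategy 1: keep the 5 features with the highest ANOVA F-score
X_sel = SelectKBest(f_classif, k=5).fit_transform(X, y)

# Strategy 2: merge the 20 features into 5 clusters (Ward linkage)
# and represent each cluster by the mean of its features
X_agg = FeatureAgglomeration(n_clusters=5).fit_transform(X)
```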