This example is based on Figure 10.2 from Hastie et al. (2009) [1] and illustrates the difference in performance between the discrete SAMME [2] boosting algorithm and the real SAMME.R boosting algorithm.
The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
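A sketch of such a pipeline, assuming the digits dataset and an illustrative parameter grid (the grid values and `max_iter` are assumptions, not the original example's exact settings):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("pca", PCA()),                                  # unsupervised reduction
    ("logistic", LogisticRegression(max_iter=2000)),  # supervised prediction
])

# Jointly search the PCA dimensionality and the regularization strength;
# step parameters are addressed as "<step_name>__<param>"
param_grid = {
    "pca__n_components": [16, 32, 48],
    "logistic__C": [0.1, 1.0],
}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```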
This example shows the use of forests of trees to evaluate the importance of features on an artificial classification task. The red bars are the feature importances of the forest, along with their inter-tree variability.
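The idea can be sketched as follows; the use of `ExtraTreesClassifier` and the dataset shape are illustrative assumptions. With `shuffle=False`, the three informative features occupy the first three columns:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

# Artificial task: 3 informative features among 10
X, y = make_classification(
    n_samples=1000, n_features=10, n_informative=3,
    n_redundant=0, n_repeated=0, shuffle=False, random_state=0,
)

forest = ExtraTreesClassifier(n_estimators=250, random_state=0)
forest.fit(X, y)

importances = forest.feature_importances_
# Inter-tree variability: the spread of per-tree importances,
# drawn as red error bars in the plot
std = np.std([t.feature_importances_ for t in forest.estimators_], axis=0)

ranking = np.argsort(importances)[::-1]
print(ranking)
```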
A simple graphical frontend for Libsvm, mainly intended for didactic purposes. You can create data points by pointing and clicking, and visualize the decision region induced by different kernels and parameter settings.
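Without the GUI, the underlying idea can be sketched by fitting an SVM with different kernels to a few hand-placed points and evaluating the induced region on a grid (the points and grid here are illustrative assumptions):

```python
import numpy as np
from sklearn.svm import SVC

# Toy "point and click" data: two small clusters entered by hand
X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])

# Each kernel induces a different decision region over the plane
for kernel in ["linear", "rbf", "poly"]:
    clf = SVC(kernel=kernel, gamma=2.0).fit(X, y)
    xx, yy = np.meshgrid(np.linspace(-0.5, 1.5, 5),
                         np.linspace(-0.5, 1.5, 5))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])  # region labels on the grid
```

In the GUI these `Z` values would be drawn as a colored background behind the clicked points.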
The RandomForestClassifier is trained using bootstrap aggregation, where each new tree is fit from a bootstrap sample of the training observations.
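A sketch of bootstrap aggregation with an out-of-bag estimate: because each tree only sees a bootstrap sample, the held-out ("out-of-bag") observations give a free generalization estimate. The dataset is an illustrative assumption:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# bootstrap=True (the default) draws a bootstrap sample per tree;
# oob_score=True scores each observation with the trees that did not see it
forest = RandomForestClassifier(
    n_estimators=100, bootstrap=True, oob_score=True, random_state=0
)
forest.fit(X, y)
print(forest.oob_score_)
```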
An example using a one-class SVM for novelty detection.
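A minimal novelty-detection sketch: fit a one-class SVM on uncontaminated "inlier" data, then flag new points as inliers (+1) or novelties (-1). The data and the `nu`/`gamma` values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X_train = 0.3 * rng.randn(200, 2)                     # inliers near the origin
X_outliers = rng.uniform(low=-4, high=4, size=(20, 2))  # scattered novelties

clf = OneClassSVM(nu=0.1, kernel="rbf", gamma=0.1)
clf.fit(X_train)  # unsupervised: trained on inliers only

# predict() returns +1 for inliers and -1 for novelties
pred_train = clf.predict(X_train)
pred_out = clf.predict(X_outliers)
```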
Simple usage of Pipeline that successively runs a univariate feature selection with ANOVA and then a C-SVM on the selected features.
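A sketch of that pipeline, assuming a synthetic dataset and an illustrative `k`:

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = make_classification(
    n_samples=200, n_features=20, n_informative=4, random_state=0
)

# ANOVA F-test keeps the k best features, then a C-SVM classifies them
anova_svm = Pipeline([
    ("anova", SelectKBest(f_classif, k=4)),
    ("svc", SVC(C=1.0)),
])
anova_svm.fit(X, y)
print(anova_svm.score(X, y))
```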
RandomTreesEmbedding provides a way to map data to a very high-dimensional, sparse representation, which might be beneficial for classification.
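A sketch of the embedding: each sample is encoded by the leaves it falls into across an ensemble of random trees, yielding a sparse one-hot code on which even a simple classifier can separate non-linear classes. The dataset, tree counts, and choice of `BernoulliNB` are illustrative assumptions:

```python
from sklearn.datasets import make_circles
from sklearn.ensemble import RandomTreesEmbedding
from sklearn.naive_bayes import BernoulliNB

X, y = make_circles(factor=0.5, noise=0.05, random_state=0)

# Map each sample to indicator features for the leaves it lands in
hasher = RandomTreesEmbedding(n_estimators=10, max_depth=3, random_state=0)
X_transformed = hasher.fit_transform(X)  # sparse (n_samples, n_leaves) matrix

# A simple model on the sparse embedding handles the non-linear structure
nb = BernoulliNB()
nb.fit(X_transformed, y)
```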
This example compares two dimensionality reduction strategies: univariate feature selection with ANOVA, and feature agglomeration with Ward hierarchical clustering.
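A sketch of the comparison on a synthetic regression task; the dataset, the use of `BayesianRidge`, and the choice of `k`/`n_clusters` are illustrative assumptions:

```python
from sklearn.cluster import FeatureAgglomeration
from sklearn.datasets import make_regression
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import BayesianRidge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

X, y = make_regression(
    n_samples=200, n_features=50, n_informative=5, noise=10, random_state=0
)

# Strategy 1: keep the 10 best features by ANOVA F-test
anova = Pipeline([("sel", SelectKBest(f_regression, k=10)),
                  ("ridge", BayesianRidge())])
# Strategy 2: merge the 50 features into 10 Ward clusters
ward = Pipeline([("agglo", FeatureAgglomeration(n_clusters=10)),
                 ("ridge", BayesianRidge())])

scores = {}
for name, model in [("anova", anova), ("ward", ward)]:
    scores[name] = cross_val_score(model, X, y, cv=3).mean()
print(scores)
```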
Computes the Lasso path along the regularization parameter using the LARS algorithm on the diabetes dataset. Each color represents a different feature of the coefficient vector.
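The path computation (without the plot) can be sketched with `lars_path`:

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)

# Full Lasso path via LARS: coefs has one row per feature and one column
# per knot of the piecewise-linear path; alphas decreases along the path
alphas, _, coefs = lars_path(X, y, method="lasso")
print(coefs.shape)
```

Plotting each row of `coefs` against the alphas (one color per feature) reproduces the figure described above.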