A recursive feature elimination example with automatic tuning of the number of features selected with cross-validation.
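A minimal sketch of what such an example looks like, assuming scikit-learn is installed; the synthetic dataset and the logistic-regression estimator are illustrative stand-ins, not the example's actual choices:

```python
# Recursive feature elimination with cross-validated selection of the
# number of features (RFECV). Dataset and estimator are illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=20, n_informative=5,
                           random_state=0)
selector = RFECV(LogisticRegression(max_iter=1000), step=1, cv=5)
selector.fit(X, y)
# n_features_ is the feature count that maximized the cross-validated score
print(selector.n_features_)
```

`RFECV` repeatedly drops the least important features and keeps the subset size that scores best under cross-validation.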
Transform a signal as a sparse combination of Ricker wavelets. This example visually compares different sparse coding methods using the
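A sketch of the idea, assuming scikit-learn and NumPy; the Ricker-wavelet dictionary below is hand-built for illustration and the toy sine signal stands in for the example's signal:

```python
# Sparse-code a signal against a dictionary of Ricker ("Mexican hat")
# wavelets using orthogonal matching pursuit (OMP).
import numpy as np
from sklearn.decomposition import SparseCoder

def ricker(resolution, center, width):
    # Ricker wavelet sampled at integer positions 0..resolution-1
    x = np.arange(resolution)
    a = (x - center) / width
    return (2 / (np.sqrt(3 * width) * np.pi ** 0.25)) * (1 - a ** 2) * np.exp(-a ** 2 / 2)

resolution = 64
# One atom per (width, center) pair, atoms normalized to unit norm
D = np.array([ricker(resolution, c, w)
              for w in (3, 5, 7) for c in range(resolution)])
D /= np.linalg.norm(D, axis=1, keepdims=True)

signal = np.sin(np.linspace(0, 4 * np.pi, resolution))  # toy signal
coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                    transform_n_nonzero_coefs=5)
code = coder.transform(signal.reshape(1, -1))
# OMP limits the code to at most 5 nonzero coefficients
print(np.count_nonzero(code))
```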
Use the SelectFromModel meta-transformer along with Lasso to select the best couple of features from the Boston dataset.
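A minimal sketch of the pattern, assuming scikit-learn. Note that the Boston housing dataset was removed from scikit-learn (1.2+), so the diabetes dataset is used here as a stand-in; the `alpha` value is illustrative:

```python
# Select at most two features using Lasso coefficients as importances.
from sklearn.datasets import load_diabetes
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso

X, y = load_diabetes(return_X_y=True)  # stand-in for the removed Boston data
selector = SelectFromModel(Lasso(alpha=0.1), max_features=2)
selector.fit(X, y)
print(selector.get_support())  # boolean mask of the retained features
```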
The plots first show what the K-means algorithm yields with three clusters, and then illustrate the effect of a bad initialization on the clustering result:
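The comparison can be sketched as follows, assuming scikit-learn; the blob data and parameter values are illustrative:

```python
# Compare a well-initialized K-means run against a single random init.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=42)
good = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)       # k-means++, best of 10
bad = KMeans(n_clusters=3, init="random", n_init=1, random_state=0).fit(X)
# Lower inertia (within-cluster sum of squares) indicates a better fit
print(good.inertia_, bad.inertia_)
```

A single random initialization can converge to a poor local optimum; running several k-means++ initializations and keeping the best guards against that.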
The multi-task lasso allows fitting multiple regression problems jointly, enforcing the selected features to be the same across tasks. This example
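The shared-sparsity property can be sketched as follows, assuming scikit-learn; the synthetic two-task data and `alpha` are illustrative:

```python
# MultiTaskLasso zeroes out entire feature columns jointly across tasks.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.RandomState(0)
X = rng.randn(100, 30)
coef = np.zeros((2, 30))
coef[:, :5] = rng.randn(2, 5)           # the same 5 features active in both tasks
Y = X @ coef.T + 0.01 * rng.randn(100, 2)

mtl = MultiTaskLasso(alpha=0.1).fit(X, Y)
# The nonzero pattern of coef_ is identical for both tasks (rows)
print((mtl.coef_[0] != 0).sum(), (mtl.coef_[1] != 0).sum())
```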
This example demonstrates the Spectral Co-clustering algorithm on the twenty newsgroups dataset. The "comp.os.ms-windows.misc
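A minimal sketch of the algorithm, assuming scikit-learn; synthetic bicluster data replaces the twenty newsgroups corpus here to avoid the download:

```python
# Recover planted biclusters with SpectralCoclustering and score the result.
from sklearn.cluster import SpectralCoclustering
from sklearn.datasets import make_biclusters
from sklearn.metrics import consensus_score

data, rows, cols = make_biclusters(shape=(100, 100), n_clusters=3,
                                   noise=0.05, random_state=0)
model = SpectralCoclustering(n_clusters=3, random_state=0).fit(data)
# consensus_score is 1.0 for a perfect recovery of the planted biclusters
score = consensus_score(model.biclusters_, (rows, cols))
print(round(score, 2))
```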
This example applies different unsupervised matrix decomposition (dimensionality reduction) methods to the Olivetti faces dataset, from
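The common pattern can be sketched as follows, assuming scikit-learn; random nonnegative data stands in for the Olivetti faces (which require a download), and the component count is illustrative:

```python
# Fit several matrix decompositions and inspect the learned components,
# as the faces example does with its eigenface/NMF/ICA galleries.
import numpy as np
from sklearn.decomposition import NMF, PCA

rng = np.random.RandomState(0)
X = np.abs(rng.randn(50, 64))  # stand-in for 50 flattened face images

shapes = []
for est in (PCA(n_components=6),
            NMF(n_components=6, init="random", random_state=0, max_iter=500)):
    est.fit(X)
    shapes.append(est.components_.shape)  # (n_components, n_features)
    print(type(est).__name__, est.components_.shape)
```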
An example of estimating sources from noisy data.
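A minimal blind-source-separation sketch, assuming scikit-learn; the two toy sources and the mixing matrix are illustrative:

```python
# Unmix noisy observations of two independent sources with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                      # sinusoidal source
s2 = np.sign(np.sin(3 * t))             # square-wave source
S = np.c_[s1, s2] + 0.1 * rng.normal(size=(2000, 2))  # noisy sources

A = np.array([[1.0, 0.5], [0.5, 2.0]])  # mixing matrix
X = S @ A.T                             # observed mixed signals

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)            # estimated sources
print(S_est.shape)
```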
Plot the classification probability for different classifiers. We use a 3-class dataset and classify it with a support vector classifier, L1- and L2-penalized
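The probability estimates themselves can be sketched as follows, assuming scikit-learn; iris stands in for the 3-class dataset, and the classifier settings are illustrative:

```python
# Compare predict_proba across an SVC and L1/L2 penalized logistic regression.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # two features, so the probabilities could be plotted on a plane

classifiers = {
    "L1 logistic": LogisticRegression(penalty="l1", solver="liblinear"),
    "L2 logistic": LogisticRegression(penalty="l2"),
    "SVC": SVC(probability=True, random_state=0),  # probability=True enables predict_proba
}
probas = {}
for name, clf in classifiers.items():
    probas[name] = clf.fit(X, y).predict_proba(X)
    print(name, probas[name].shape)
```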
Reference: Dorin Comaniciu and Peter Meer, "Mean Shift: A robust approach toward feature space analysis". IEEE Transactions on Pattern Analysis
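A minimal sketch of the clustering described in that reference, assuming scikit-learn; the blob data and bandwidth quantile are illustrative:

```python
# MeanShift discovers the number of clusters from an estimated bandwidth.
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)
bandwidth = estimate_bandwidth(X, quantile=0.2)  # kernel bandwidth from the data
ms = MeanShift(bandwidth=bandwidth).fit(X)

n_clusters = len(np.unique(ms.labels_))
print(n_clusters)  # number of clusters found, not specified in advance
```

Unlike K-means, the number of clusters is an output: it falls out of the bandwidth choice rather than being fixed up front.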