The plots first display what the K-means algorithm yields using three clusters. They then show the effect of a bad initialization on the classification process.
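A minimal sketch of this comparison, assuming the Iris data and scikit-learn's KMeans; the seeding choices below are illustrative:

from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X = load_iris().data

# Default "k-means++" seeding with several restarts.
good = KMeans(n_clusters=3, init="k-means++", n_init=10, random_state=0).fit(X)
# A deliberately bad initialization: one random seeding, no restarts.
bad = KMeans(n_clusters=3, init="random", n_init=1, random_state=0).fit(X)

# The poor initialization typically converges to a higher (worse) inertia.
print(good.inertia_, bad.inertia_)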
Finds core samples of high density and expands clusters from them.
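A minimal sketch, assuming scikit-learn's DBSCAN on synthetic blobs; eps and min_samples are illustrative:

from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.5, random_state=0)

db = DBSCAN(eps=0.3, min_samples=10).fit(X)
labels = db.labels_  # -1 marks noise points outside any dense region
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
print("clusters found:", n_clusters)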
Transform a signal as a sparse combination of Ricker wavelets. This example visually compares different sparse coding methods using the SparseCoder estimator.
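A minimal sketch, assuming sklearn.decomposition.SparseCoder with a hand-built dictionary of Ricker (Mexican hat) wavelets; the wavelet widths, grid, and toy signal are illustrative:

import numpy as np
from sklearn.decomposition import SparseCoder

def ricker(resolution, center, width):
    # Discrete Ricker wavelet centered at `center`.
    x = np.linspace(0, resolution - 1, resolution)
    a = (x - center) / width
    return (1 - a ** 2) * np.exp(-a ** 2 / 2)

resolution = 1024
centers = np.linspace(0, resolution, 30)
D = np.array([ricker(resolution, c, width=100) for c in centers])
D /= np.sqrt((D ** 2).sum(axis=1))[:, np.newaxis]  # unit-norm atoms

signal = np.sin(np.linspace(0, 8, resolution))  # toy signal to encode
coder = SparseCoder(dictionary=D, transform_algorithm="omp",
                    transform_n_nonzero_coefs=5)
code = coder.transform(signal.reshape(1, -1))
print("nonzero coefficients:", np.count_nonzero(code))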
This example shows the use of forests of trees to evaluate the importance of the pixels in an image classification task (faces). The hotter the pixel, the more important it is.
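A minimal sketch, assuming an ExtraTreesClassifier on the Olivetti faces (downloaded on first use); the forest size is illustrative:

from sklearn.datasets import fetch_olivetti_faces
from sklearn.ensemble import ExtraTreesClassifier

data = fetch_olivetti_faces()
X, y = data.data, data.target

forest = ExtraTreesClassifier(n_estimators=100, random_state=0, n_jobs=-1)
forest.fit(X, y)

# One importance value per pixel, reshaped back to the image grid.
importances = forest.feature_importances_.reshape(data.images[0].shape)
print(importances.shape)  # (64, 64); hotter pixels are more important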
Plot the classification probability for different classifiers. We use a 3-class dataset, and we classify it with a Support Vector classifier and with L1 and L2 penalized logistic regression.
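A minimal sketch, assuming the 3-class Iris data; the solver and iteration settings are illustrative:

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

classifiers = {
    "SVC": SVC(probability=True, random_state=0),  # probability=True enables predict_proba
    "L1 logistic": LogisticRegression(penalty="l1", solver="saga", max_iter=5000),
    "L2 logistic": LogisticRegression(penalty="l2", max_iter=5000),
}
for name, clf in classifiers.items():
    proba = clf.fit(X, y).predict_proba(X[:1])
    print(name, proba.round(3))  # each row sums to 1 across the 3 classes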
Reference: Dorin Comaniciu and Peter Meer, "Mean Shift: A robust approach toward feature space analysis". IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002.
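A minimal sketch, assuming scikit-learn's MeanShift with a bandwidth estimated from the data; the blob data and quantile are illustrative:

from sklearn.cluster import MeanShift, estimate_bandwidth
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=3, cluster_std=0.6, random_state=0)

bandwidth = estimate_bandwidth(X, quantile=0.2, n_samples=200)
ms = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit(X)
print("estimated clusters:", len(ms.cluster_centers_))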
An example showing how different online solvers perform on the hand-written digits dataset.
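A minimal sketch, assuming scikit-learn's stochastic-gradient and related online classifiers on the digits data; the particular solvers compared are illustrative:

from sklearn.datasets import load_digits
from sklearn.linear_model import (SGDClassifier, Perceptron,
                                  PassiveAggressiveClassifier)
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for clf in (SGDClassifier(loss="hinge"),           # linear SVM via SGD
            SGDClassifier(loss="modified_huber"),  # smoothed hinge loss
            Perceptron(),
            PassiveAggressiveClassifier()):
    score = clf.fit(X_train, y_train).score(X_test, y_test)
    print(type(clf).__name__, getattr(clf, "loss", ""), round(score, 3))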
A recursive feature elimination example with automatic tuning of the number of features selected with cross-validation.
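A minimal sketch, assuming RFECV wrapping a linear SVC on synthetic data with a few informative features; the data shape and CV folds are illustrative:

from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=25, n_informative=3,
                           random_state=0)

# Drop one feature per iteration; keep the count that maximizes CV score.
selector = RFECV(SVC(kernel="linear"), step=1, cv=5)
selector.fit(X, y)
print("optimal number of features:", selector.n_features_)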
This example demonstrates the Spectral Co-clustering algorithm on the twenty newsgroups dataset. The "comp.os.ms-windows.misc" category is excluded because it contains many posts containing nothing but data.
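A minimal sketch, assuming a recent scikit-learn where SpectralCoclustering lives in sklearn.cluster, run on a TF-IDF matrix of the remaining categories (downloaded on first use; vectorizer settings are illustrative):

from sklearn.cluster import SpectralCoclustering
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer

categories = [c for c in fetch_20newsgroups().target_names
              if c != "comp.os.ms-windows.misc"]  # excluded as noted above
data = fetch_20newsgroups(subset="train", categories=categories)

X = TfidfVectorizer(stop_words="english", min_df=5).fit_transform(data.data)
model = SpectralCoclustering(n_clusters=len(categories), random_state=0)
model.fit(X)
print(model.row_labels_[:10])  # cluster assignment for the first 10 documents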
Partial dependence plots show the dependence between the target function and a set of "target" features, marginalizing over the values of all other features (the complement features).
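A minimal sketch, assuming the sklearn.inspection.partial_dependence helper with a gradient-boosted regressor; the dataset and target feature are illustrative:

from sklearn.datasets import make_friedman1
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import partial_dependence

X, y = make_friedman1(n_samples=500, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X, y)

# Average prediction as feature 0 varies, marginalizing over the others.
pd_result = partial_dependence(model, X, features=[0])
print(pd_result["average"].shape)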