A plot that compares the various convex loss functions supported by SGDClassifier.
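A minimal sketch of such a comparison, assuming three representative losses (zero-one, hinge, log) plotted against the signed margin y * f(x); the grid of margin values is illustrative.

import numpy as np
import matplotlib.pyplot as plt

# Loss as a function of the signed margin y * f(x).
xx = np.linspace(-4, 4, 200)
plt.plot(xx, np.where(xx < 0, 1.0, 0.0), label="Zero-one loss")
plt.plot(xx, np.where(xx < 1, 1 - xx, 0.0), label="Hinge loss")
plt.plot(xx, np.log2(1 + np.exp(-xx)), label="Log loss")
plt.xlabel(r"Decision function $y \cdot f(x)$")
plt.ylabel("Loss")
plt.legend(loc="upper right")
plt.show()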
Making sure that each feature has approximately the same scale can be a crucial preprocessing step. However, when data contains outliers, StandardScaler can often be misled; in such cases it is better to use a scaler that is robust to outliers, such as RobustScaler.
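A minimal sketch of the idea, assuming a small synthetic dataset with a handful of injected outliers; the numbers are illustrative.

import numpy as np
from sklearn.preprocessing import StandardScaler, RobustScaler

rng = np.random.RandomState(0)
X = rng.normal(size=(100, 2))
X[:5] += 20  # inject a few outliers

X_std = StandardScaler().fit_transform(X)   # centers on mean, scales by std
X_rob = RobustScaler().fit_transform(X)     # centers on median, scales by IQR

# The outliers inflate the standard deviation, squashing the inliers toward 0;
# the robust transform leaves the inlier spread roughly intact.
print("std-scaled inlier spread:   ", X_std[5:].std(axis=0))
print("robust-scaled inlier spread:", X_rob[5:].std(axis=0))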
Find the optimal separating hyperplane using an SVC for classes that are unbalanced. We first find the separating plane with a plain SVC and then plot (dashed) the separating hyperplane with automatic correction for the unbalanced classes.
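A minimal sketch of the two fits, assuming an illustrative synthetic dataset with a 10:1 class imbalance and an illustrative class_weight value.

import numpy as np
from sklearn.svm import SVC

rng = np.random.RandomState(0)
# 1000 majority-class points and 100 minority-class points.
X = np.r_[1.5 * rng.randn(1000, 2), 0.5 * rng.randn(100, 2) + [2, 2]]
y = np.array([0] * 1000 + [1] * 100)

clf = SVC(kernel="linear", C=1.0).fit(X, y)                  # plain SVC
wclf = SVC(kernel="linear", class_weight={1: 10}).fit(X, y)  # weighted SVC

# The weighted model penalizes errors on the minority class more heavily,
# which shifts its separating hyperplane away from the minority cluster.
print("plain hyperplane:   ", clf.coef_, clf.intercept_)
print("weighted hyperplane:", wclf.coef_, wclf.intercept_)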
The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
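A minimal sketch of such a pipeline, assuming the digits dataset and an illustrative parameter grid.

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

pipe = Pipeline([("pca", PCA()),
                 ("logistic", LogisticRegression(max_iter=1000))])
param_grid = {"pca__n_components": [15, 30, 45, 64],
              "logistic__C": [0.1, 1.0, 10.0]}

# Grid search jointly over the PCA dimensionality and the regularization strength.
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)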
Using orthogonal matching pursuit for recovering a sparse signal from a noisy measurement encoded with a dictionary.
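A minimal sketch of sparse recovery with OrthogonalMatchingPursuit; the random dictionary, sparsity level, and noise level below are illustrative, not the gallery example's settings.

import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.RandomState(0)
n_features, n_atoms, n_nonzero = 100, 512, 17

D = rng.randn(n_features, n_atoms)              # random dictionary
D /= np.linalg.norm(D, axis=0)                  # unit-norm atoms
w_true = np.zeros(n_atoms)
support = rng.choice(n_atoms, n_nonzero, replace=False)
w_true[support] = rng.randn(n_nonzero)          # sparse code
y = D @ w_true + 0.05 * rng.randn(n_features)   # noisy measurement

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero)
omp.fit(D, y)
print(sorted(np.flatnonzero(omp.coef_)))        # recovered atoms
print(sorted(support))                          # true atoms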
RandomTreesEmbedding provides a way to map data to a very high-dimensional, sparse representation, which might be beneficial for classification.
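A minimal sketch, assuming the two-circles toy dataset and a naive Bayes classifier on top of the embedding (both illustrative choices).

from sklearn.datasets import make_circles
from sklearn.ensemble import RandomTreesEmbedding
from sklearn.naive_bayes import BernoulliNB

X, y = make_circles(factor=0.5, noise=0.05, random_state=0)

# Each sample is encoded by the leaves it falls into across the trees,
# giving a sparse, high-dimensional, one-hot style representation.
hasher = RandomTreesEmbedding(n_estimators=10, max_depth=3, random_state=0)
X_transformed = hasher.fit_transform(X)
print(X_transformed.shape)

# A simple model can then separate the classes in the transformed space.
clf = BernoulliNB().fit(X_transformed, y)
print(clf.score(X_transformed, y))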
This example studies the scalability profile of approximate 10-neighbors queries using the LSHForest with n_estimators=20 as the dataset size increases.
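A minimal timing sketch, assuming an older scikit-learn release that still ships LSHForest (it was removed in version 0.21); the dataset sizes, dimensionality, and n_candidates value are illustrative.

import time
import numpy as np
from sklearn.neighbors import LSHForest  # available in older releases only

rng = np.random.RandomState(42)
for n_samples in (1000, 10000, 100000):
    X = rng.randn(n_samples, 10)
    lshf = LSHForest(n_estimators=20, n_candidates=200, random_state=42).fit(X)
    t0 = time.time()
    lshf.kneighbors(X[:100], n_neighbors=10)   # approximate 10-NN queries
    print("%d samples -> %.3fs per 100 queries" % (n_samples, time.time() - t0))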
This example shows the use of forests of trees to evaluate the importance of features on an artificial classification task. The red bars are the feature importances of the forest, along with their inter-tree variability.
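A minimal sketch, assuming an ExtraTreesClassifier on a synthetic task from make_classification (both illustrative choices); the per-tree standard deviation plays the role of the error bars.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import ExtraTreesClassifier

X, y = make_classification(n_samples=1000, n_features=10, n_informative=3,
                           n_redundant=0, shuffle=False, random_state=0)

forest = ExtraTreesClassifier(n_estimators=250, random_state=0).fit(X, y)

importances = forest.feature_importances_
# Inter-tree variability: standard deviation of the per-tree importances.
std = np.std([tree.feature_importances_ for tree in forest.estimators_], axis=0)
for i in np.argsort(importances)[::-1]:
    print("feature %d: %.3f (+/- %.3f)" % (i, importances[i], std[i]))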
Demonstrates an active learning technique to learn handwritten digits using label propagation. We start by training a label propagation model with only a few labeled points and then iteratively select the most uncertain points to label next.
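A minimal sketch of the active-learning loop, assuming LabelSpreading on a 330-digit subset, entropy of the predicted label distributions as the uncertainty score, and five queried labels per round (all illustrative choices).

import numpy as np
from scipy import stats
from sklearn.datasets import load_digits
from sklearn.semi_supervised import LabelSpreading

X, y = load_digits(return_X_y=True)
X, y = X[:330], y[:330]

y_train = np.copy(y)
y_train[10:] = -1                                   # -1 marks unlabeled points

for iteration in range(3):
    lp = LabelSpreading(gamma=0.25, max_iter=20).fit(X, y_train)
    print("iteration %d accuracy: %.3f" % (iteration, (lp.transduction_ == y).mean()))
    # Uncertainty = entropy of the predicted label distribution per sample.
    entropies = stats.entropy(lp.label_distributions_.T)
    candidates = np.argsort(entropies)[::-1]
    candidates = candidates[~np.isin(candidates, np.flatnonzero(y_train != -1))]
    y_train[candidates[:5]] = y[candidates[:5]]     # "query the oracle" for 5 labels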
Sample usage of Nearest Neighbors classification. It will plot the decision boundaries for each class.
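A minimal sketch, assuming the iris dataset restricted to its first two features and k = 15 (illustrative choices).

import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.neighbors import KNeighborsClassifier

iris = load_iris()
X, y = iris.data[:, :2], iris.target            # keep two features for plotting

clf = KNeighborsClassifier(n_neighbors=15, weights="uniform").fit(X, y)

# Evaluate the classifier on a dense grid to draw the decision regions.
xx, yy = np.meshgrid(np.arange(X[:, 0].min() - 1, X[:, 0].max() + 1, 0.02),
                     np.arange(X[:, 1].min() - 1, X[:, 1].max() + 1, 0.02))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.4)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.title("3-Class classification (k = 15)")
plt.show()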