Plot decision surface of multinomial and One-vs-Rest Logistic Regression. The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers are represented by the dashed lines.
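A minimal sketch of the comparison (not the full plotting example), assuming a synthetic three-class blob dataset; the OneVsRestClassifier wrapper stands in for the example's one-vs-rest option:

from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

X, y = make_blobs(n_samples=300, centers=3, random_state=0)

multi = LogisticRegression().fit(X, y)                     # softmax over the 3 classes
ovr = OneVsRestClassifier(LogisticRegression()).fit(X, y)  # one binary classifier per class

print("multinomial accuracy:", multi.score(X, y))
print("one-vs-rest accuracy:", ovr.score(X, y))
# Per-class hyperplanes (the dashed lines in the plot):
for est in ovr.estimators_:
    print(est.coef_, est.intercept_)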
Score, and cross-validated scores. As we have seen, every estimator exposes a score method that can judge the quality of the fit (or the prediction) on new data.
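A minimal sketch of both ideas, assuming the digits dataset and a linear SVM for illustration: a single value from the score method versus one score per fold from cross_val_score:

from sklearn import datasets, svm
from sklearn.model_selection import cross_val_score

X, y = datasets.load_digits(return_X_y=True)
svc = svm.SVC(kernel="linear", C=1)

# A single score on held-out data.
print(svc.fit(X[:-100], y[:-100]).score(X[-100:], y[-100:]))

# Cross-validated scores: one score per fold.
print(cross_val_score(svc, X, y, cv=5))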
In this example we compare the various initialization strategies for K-means in terms of runtime and quality of the results.
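A minimal sketch of such a comparison, assuming synthetic blobs and using inertia as the quality measure (the full example evaluates several metrics over repeated runs):

import time
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=1000, centers=10, random_state=0)

for init in ("k-means++", "random"):
    start = time.time()
    km = KMeans(n_clusters=10, init=init, n_init=10, random_state=0).fit(X)
    print("%-10s inertia=%.1f  time=%.2fs" % (init, km.inertia_, time.time() - start))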
class sklearn.preprocessing.MinMaxScaler(feature_range=(0, 1), copy=True)
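A minimal usage sketch: each feature is rescaled to the given feature_range, based on the per-column minima and maxima seen during fit:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

X_train = np.array([[1., -1., 2.],
                    [2., 0., 0.],
                    [0., 1., -1.]])
scaler = MinMaxScaler(feature_range=(0, 1))
print(scaler.fit_transform(X_train))          # each column now spans [0, 1]
print(scaler.data_min_, scaler.data_max_)     # per-feature minima and maxima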
Plot the class probabilities of the first sample in a toy dataset predicted by three different classifiers and averaged by the VotingClassifier.
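A minimal sketch with soft voting; the iris data stands in here for the example's small toy dataset:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
clf1 = LogisticRegression(max_iter=1000, random_state=1)
clf2 = RandomForestClassifier(random_state=1)
clf3 = GaussianNB()
eclf = VotingClassifier(estimators=[("lr", clf1), ("rf", clf2), ("gnb", clf3)],
                        voting="soft", weights=[1, 1, 1])

# Class probabilities of the first sample: per classifier, then averaged.
for clf in (clf1, clf2, clf3, eclf):
    print(clf.fit(X, y).predict_proba(X[:1]))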
sklearn.metrics.completeness_score(labels_true, labels_pred)
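A minimal usage sketch: a clustering is complete when all members of a given true class are assigned to the same predicted cluster; the score is independent of label names:

from sklearn.metrics import completeness_score

print(completeness_score([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0: perfect up to relabeling
print(completeness_score([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0: each class split across clusters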
An illustration of the metric and non-metric MDS on generated noisy data. The reconstructed points using the metric MDS and non-metric MDS are slightly shifted.
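A minimal sketch, assuming uniformly generated 2-D points with Gaussian noise as the input; metric=False switches MDS to the non-metric variant:

import numpy as np
from sklearn.manifold import MDS

rng = np.random.RandomState(0)
X_true = rng.uniform(size=(50, 2))            # ground-truth positions
X_noisy = X_true + 0.05 * rng.randn(50, 2)    # noisy observations

mds = MDS(n_components=2, metric=True, random_state=0)
nmds = MDS(n_components=2, metric=False, n_init=1, random_state=0)
X_mds = mds.fit_transform(X_noisy)
X_nmds = nmds.fit_transform(X_noisy)
print(mds.stress_, nmds.stress_)              # final stress of each embedding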
An example showing univariate feature selection. Noisy (non-informative) features are added to the iris data and univariate feature selection is applied.
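A minimal sketch of the setup: noise features are appended to the iris data and a univariate F-test (SelectKBest with f_classif) keeps the most informative ones; the number of noise features and k are illustrative choices:

import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_iris(return_X_y=True)
rng = np.random.RandomState(42)
X_noisy = np.hstack([X, rng.normal(size=(X.shape[0], 20))])  # add 20 noise features

selector = SelectKBest(f_classif, k=4).fit(X_noisy, y)
print(selector.get_support(indices=True))  # indices of the selected features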
If your number of features is high, it may be useful to reduce it with an unsupervised step prior to supervised steps. Many of the unsupervised learning methods implement a transform method that can be used to reduce the dimensionality.
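A minimal sketch of that pattern, assuming PCA as the unsupervised step and an SVM as the supervised one, chained with make_pipeline on the digits data:

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
clf = make_pipeline(PCA(n_components=30), SVC())  # reduce 64 features to 30, then classify
print(clf.fit(X[:-200], y[:-200]).score(X[-200:], y[-200:]))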
Gaussian Processes (GP) are a generic supervised learning method designed to solve regression and probabilistic classification problems.
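A minimal regression sketch with GaussianProcessRegressor and an RBF kernel; the 1-D sine data is only illustrative:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.linspace(0, 10, 20).reshape(-1, 1)
y = np.sin(X).ravel()

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), random_state=0).fit(X, y)
# Probabilistic prediction: mean and standard deviation at new points.
y_mean, y_std = gpr.predict(np.array([[2.5], [7.5]]), return_std=True)
print(y_mean, y_std)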