This example is meant to illustrate situations where k-means will produce unintuitive and possibly unexpected clusters. In the first three plots, the input data does not conform to some implicit assumption that k-means makes, and undesirable clusters are produced as a result.
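A minimal sketch of one such failure mode, assuming anisotropic blobs rather than the example's exact datasets:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    # Isotropic blobs, then a linear transform that stretches them.
    X, _ = make_blobs(n_samples=500, random_state=170)
    X_aniso = np.dot(X, [[0.6, -0.6], [-0.4, 0.8]])

    # k-means implicitly assumes isotropic, similarly sized clusters,
    # so it splits the elongated blobs along the wrong boundaries.
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_aniso)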
Support vector machines (SVMs) are a set of supervised learning methods used for classification, regression and outlier detection.
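A quick sketch of the classification case (the dataset and parameters below are arbitrary illustrative choices):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # SVC handles classification; SVR and OneClassSVM cover regression
    # and outlier detection respectively.
    clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
    print(clf.score(X_test, y_test))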
In this plot you can see the training scores and validation scores of an SVM for different values of the kernel parameter gamma. For very low values of gamma, both the training score and the validation score are low; this is called underfitting.
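A sketch of how such a curve can be computed with validation_curve (the digits dataset and the gamma range are illustrative assumptions, not necessarily what the plot used):

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import validation_curve
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    param_range = np.logspace(-6, -1, 5)

    train_scores, valid_scores = validation_curve(
        SVC(), X, y, param_name="gamma", param_range=param_range, cv=5)

    # Low gamma: both mean scores are low (underfitting). Very high gamma
    # would instead push the training score up while the validation score
    # drops (overfitting).
    print(train_scores.mean(axis=1))
    print(valid_scores.mean(axis=1))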
Shows how shrinkage improves classification.
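A small sketch in the spirit of that example, assuming synthetic data with a single informative feature plus pure-noise dimensions (the data-generation details here are illustrative):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.RandomState(0)
    n_features = 75

    def make_data(n_samples):
        # One discriminative feature, the rest is pure noise.
        y = np.hstack([np.zeros(n_samples // 2), np.ones(n_samples // 2)])
        X = rng.randn(n_samples, n_features)
        X[:, 0] += 3 * (2 * y - 1)
        return X, y

    X_train, y_train = make_data(20)   # few samples vs. many features
    X_test, y_test = make_data(200)

    # With shrinkage, the poorly conditioned covariance estimate is
    # regularized, which typically raises the test score here.
    plain = LinearDiscriminantAnalysis(solver="lsqr", shrinkage=None)
    shrunk = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
    print(plain.fit(X_train, y_train).score(X_test, y_test))
    print(shrunk.fit(X_train, y_train).score(X_test, y_test))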
sklearn.datasets.make_friedman3(n_samples=100, noise=0.0, random_state=None)
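For instance (the parameter values are arbitrary; the output shapes are fixed by the generator):

    from sklearn.datasets import make_friedman3

    # Four informative features; the target follows an arctan-shaped
    # function of them, with optional Gaussian noise added to the output.
    X, y = make_friedman3(n_samples=100, noise=0.1, random_state=0)
    print(X.shape, y.shape)  # (100, 4) (100,)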
Given a small number of observations, we want to recover which features of X are relevant to explain y. For this, sparse linear models can outperform standard statistical tests when the true model is sparse, i.e. when only a small fraction of the features are relevant.
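One hedged sketch of the idea, using LassoCV as the sparse estimator (the example itself may use different estimators; the synthetic design below is an illustrative assumption):

    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.RandomState(0)
    n_samples, n_features, n_relevant = 50, 200, 5

    # Only the first 5 coefficients are nonzero: a sparse ground truth.
    X = rng.randn(n_samples, n_features)
    coef = np.zeros(n_features)
    coef[:n_relevant] = rng.uniform(1, 3, n_relevant)
    y = X.dot(coef) + 0.1 * rng.randn(n_samples)

    lasso = LassoCV(cv=5).fit(X, y)
    print(np.flatnonzero(lasso.coef_))  # ideally recovers indices 0..4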
Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions of ridge are strongly influenced by the outliers present in the dataset, while the Huber regressor is less affected because it applies a linear loss to these samples.
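A compact sketch of that comparison (the outlier placement below is an arbitrary choice):

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import HuberRegressor, Ridge

    rng = np.random.RandomState(0)
    X, y = make_regression(n_samples=100, n_features=1, noise=4.0,
                           random_state=0)

    # Add a few gross outliers far above the linear trend.
    X_out = rng.normal(0, 0.5, size=(4, 1))
    y_out = 100 + rng.normal(0, 2.0, size=4)
    X_all, y_all = np.vstack([X, X_out]), np.concatenate([y, y_out])

    huber = HuberRegressor().fit(X_all, y_all)
    ridge = Ridge().fit(X_all, y_all)
    # Ridge's fit is pulled toward the outliers; Huber's linear loss on
    # large residuals keeps its coefficients closer to the clean trend.
    print("huber:", huber.coef_, "ridge:", ridge.coef_)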
Example of the Receiver Operating Characteristic (ROC) metric to evaluate classifier output quality. ROC curves typically feature the true positive rate on the Y axis and the false positive rate on the X axis.
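As a minimal sketch (the classifier and dataset are illustrative assumptions):

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import auc, roc_curve
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    clf = LogisticRegression().fit(X_train, y_train)
    scores = clf.decision_function(X_test)

    # One (fpr, tpr) point per decision threshold; auc summarizes the curve.
    fpr, tpr, thresholds = roc_curve(y_test, scores)
    print("AUC:", auc(fpr, tpr))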
class sklearn.covariance.LedoitWolf(store_precision=True, assume_centered=False, block_size=1000)
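Typical usage (the data below is an arbitrary synthetic sample):

    import numpy as np
    from sklearn.covariance import LedoitWolf

    rng = np.random.RandomState(0)
    X = rng.multivariate_normal(np.zeros(5), np.diag([1., 2., 3., 4., 5.]),
                                size=40)

    lw = LedoitWolf(store_precision=True, assume_centered=False).fit(X)
    print(lw.covariance_)  # shrunk covariance estimate
    print(lw.shrinkage_)   # shrinkage intensity from the Ledoit-Wolf formula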
Stochastic Gradient Descent (SGD) is a simple yet very efficient approach to discriminative learning of linear classifiers under convex loss functions such as (linear) Support Vector Machines and Logistic Regression.
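A short sketch of SGD-based classification (the dataset and hyperparameters are illustrative):

    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=5000, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # loss="hinge" gives a linear SVM; a logistic-regression loss is also
    # available (named "log_loss" in recent releases, "log" in older ones).
    clf = SGDClassifier(loss="hinge", alpha=1e-4, max_iter=1000,
                        random_state=0)
    clf.fit(X_train, y_train)
    print(clf.score(X_test, y_test))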