sklearn.metrics.silhouette_samples(X, labels, metric='euclidean', **kwds)
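A minimal usage sketch (not from the original page, using a synthetic dataset and KMeans as assumptions) showing the per-sample silhouette scores this function returns:

# Per-sample silhouette coefficients for a KMeans clustering.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_samples

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
labels = KMeans(n_clusters=3, random_state=0).fit_predict(X)

# One silhouette coefficient per sample, each in [-1, 1].
sample_scores = silhouette_samples(X, labels, metric='euclidean')
print(sample_scores.shape)    # (300,)
print(sample_scores.mean())   # equals silhouette_score(X, labels)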
The usual covariance maximum likelihood estimate can be regularized using shrinkage. Ledoit and Wolf proposed a closed formula to compute the asymptotically optimal shrinkage parameter.
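A minimal sketch of this estimator, assuming the standard sklearn.covariance API and random synthetic data, comparing the Ledoit-Wolf shrunk estimate with the empirical one:

import numpy as np
from sklearn.covariance import LedoitWolf, EmpiricalCovariance

rng = np.random.RandomState(0)
X = rng.randn(50, 20)             # few samples relative to the dimension

lw = LedoitWolf().fit(X)
emp = EmpiricalCovariance().fit(X)

print(lw.shrinkage_)              # estimated optimal shrinkage coefficient
print(lw.covariance_.shape, emp.covariance_.shape)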
class sklearn.linear_model.BayesianRidge(n_iter=300, tol=0.001, alpha_1=1e-06, alpha_2=1e-06, lambda_1=1e-06, lambda_2=1e-06, compute_score=False, fit_intercept=True, normalize=False, copy_X=True, verbose=False)
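A short usage sketch for this class; the synthetic dataset and parameter choices below are illustrative assumptions, not taken from the docs:

from sklearn.datasets import make_regression
from sklearn.linear_model import BayesianRidge

X, y = make_regression(n_samples=100, n_features=10, noise=5.0,
                       random_state=0)
reg = BayesianRidge(compute_score=True).fit(X, y)

print(reg.coef_[:3])              # posterior mean of the weights
print(reg.alpha_, reg.lambda_)    # estimated noise and weight precisions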
An example to compare multi-output regression with random forest and the multioutput.MultiOutputRegressor meta-estimator.
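A hedged sketch of that comparison on assumed synthetic data: a RandomForestRegressor handling the two targets natively versus the MultiOutputRegressor meta-estimator fitting one forest per target.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.column_stack([np.sin(X.ravel()), np.cos(X.ravel())])  # 2 targets

native = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
wrapped = MultiOutputRegressor(
    RandomForestRegressor(n_estimators=100, random_state=0)).fit(X, y)

print(native.predict(X[:2]))      # shape (2, 2)
print(wrapped.predict(X[:2]))     # shape (2, 2)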
This example shows the use of forests of trees to evaluate the importance of the pixels in an image classification task (faces). The hotter the pixel, the more important it is.
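A sketch of the idea only: the real example uses the faces data, but the smaller digits dataset is substituted here so the snippet runs offline. A forest is trained on the images and feature_importances_ is reshaped back into pixel space.

from sklearn.datasets import load_digits
from sklearn.ensemble import ExtraTreesClassifier

digits = load_digits()
forest = ExtraTreesClassifier(n_estimators=200, random_state=0)
forest.fit(digits.data, digits.target)

# One importance per pixel; reshape to the 8x8 image grid.
importances = forest.feature_importances_.reshape(8, 8)
print(importances.round(3))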
sklearn.preprocessing.binarize(X, threshold=0.0, copy=True)
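A minimal sketch of this thresholding function (the input array and threshold are arbitrary examples):

import numpy as np
from sklearn.preprocessing import binarize

X = np.array([[1.5, -0.2, 0.0],
              [0.3,  2.1, -1.0]])
print(binarize(X, threshold=0.5))
# [[1. 0. 0.]
#  [0. 1. 0.]]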
This example applies different unsupervised matrix decomposition (dimension reduction) methods from the module sklearn.decomposition to the Olivetti faces dataset.
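A hedged sketch of that setup, assuming network access so fetch_olivetti_faces can download the data: a few PCA components ("eigenfaces") are extracted from the flattened face images.

from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA

faces = fetch_olivetti_faces(shuffle=True, random_state=0)
X = faces.data                     # (400, 4096): 64x64 images, flattened

pca = PCA(n_components=6, whiten=True).fit(X)
print(pca.components_.shape)       # (6, 4096), each row is an eigenface
print(pca.explained_variance_ratio_)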
An example using IsolationForest for anomaly detection. The IsolationForest "isolates" observations by randomly selecting a feature and then randomly selecting a split value between the maximum and minimum values of the selected feature.
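A minimal sketch on assumed synthetic data: the forest is fit on mostly regular points, and predict returns +1 for inliers and -1 for outliers.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(42)
X_train = 0.3 * rng.randn(100, 2)            # regular observations
X_outliers = rng.uniform(low=-4, high=4, size=(20, 2))

clf = IsolationForest(random_state=rng).fit(X_train)
print(clf.predict(X_train[:5]))              # mostly +1
print(clf.predict(X_outliers[:5]))           # mostly -1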
Demonstrate how model complexity influences both prediction accuracy and computational performance. The dataset is the Boston Housing dataset (resp. 20 Newsgroups) for regression (resp. classification).
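Not the original benchmark: a small sketch of the same trade-off on assumed synthetic regression data, varying model complexity (number of boosting stages) and reporting test error and prediction latency.

import time
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=2000, n_features=20, noise=10.0,
                       random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n_estimators in (10, 50, 200):
    model = GradientBoostingRegressor(n_estimators=n_estimators,
                                      random_state=0).fit(X_train, y_train)
    start = time.time()
    mse = mean_squared_error(y_test, model.predict(X_test))
    latency = time.time() - start
    print(f"{n_estimators:>4} stages  mse={mse:.1f}  predict={latency:.4f}s")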
class sklearn.model_selection.StratifiedShuffleSplit(n_splits=10, test_size=0.1, train_size=None, random_state=None)
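A minimal sketch of this splitter on a small assumed toy array: each random train/test split preserves the class proportions of y.

import numpy as np
from sklearn.model_selection import StratifiedShuffleSplit

X = np.arange(20).reshape(10, 2)
y = np.array([0, 0, 0, 0, 0, 0, 1, 1, 1, 1])

sss = StratifiedShuffleSplit(n_splits=3, test_size=0.3, random_state=0)
for train_idx, test_idx in sss.split(X, y):
    # Each test set keeps roughly the same 60/40 class ratio as y.
    print(test_idx, y[test_idx])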