Normal and Shrinkage Linear Discriminant Analysis for classification
  • References/Python/scikit-learn/Examples/Classification

This example shows how shrinkage improves classification accuracy when the number of training samples is small relative to the number of features.
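
A minimal sketch of the idea (not the documented example): plain LDA versus shrinkage LDA on data with fewer training samples than features, using the lsqr solver that shrinkage requires.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.RandomState(0)
    n_train, n_features = 20, 50                  # fewer samples than features
    # two classes whose means differ only in the first feature
    X = rng.randn(2 * n_train, n_features)
    y = np.array([0] * n_train + [1] * n_train)
    X[y == 1, 0] += 2.0

    lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage=None).fit(X, y)
    lda_shrunk = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
    print("plain LDA :", lda.score(X, y))
    print("shrunk LDA:", lda_shrunk.score(X, y))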

2025-01-10 15:47:30
sklearn.tree.export_graphviz()
  • References/Python/scikit-learn/API Reference/tree

sklearn.tree.export_graphviz()
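
A minimal usage sketch: export a fitted decision tree as Graphviz DOT source; passing out_file=None returns the DOT text as a string instead of writing a file.

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_graphviz

    iris = load_iris()
    clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(iris.data, iris.target)

    dot_source = export_graphviz(
        clf,
        out_file=None,
        feature_names=iris.feature_names,
        class_names=iris.target_names,
        filled=True,
    )
    print(dot_source[:200])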

2025-01-10 15:47:30
model_selection.PredefinedSplit()
  • References/Python/scikit-learn/API Reference/model_selection

class sklearn.model_selection.PredefinedSplit(test_fold)
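
A minimal sketch of how the test_fold parameter works: each entry gives the test-fold index of the corresponding sample, and -1 keeps that sample in the training set for every split.

    import numpy as np
    from sklearn.model_selection import PredefinedSplit

    # fold index per sample; -1 means "always in the training set"
    test_fold = np.array([0, 1, -1, 0, 1, -1])
    ps = PredefinedSplit(test_fold)

    print(ps.get_n_splits())                  # 2 folds
    for train_idx, test_idx in ps.split():
        print("train:", train_idx, "test:", test_idx)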

2025-01-10 15:47:30
Shrinkage covariance estimation
  • References/Python/scikit-learn/Examples/Covariance estimation

When working with covariance estimation, the usual approach is to use a maximum likelihood estimator, such as the EmpiricalCovariance estimator.
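
A minimal sketch (not the documented example): the maximum likelihood estimate alongside two shrunk estimators, comparing their error against the known population covariance; the fitted shrinkage_ attribute reports the chosen shrinkage intensity.

    import numpy as np
    from sklearn.covariance import EmpiricalCovariance, LedoitWolf, OAS

    rng = np.random.RandomState(0)
    true_cov = np.eye(5)
    X = rng.multivariate_normal(mean=np.zeros(5), cov=true_cov, size=20)

    emp = EmpiricalCovariance().fit(X)
    lw = LedoitWolf().fit(X)
    oas = OAS().fit(X)

    print("Ledoit-Wolf shrinkage:", lw.shrinkage_)
    print("OAS shrinkage        :", oas.shrinkage_)
    print("MLE error        :", emp.error_norm(true_cov))
    print("Ledoit-Wolf error:", lw.error_norm(true_cov))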

2025-01-10 15:47:30
linear_model.MultiTaskElasticNet()
  • References/Python/scikit-learn/API Reference/linear_model

class sklearn.linear_model.MultiTaskElasticNet(alpha=1.0, l1_ratio=0.5, fit_intercept=True, normalize=False, copy_X=True,
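
A minimal usage sketch: MultiTaskElasticNet fits several regression targets jointly with a mixed L1/L2 penalty, so the same features are selected for every task. (The normalize argument in the signature above comes from older releases and is no longer available in current scikit-learn; it is omitted below.)

    import numpy as np
    from sklearn.linear_model import MultiTaskElasticNet

    rng = np.random.RandomState(0)
    X = rng.randn(50, 10)
    W = np.zeros((10, 3))
    W[:3] = rng.randn(3, 3)               # only the first 3 features are active
    Y = X @ W + 0.1 * rng.randn(50, 3)

    model = MultiTaskElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, Y)
    print(model.coef_.shape)              # (n_tasks, n_features) -> (3, 10)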

2025-01-10 15:47:30
sklearn.metrics.pairwise.additive_chi2_kernel()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.pairwise.additive_chi2_kernel(X, Y=None)
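
A minimal usage sketch: the additive chi-squared kernel is intended for nonnegative inputs such as histograms; it returns an (n_samples_X, n_samples_Y) matrix whose entries are 0 for identical rows and negative otherwise.

    import numpy as np
    from sklearn.metrics.pairwise import additive_chi2_kernel

    rng = np.random.RandomState(0)
    X = rng.rand(4, 6)        # nonnegative feature histograms
    Y = rng.rand(3, 6)

    K = additive_chi2_kernel(X, Y)
    print(K.shape)            # (4, 3)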

2025-01-10 15:47:30
HuberRegressor vs Ridge on dataset with strong outliers
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions in ridge are strongly influenced by the outliers present in the dataset.
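
A minimal sketch of the comparison (not the documented example): both estimators are fit on 1-D data with a few extreme outliers, and the learned slopes show how much each one is pulled by them.

    import numpy as np
    from sklearn.linear_model import HuberRegressor, Ridge

    rng = np.random.RandomState(0)
    X = rng.uniform(-3, 3, size=(50, 1))
    y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=50)
    y[:4] += 30.0                         # a few strong outliers

    ridge = Ridge().fit(X, y)
    huber = HuberRegressor().fit(X, y)
    print("ridge slope:", ridge.coef_[0])
    print("huber slope:", huber.coef_[0])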

2025-01-10 15:47:30
Segmenting the picture of a raccoon face in regions
  • References/Python/scikit-learn/Examples/Clustering

This example uses spectral clustering on a graph created from voxel-to-voxel differences in the image to break it into multiple partly homogeneous regions.
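
A minimal sketch of the approach, assuming scipy >= 1.10 so the raccoon image can be loaded from scipy.datasets (which downloads it on first use); the documented example builds the same img_to_graph plus spectral_clustering pipeline.

    import numpy as np
    from scipy.datasets import face
    from sklearn.feature_extraction.image import img_to_graph
    from sklearn.cluster import spectral_clustering

    img = face(gray=True)[::10, ::10].astype(float)   # downsample to keep it fast

    # graph of pixel-to-pixel connections, weighted by the intensity gradient
    graph = img_to_graph(img)
    beta, eps = 10, 1e-6
    graph.data = np.exp(-beta * graph.data / graph.data.std()) + eps

    labels = spectral_clustering(graph, n_clusters=4,
                                 assign_labels="discretize", random_state=0)
    print(labels.reshape(img.shape).shape)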

2025-01-10 15:47:30
sklearn.metrics.completeness_score()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.completeness_score(labels_true, labels_pred)
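
A minimal usage sketch: completeness is 1.0 when every member of a given true class ends up in the same predicted cluster, independently of how the clusters are labelled.

    from sklearn.metrics import completeness_score

    print(completeness_score([0, 0, 1, 1], [1, 1, 0, 0]))  # 1.0, perfect but relabelled
    print(completeness_score([0, 0, 1, 1], [0, 0, 0, 0]))  # 1.0, one big cluster
    print(completeness_score([0, 0, 1, 1], [0, 1, 0, 1]))  # 0.0, classes split across clusters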

2025-01-10 15:47:30
Plot class probabilities calculated by the VotingClassifier
  • References/Python/scikit-learn/Examples/Ensemble methods

Plot the class probabilities of the first sample in a toy dataset predicted by three different classifiers and averaged by the VotingClassifier.
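
A minimal sketch of the setup (not the documented toy dataset): with voting="soft", the ensemble averages the per-class probabilities returned by each estimator's predict_proba, here with illustrative weights.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB

    X, y = load_iris(return_X_y=True)
    clf1 = LogisticRegression(max_iter=1000, random_state=0)
    clf2 = RandomForestClassifier(n_estimators=50, random_state=0)
    clf3 = GaussianNB()

    eclf = VotingClassifier(
        estimators=[("lr", clf1), ("rf", clf2), ("gnb", clf3)],
        voting="soft",
        weights=[1, 1, 5],                # illustrative weighting of the averaged probabilities
    ).fit(X, y)

    print(eclf.predict_proba(X[:1]))      # class probabilities for the first sample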

2025-01-10 15:47:30