Univariate Feature Selection
  • References/Python/scikit-learn/Examples/Feature Selection

An example showing univariate feature selection. Noisy (non-informative) features are added to the iris data, and univariate feature selection is applied.
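
A minimal sketch of the idea, assuming a SelectPercentile/f_classif setup; the number of noise features and the percentile are illustrative choices, not necessarily the example's exact values.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.feature_selection import SelectPercentile, f_classif

X, y = load_iris(return_X_y=True)
rng = np.random.RandomState(42)
# Append 20 noisy, non-informative features to the 4 real ones.
X_noisy = np.hstack([X, rng.uniform(size=(X.shape[0], 20))])

# Keep the top 10% of features according to univariate ANOVA F-scores.
selector = SelectPercentile(f_classif, percentile=10).fit(X_noisy, y)
print(selector.get_support(indices=True))  # indices of the selected features
print(selector.scores_[:4])                # F-scores of the 4 real features
```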

Sparse recovery
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Given a small number of observations, we want to recover which features of X are relevant to explain y.
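
A rough sketch of the recovery idea using a cross-validated Lasso; the data dimensions and noise level are assumed here, and the full example explores the problem in more depth.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.RandomState(0)
n_samples, n_features, n_relevant = 60, 100, 5
X = rng.randn(n_samples, n_features)

# Only the first 5 coefficients are non-zero in the true model.
true_coef = np.zeros(n_features)
true_coef[:n_relevant] = rng.uniform(1, 3, size=n_relevant)
y = X @ true_coef + 0.1 * rng.randn(n_samples)

# A cross-validated Lasso recovers the support from few observations.
lasso = LassoCV(cv=5).fit(X, y)
print("recovered features:", np.flatnonzero(lasso.coef_))
```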

Adjustment for chance in clustering performance evaluation
  • References/Python/scikit-learn/Examples/Clustering

The following plots demonstrate the impact of the number of clusters and number of samples on various clustering performance evaluation metrics.
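
A minimal illustration of the point, under an assumed random-labeling setup: a chance-adjusted metric (ARI) stays near zero for two independent random labelings, while an unadjusted one (V-measure) does not.

```python
import numpy as np
from sklearn.metrics import adjusted_rand_score, v_measure_score

rng = np.random.RandomState(0)
n_samples, n_clusters = 300, 10

# Two independent, purely random labelings of the same samples.
labels_a = rng.randint(n_clusters, size=n_samples)
labels_b = rng.randint(n_clusters, size=n_samples)

print("ARI (adjusted for chance):", adjusted_rand_score(labels_a, labels_b))
print("V-measure (not adjusted): ", v_measure_score(labels_a, labels_b))
```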

Probability calibration of classifiers
  • References/Python/scikit-learn/Examples/Calibration

When performing classification you often want to predict not only the class label, but also the associated probability. This probability gives you some kind of confidence on the prediction.
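
A small sketch of probability calibration with CalibratedClassifierCV on an assumed synthetic dataset; the example itself compares several classifiers and calibration methods.

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.metrics import brier_score_loss
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

raw = GaussianNB().fit(X_train, y_train)
calibrated = CalibratedClassifierCV(GaussianNB(), method="isotonic", cv=3)
calibrated.fit(X_train, y_train)

# A lower Brier score indicates better calibrated probability estimates.
print("Brier, raw:       ", brier_score_loss(y_test, raw.predict_proba(X_test)[:, 1]))
print("Brier, calibrated:", brier_score_loss(y_test, calibrated.predict_proba(X_test)[:, 1]))
```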

Multi-dimensional scaling
  • References/Python/scikit-learn/Examples/Manifold learning

An illustration of the metric and non-metric MDS on generated noisy data. The reconstructed points using the metric MDS and non-metric MDS are slightly shifted …
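
A minimal sketch of fitting metric and non-metric MDS on a precomputed, noisy distance matrix; the point layout and noise model are assumed, not the example's exact data.

```python
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics import euclidean_distances

rng = np.random.RandomState(0)
X_true = rng.randint(0, 20, (20, 2)).astype(float)

# Pairwise distances computed from noisy copies of the true points.
similarities = euclidean_distances(X_true + rng.rand(*X_true.shape))

metric_mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
nonmetric_mds = MDS(n_components=2, metric=False,
                    dissimilarity="precomputed", random_state=0)
pos = metric_mds.fit_transform(similarities)      # metric reconstruction
npos = nonmetric_mds.fit_transform(similarities)  # non-metric reconstruction
print(pos[:3], npos[:3], sep="\n")
```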

Normal and Shrinkage Linear Discriminant Analysis for classification
  • References/Python/scikit-learn/Examples/Classification

Shows how shrinkage improves classification.
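
A sketch of the comparison on an assumed high-dimensional, small-sample problem: with few training samples per feature, Ledoit-Wolf ("auto") shrinkage of the covariance estimate typically improves LDA's test accuracy.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.RandomState(0)
n_train, n_features = 20, 75

def make_data(n_per_class):
    # Two Gaussian classes; only the first 10 features are discriminative.
    X = rng.randn(2 * n_per_class, n_features)
    y = np.array([0] * n_per_class + [1] * n_per_class)
    X[y == 1, :10] += 2.0
    return X, y

X_train, y_train = make_data(n_train)
X_test, y_test = make_data(1000)

plain = LinearDiscriminantAnalysis(solver="lsqr", shrinkage=None).fit(X_train, y_train)
shrunk = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X_train, y_train)
print("accuracy without shrinkage:  ", plain.score(X_test, y_test))
print("accuracy with auto shrinkage:", shrunk.score(X_test, y_test))
```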

A demo of the Spectral Co-Clustering algorithm
  • References/Python/scikit-learn/Examples/Biclustering

This example demonstrates how to generate a dataset and bicluster it using the Spectral Co-Clustering algorithm. The dataset is generated using the make_biclusters function, which creates a matrix of small values and implants biclusters with large values.
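
A condensed sketch of that flow, assuming the usual make_biclusters / SpectralCoclustering / consensus_score combination; the shape, noise level, and number of clusters are illustrative.

```python
from sklearn.cluster import SpectralCoclustering
from sklearn.datasets import make_biclusters
from sklearn.metrics import consensus_score

# Generate a matrix with 5 implanted biclusters, then shuffle it.
data, rows, columns = make_biclusters(shape=(300, 300), n_clusters=5,
                                      noise=5, shuffle=True, random_state=0)

model = SpectralCoclustering(n_clusters=5, random_state=0).fit(data)

# A consensus score of 1.0 means the implanted biclusters were recovered exactly.
score = consensus_score(model.biclusters_, (rows, columns))
print("consensus score: {:.3f}".format(score))
```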

Hyper-parameters of Approximate Nearest Neighbors
  • References/Python/scikit-learn/Examples/Nearest Neighbors

This example demonstrates how the accuracy of nearest neighbor queries with Locality Sensitive Hashing Forest behaves as the number of candidates and the number of estimators vary.
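
A rough sketch of that kind of sweep, written against the legacy sklearn.neighbors.LSHForest API (removed in scikit-learn 0.21, so this needs an older release); the dataset and the n_candidates grid are assumptions.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.neighbors import LSHForest, NearestNeighbors  # LSHForest: scikit-learn < 0.21

X, _ = make_blobs(n_samples=2000, n_features=10, random_state=0)
queries, index = X[:50], X[50:]

# Exact nearest neighbors serve as the ground truth.
exact = NearestNeighbors(n_neighbors=1).fit(index)
_, true_idx = exact.kneighbors(queries)

# Accuracy of the approximate queries as n_candidates grows.
for n_candidates in (10, 50, 250):
    lshf = LSHForest(n_candidates=n_candidates, random_state=0).fit(index)
    _, approx_idx = lshf.kneighbors(queries, n_neighbors=1)
    print(n_candidates, np.mean(approx_idx == true_idx))
```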

Plot class probabilities calculated by the VotingClassifier
  • References/Python/scikit-learn/Examples/Ensemble methods

Plot the class probabilities of the first sample in a toy dataset predicted by three different classifiers and averaged by the VotingClassifier.
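
A minimal sketch with three assumed classifiers and soft voting; the weights and toy dataset are illustrative, not necessarily the example's exact choices.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=200, n_informative=4, n_classes=3,
                           random_state=0)

clf1 = LogisticRegression(max_iter=1000, random_state=0)
clf2 = RandomForestClassifier(n_estimators=50, random_state=0)
clf3 = GaussianNB()
voting = VotingClassifier(estimators=[("lr", clf1), ("rf", clf2), ("gnb", clf3)],
                          voting="soft", weights=[1, 1, 5])

# Class probabilities for the first sample under each model and the weighted average.
for name, clf in [("lr", clf1), ("rf", clf2), ("gnb", clf3), ("voting", voting)]:
    clf.fit(X, y)
    print(name, clf.predict_proba(X[:1]).round(3))
```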

Multi-class AdaBoosted Decision Trees
  • References/Python/scikit-learn/Examples/Ensemble methods

This example reproduces Figure 1 of Zhu et al. [1] and shows how boosting can improve prediction accuracy on a multi-class problem. The classification dataset is constructed from a ten-dimensional standard normal distribution, with three classes separated by nested concentric spheres so that roughly equal numbers of samples fall in each class.
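
A compressed sketch of the setup, assuming make_gaussian_quantiles for the nested-spheres data and the discrete SAMME algorithm from Zhu et al.; the tree depth, sample count, and number of estimators are illustrative.

```python
from sklearn.datasets import make_gaussian_quantiles
from sklearn.ensemble import AdaBoostClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Three classes separated by nested concentric spheres in 10 dimensions.
X, y = make_gaussian_quantiles(n_samples=3000, n_features=10,
                               n_classes=3, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

bdt = AdaBoostClassifier(DecisionTreeClassifier(max_depth=2),
                         n_estimators=300, algorithm="SAMME")
bdt.fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, bdt.predict(X_test)))
```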
