Pipelining

The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
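A minimal sketch of the pattern described above, not the example's exact code: PCA for dimensionality reduction, logistic regression for prediction, and GridSearchCV to pick the number of components. The dataset and grid values are illustrative assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

# Chain the unsupervised reduction and the classifier into one estimator.
pipe = Pipeline([
    ("pca", PCA()),
    ("logistic", LogisticRegression(max_iter=1000)),
])

# Search jointly over the PCA dimensionality and the regularization strength.
param_grid = {
    "pca__n_components": [10, 20, 40],
    "logistic__C": [0.1, 1.0, 10.0],
}

search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```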

Imputing missing values before building an estimator

This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, so please check via cross-validation.
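A rough sketch of the comparison the example makes: the cross-validated score after dropping incomplete samples versus after imputing them. The original predates sklearn.impute, so SimpleImputer stands in here; the dataset and the missing-value fraction are assumptions.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X, y = load_diabetes(return_X_y=True)

# Knock out 20% of the entries at random (arbitrary fraction).
X_missing = X.copy()
X_missing[rng.rand(*X.shape) < 0.2] = np.nan

# Strategy 1: discard every sample that has any missing value.
keep = ~np.isnan(X_missing).any(axis=1)
score_drop = cross_val_score(
    RandomForestRegressor(random_state=0), X_missing[keep], y[keep], cv=5
).mean()

# Strategy 2: impute missing values with the feature mean and keep all samples.
model = make_pipeline(
    SimpleImputer(strategy="mean"), RandomForestRegressor(random_state=0)
)
score_impute = cross_val_score(model, X_missing, y, cv=5).mean()

print(f"drop rows: {score_drop:.3f}  impute: {score_impute:.3f}")
```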

The Johnson-Lindenstrauss bound for embedding with random projections

The Johnson-Lindenstrauss lemma states that any high-dimensional dataset can be randomly projected into a lower-dimensional Euclidean space while controlling the distortion in the pairwise distances.
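A small sketch of the quantities involved: the minimum embedding dimension the bound guarantees for a given number of samples and distortion eps, followed by an actual Gaussian random projection. The sample counts, eps, and data shape are arbitrary choices.

```python
import numpy as np
from sklearn.random_projection import (
    GaussianRandomProjection,
    johnson_lindenstrauss_min_dim,
)

# Minimum safe embedding dimension for several dataset sizes at eps = 0.1.
for n_samples in (1_000, 10_000, 100_000):
    k = johnson_lindenstrauss_min_dim(n_samples=n_samples, eps=0.1)
    print(f"n_samples={n_samples:>7}  min dim for eps=0.1: {k}")

# Project random high-dimensional data down to the automatically chosen dimension.
rng = np.random.RandomState(0)
X = rng.rand(100, 10_000)
X_proj = GaussianRandomProjection(eps=0.1, random_state=0).fit_transform(X)
print("projected shape:", X_proj.shape)
```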

Concatenating multiple feature extraction methods

In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance. This example shows how to use FeatureUnion to combine features obtained by PCA and univariate selection.
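A minimal sketch of that combination, assuming the iris dataset and arbitrary values for n_components and k: PCA components and the best univariate features are concatenated by FeatureUnion before classification.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Concatenate two different feature extraction methods side by side.
combined = FeatureUnion([
    ("pca", PCA(n_components=2)),
    ("univ_select", SelectKBest(k=1)),
])

pipe = Pipeline([("features", combined), ("svm", SVC(kernel="linear"))])
pipe.fit(X, y)
print("combined feature space:", combined.transform(X).shape)
```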

Face completion with multi-output estimators

This example shows the use of multi-output estimators to complete images. The goal is to predict the lower half of a face given its upper half.
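A hedged sketch of the idea: treat the lower half of each image as a vector-valued target and predict it from the upper half with a multi-output regressor. The 8x8 digits images stand in here for the faces used by the actual example, which fetches the Olivetti faces dataset.

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split

images = load_digits().images            # shape (n_samples, 8, 8)
n = images.shape[0]
upper = images[:, :4, :].reshape(n, -1)  # top half as input features
lower = images[:, 4:, :].reshape(n, -1)  # bottom half as multi-output target

X_train, X_test, y_train, y_test = train_test_split(upper, lower, random_state=0)

# ExtraTreesRegressor natively supports vector-valued (multi-output) targets.
est = ExtraTreesRegressor(n_estimators=10, random_state=0)
est.fit(X_train, y_train)
completed = est.predict(X_test)
print("predicted lower halves:", completed.shape)
```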

Plotting Cross-Validated Predictions

This example shows how to use cross_val_predict to visualize prediction errors.
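A minimal sketch of that plot, assuming the diabetes dataset and an ordinary least-squares model: out-of-fold predictions from cross_val_predict scattered against the true targets.

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)

# Each prediction comes from a model that never saw that sample during training.
predicted = cross_val_predict(LinearRegression(), X, y, cv=10)

fig, ax = plt.subplots()
ax.scatter(y, predicted, edgecolors=(0, 0, 0))
ax.plot([y.min(), y.max()], [y.min(), y.max()], "k--", lw=2)  # perfect-prediction line
ax.set_xlabel("Measured")
ax.set_ylabel("Predicted")
plt.show()
```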

Isotonic Regression

An illustration of isotonic regression on generated data. Isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
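A short sketch of fitting IsotonicRegression to noisy, roughly increasing data and comparing it with an ordinary linear fit; the data-generating process below is an arbitrary choice.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
n = 100
x = np.arange(n)
y = rng.randint(-50, 50, size=n) + 50.0 * np.log1p(np.arange(n))

# Non-decreasing least-squares approximation of y as a function of x.
iso = IsotonicRegression()
y_iso = iso.fit_transform(x, y)

# Unconstrained linear fit for comparison (needs a 2D input).
lin = LinearRegression()
lin.fit(x[:, np.newaxis], y)

print("isotonic fit is monotone:", bool(np.all(np.diff(y_iso) >= 0)))
```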

Feature Union with Heterogeneous Data Sources

Datasets can often contain components that require different feature extraction and processing pipelines. This scenario might occur when the dataset consists of heterogeneous data types (e.g. raster images and text captions), or when the data are stored in a pandas DataFrame and different columns require different processing pipelines.
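A condensed sketch of the pattern, not the example's own code: a small selector transformer routes each field of a record to its own feature extractor, and FeatureUnion concatenates the results. The FieldSelector class, field names, and extractors are invented for illustration; the real example builds a richer set of transformers.

```python
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.feature_extraction import DictVectorizer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import FeatureUnion, Pipeline


class FieldSelector(BaseEstimator, TransformerMixin):
    """Pick one field out of a sequence of dict-like records."""

    def __init__(self, key):
        self.key = key

    def fit(self, X, y=None):
        return self

    def transform(self, X):
        return [x[self.key] for x in X]


records = [
    {"text": "free money now", "meta": {"length": 3}},
    {"text": "meeting moved to friday", "meta": {"length": 4}},
]

# Each branch gets its own extraction pipeline; the outputs are concatenated.
union = FeatureUnion([
    ("text", Pipeline([("select", FieldSelector("text")),
                       ("tfidf", TfidfVectorizer())])),
    ("meta", Pipeline([("select", FieldSelector("meta")),
                       ("dict", DictVectorizer())])),
])

features = union.fit_transform(records)
print("combined feature matrix:", features.shape)
```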

Explicit feature map approximation for RBF kernels

An example illustrating the approximation of the feature map of an RBF kernel. It shows how to use RBFSampler and Nystroem to approximate the feature map of an RBF kernel for classification with an SVM on the digits dataset.
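A minimal sketch of the technique: map the data through an approximate RBF feature map (RBFSampler or Nystroem) and train a linear classifier on the transformed features. The gamma, n_components, and classifier settings below are illustrative, not the example's values.

```python
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import Nystroem, RBFSampler
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X /= 16.0  # scale pixel values to [0, 1]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

samplers = [
    ("RBFSampler", RBFSampler(gamma=0.2, n_components=300, random_state=0)),
    ("Nystroem", Nystroem(gamma=0.2, n_components=300, random_state=0)),
]

# A linear model on the approximate feature map mimics a kernelized SVM.
for name, sampler in samplers:
    clf = make_pipeline(sampler, LinearSVC(dual=False))
    clf.fit(X_train, y_train)
    print(name, "test accuracy:", clf.score(X_test, y_test))
```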

Multilabel classification

This example simulates a multi-label document classification problem. The dataset is generated randomly based on the following process: pick the number of labels for a sample, draw that many classes, pick the document length, and draw each word from the word distributions of the chosen classes.
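A compact sketch of a similar multilabel setup, using scikit-learn's synthetic generator rather than the example's document-style process: each sample can carry several labels at once, and a one-vs-rest linear SVM is fitted on the indicator matrix. Generator parameters are arbitrary.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

# Y is a binary indicator matrix: one column per label, possibly several
# active labels per sample.
X, Y = make_multilabel_classification(
    n_samples=200, n_classes=3, n_labels=2, random_state=0
)

# One binary SVM per label, trained independently.
clf = OneVsRestClassifier(SVC(kernel="linear"))
clf.fit(X, Y)
print("per-label predictions:\n", clf.predict(X[:5]))
```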
