Pipelining

The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
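A minimal sketch of such a pipeline, assuming the digits dataset; the component counts and regularization values in the grid are arbitrary illustrative choices:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("pca", PCA()),                                    # unsupervised reduction
    ("logistic", LogisticRegression(max_iter=1000)),   # supervised prediction
])

# Tune the PCA dimensionality and the regularization strength jointly;
# step parameters are addressed as "<step name>__<parameter>".
param_grid = {
    "pca__n_components": [5, 20, 40],
    "logistic__C": [0.1, 1.0, 10.0],
}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Because the reduction is fitted inside each cross-validation fold, the search avoids leaking test data into the PCA fit.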

Imputing missing values before building an estimator

This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, however, so it is worth checking via cross-validation.
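A small sketch of mean imputation on a toy array (the data is made up for illustration; current scikit-learn exposes this as `SimpleImputer`):

```python
import numpy as np
from sklearn.impute import SimpleImputer

# Toy data with one missing entry in the first column.
X = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [7.0, 6.0]])

# Replace each NaN with the mean of its column instead of dropping the row.
imputer = SimpleImputer(strategy="mean")
X_imputed = imputer.fit_transform(X)

print(X_imputed)  # the NaN becomes (1.0 + 7.0) / 2 = 4.0
```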

The Johnson-Lindenstrauss bound for embedding with random projections

The Johnson-Lindenstrauss lemma states that any high-dimensional dataset can be randomly projected into a lower-dimensional Euclidean space while controlling the distortion in the pairwise distances.
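A short sketch of the bound in practice, assuming scikit-learn's `johnson_lindenstrauss_min_dim` helper and a randomly generated dataset:

```python
import numpy as np
from sklearn.random_projection import (
    GaussianRandomProjection,
    johnson_lindenstrauss_min_dim,
)

# Minimum number of components the JL bound requires to keep pairwise
# distance distortion below eps=0.1 for 10,000 samples.
n_components = johnson_lindenstrauss_min_dim(n_samples=10000, eps=0.1)
print(n_components)

# Applying a random projection to high-dimensional data.
rng = np.random.RandomState(0)
X = rng.rand(50, 10000)
proj = GaussianRandomProjection(n_components=300, random_state=0)
X_new = proj.fit_transform(X)
print(X_new.shape)
```

Note the bound depends only on the number of samples and the tolerated distortion, not on the original dimensionality.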

Plotting Cross-Validated Predictions

This example shows how to use cross_val_predict to visualize prediction errors.
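A minimal sketch, assuming the diabetes dataset and ordinary least squares; each sample's prediction comes from a model fold that did not see it during fitting:

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)
model = LinearRegression()

# One out-of-fold prediction per sample, suitable for a y vs. y_pred
# scatter plot of the prediction errors.
y_pred = cross_val_predict(model, X, y, cv=5)
print(y_pred.shape)
```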

Face completion with multi-output estimators

This example shows the use of multi-output estimators to complete images. The goal is to predict the lower half of a face given its upper half.
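A toy stand-in for the faces task, assuming a made-up regression problem where the "upper half" features determine multiple "lower half" targets; tree ensembles support multi-output regression natively:

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.RandomState(0)
X = rng.rand(100, 8)                # hypothetical upper-half pixels
Y = np.hstack([X * 2.0, X + 1.0])   # hypothetical lower-half pixels, 16 targets

# A single fit predicts all target columns at once.
est = ExtraTreesRegressor(n_estimators=10, random_state=0)
est.fit(X, Y)
Y_pred = est.predict(X)
print(Y_pred.shape)
```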

Concatenating multiple feature extraction methods

In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance.
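A minimal sketch using `FeatureUnion` on the iris dataset, concatenating PCA components with univariately selected features (the component and feature counts are arbitrary):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion

X, y = load_iris(return_X_y=True)

# Each transformer runs on the full input; their outputs are concatenated
# column-wise into a single feature matrix.
union = FeatureUnion([
    ("pca", PCA(n_components=2)),
    ("kbest", SelectKBest(k=1)),
])
X_combined = union.fit_transform(X, y)
print(X_combined.shape)  # 2 PCA components + 1 selected feature = 3 columns
```

The union itself can be used as a step in a `Pipeline`, so the combined features feed directly into a downstream estimator.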

Comparison of kernel ridge regression and SVR

Both kernel ridge regression (KRR) and SVR learn a non-linear function by employing the kernel trick, i.e., they learn a linear function in the space induced by the respective kernel, which corresponds to a non-linear function in the original space. They differ in their loss functions: KRR uses squared error, while SVR uses an epsilon-insensitive loss.
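A side-by-side sketch on synthetic data (a noisy sine curve; the kernel parameters below are arbitrary, not tuned):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = 5 * rng.rand(200, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(200)

# Same RBF kernel, different losses: KRR minimizes squared error and has a
# closed-form solution; SVR's epsilon-insensitive loss yields a sparse model
# built on support vectors.
krr = KernelRidge(kernel="rbf", gamma=0.5, alpha=0.1).fit(X, y)
svr = SVR(kernel="rbf", gamma=0.5, C=1.0, epsilon=0.1).fit(X, y)

X_test = np.linspace(0, 5, 50).reshape(-1, 1)
y_krr = krr.predict(X_test)
y_svr = svr.predict(X_test)
print(y_krr.shape, y_svr.shape)
```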

Selecting dimensionality reduction with Pipeline and GridSearchCV

This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
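A sketch of treating the reduction step itself as a hyperparameter, assuming the digits dataset; the candidate reducers and the component count of 8 are illustrative choices:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import NMF, PCA
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("reduce_dim", PCA()),                       # placeholder, replaced by the grid
    ("classify", LinearSVC(max_iter=10000)),
])

# Compare unsupervised reduction (PCA, NMF) against univariate selection by
# swapping whole estimators into the "reduce_dim" step.
param_grid = [
    {"reduce_dim": [PCA(n_components=8), NMF(n_components=8)]},
    {"reduce_dim": [SelectKBest(chi2, k=8)]},
]
grid = GridSearchCV(pipe, param_grid, cv=3)
grid.fit(X, y)
print(grid.best_params_)
```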

Isotonic Regression

An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
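A minimal sketch on generated data, assuming a noisy increasing trend:

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.RandomState(0)
x = np.arange(50)
y = 0.5 * x + 5 * rng.randn(50)  # noisy increasing trend

# Fit the best non-decreasing approximation under squared error.
iso = IsotonicRegression()
y_fit = iso.fit_transform(x, y)

# The fitted values are non-decreasing by construction.
print(np.all(np.diff(y_fit) >= 0))
```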

Multilabel classification

This example simulates a multi-label document classification problem. The dataset is generated randomly based on the following process: pick the number of labels for a document, choose that many classes, pick a document length, and then draw each word from the chosen classes.
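A minimal sketch of a randomly generated multi-label problem, using scikit-learn's `make_multilabel_classification` generator and a one-vs-rest classifier (the sizes below are arbitrary):

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import LinearSVC

# Each sample can carry several labels; Y is a binary indicator matrix.
X, Y = make_multilabel_classification(
    n_samples=200, n_classes=3, random_state=0
)

# One-vs-rest trains an independent binary classifier per label column.
clf = OneVsRestClassifier(LinearSVC(max_iter=10000))
clf.fit(X, Y)
Y_pred = clf.predict(X)
print(Y_pred.shape)
```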
