The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
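A minimal sketch of that setup (the exact dataset and grid here are illustrative, not necessarily those of the example): a Pipeline chains PCA into LogisticRegression, and GridSearchCV searches over the number of PCA components.

```python
# Sketch: unsupervised PCA step + supervised logistic regression step,
# with GridSearchCV choosing the PCA dimensionality.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
pipe = Pipeline([("pca", PCA()),
                 ("logistic", LogisticRegression(max_iter=1000))])
# Step parameters are addressed as <step_name>__<param_name>.
param_grid = {"pca__n_components": [8, 16, 32]}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
best_n = search.best_params_["pca__n_components"]
```

The `pca__n_components` syntax is how Pipeline exposes a step's parameters to the grid search.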
This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, so please check via cross-validation.
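A short sketch of the idea, assuming a `SimpleImputer` in front of a random forest (the example's exact estimator and dataset may differ): entries are knocked out at random, imputed with the feature mean, and the pipeline is scored by cross-validation.

```python
# Sketch: score a model on data with artificially removed entries,
# filling the gaps with mean imputation instead of dropping samples.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X, y = load_diabetes(return_X_y=True)
X_missing = X.copy()
X_missing[rng.rand(*X.shape) < 0.1] = np.nan   # remove ~10% of entries

model = make_pipeline(SimpleImputer(strategy="mean"),
                      RandomForestRegressor(n_estimators=50, random_state=0))
score_imputed = cross_val_score(model, X_missing, y, cv=3).mean()
```

Comparing `score_imputed` against the score obtained after dropping all incomplete rows is the cross-validated check the text recommends.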
This example shows how to use cross_val_predict to visualize prediction errors.
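A brief sketch of the mechanism: `cross_val_predict` returns one out-of-fold prediction per sample, which can then be scattered against the true targets to visualize the errors (the dataset here is illustrative).

```python
# Sketch: obtain cross-validated predictions for every sample.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)
predicted = cross_val_predict(LinearRegression(), X, y, cv=5)
# Plotting y against predicted (e.g. with matplotlib) shows the errors:
#   plt.scatter(y, predicted)
#   plt.plot([y.min(), y.max()], [y.min(), y.max()], "k--")
```

Unlike `cross_val_score`, which returns fold scores, this returns per-sample predictions, each made by a model that never saw that sample.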
This example shows the use of a multi-output estimator to complete images. The goal is to predict the lower half of a face given its upper half.
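A toy sketch of the same multi-output idea on a smaller dataset (8x8 digit images rather than the faces the example uses): the upper-half pixels are the inputs, and the lower-half pixels are a vector of outputs predicted jointly.

```python
# Sketch: predict the bottom 4 rows of each 8x8 digit image from the
# top 4 rows, using a natively multi-output regressor.
from sklearn.datasets import load_digits
from sklearn.ensemble import ExtraTreesRegressor

X, _ = load_digits(return_X_y=True)
upper, lower = X[:, :32], X[:, 32:]      # top / bottom halves (32 pixels each)
est = ExtraTreesRegressor(n_estimators=10, random_state=0)
est.fit(upper[:-50], lower[:-50])        # train on all but the last 50 images
completed = est.predict(upper[-50:])     # 32 predicted pixels per test image
```

Tree ensembles handle multi-output targets directly; estimators that do not can be wrapped in `sklearn.multioutput.MultiOutputRegressor`.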
In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance.
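A minimal sketch, assuming a `FeatureUnion` that concatenates PCA components with univariately selected features before a classifier (the specific feature extractors are illustrative):

```python
# Sketch: combine two feature-extraction methods side by side.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
combined = FeatureUnion([("pca", PCA(n_components=2)),
                         ("kbest", SelectKBest(k=1))])
pipe = Pipeline([("features", combined), ("svm", SVC())])
pipe.fit(X, y)
# The union concatenates outputs: 2 PCA components + 1 selected feature.
n_features_out = combined.transform(X).shape[1]
```

Each transformer in the union sees the full input; their outputs are concatenated column-wise.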
Both kernel ridge regression (KRR) and SVR learn a non-linear function by employing the kernel trick, i.e., they learn a linear function in the space induced by the respective kernel, which corresponds to a non-linear function in the original space.
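A brief sketch of the comparison (the data and hyperparameters here are illustrative): both models fit the same noisy sine curve with an RBF kernel, differing only in their loss functions, squared error for KRR versus epsilon-insensitive loss for SVR.

```python
# Sketch: KernelRidge and SVR fitting the same non-linear target
# with the same RBF kernel.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = 5 * rng.rand(200, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(200)   # noisy sine curve

krr = KernelRidge(kernel="rbf", gamma=0.5, alpha=0.1).fit(X, y)
svr = SVR(kernel="rbf", gamma=0.5, C=10, epsilon=0.1).fit(X, y)
krr_score = krr.score(X, y)
svr_score = svr.score(X, y)
```

KRR has a closed-form solution, while SVR produces a sparse model depending only on its support vectors; that trade-off, not accuracy alone, is the usual reason to prefer one over the other.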
This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
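A minimal sketch of one way to build this, assuming the grid search also compares alternative reduction steps (swapping whole estimators into a pipeline slot is supported by GridSearchCV):

```python
# Sketch: dimensionality reduction + linear SVC, with the reduction
# step itself treated as a searchable hyperparameter.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
pipe = Pipeline([("reduce", PCA()), ("clf", LinearSVC(dual=False))])
# Passing estimator instances under the step name swaps the whole step.
param_grid = [{"reduce": [PCA(n_components=16)]},
              {"reduce": [SelectKBest(k=16)]}]
grid = GridSearchCV(pipe, param_grid, cv=3).fit(X, y)
best_reducer = type(grid.best_estimator_.named_steps["reduce"]).__name__
```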
An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
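A short sketch of the fit (the generated data here is illustrative): `IsotonicRegression` turns a noisy increasing trend into a non-decreasing step function.

```python
# Sketch: fit a monotone (non-decreasing) approximation to noisy data.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.RandomState(0)
x = np.arange(50)
y = 2 * x + 10 * rng.randn(50)           # increasing trend + noise
y_fit = IsotonicRegression().fit_transform(x, y)
# The fitted values never decrease, by construction.
is_monotone = bool(np.all(np.diff(y_fit) >= 0))
```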
This example simulates a multi-label document classification problem. The dataset is generated randomly based on the following process: pick the number of labels n ~ Poisson(n_labels); n times, choose a class c; pick the document length k ~ Poisson(length); k times, choose a word from the distribution of class c.
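A minimal sketch of the classification side, assuming the generative process is mimicked by `make_multilabel_classification` and the multi-label problem is decomposed with a one-vs-rest wrapper:

```python
# Sketch: random multi-label data classified with one binary
# linear SVC per label (one-vs-rest).
from sklearn.datasets import make_multilabel_classification
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, Y = make_multilabel_classification(n_samples=100, n_classes=3,
                                      random_state=0)
clf = OneVsRestClassifier(SVC(kernel="linear")).fit(X, Y)
Y_pred = clf.predict(X)   # one binary indicator column per label
```

Here `Y` is a binary indicator matrix, one column per label, and `OneVsRestClassifier` fits an independent SVC for each column.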