The PCA performs unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
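A minimal sketch of the idea, assuming the digits dataset and an illustrative grid over the PCA's n_components (the specific values are not taken from the example itself):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

# Chain unsupervised dimensionality reduction with a supervised classifier.
pipe = Pipeline([("pca", PCA()), ("logistic", LogisticRegression(max_iter=1000))])

# Treat the number of retained PCA components as a hyperparameter.
param_grid = {"pca__n_components": [5, 15, 30, 45, 64]}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```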
This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, so please check via cross-validation.
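A minimal sketch of the comparison, assuming the diabetes dataset, randomly masked entries, and a random-forest regressor (all illustrative choices, not necessarily those of the example):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X, y = load_diabetes(return_X_y=True)

# Randomly mask 20% of the entries to simulate missing data.
X_missing = X.copy()
X_missing[rng.rand(*X.shape) < 0.2] = np.nan

# Option 1: discard every sample that contains a missing value.
complete_rows = ~np.isnan(X_missing).any(axis=1)
score_drop = cross_val_score(
    RandomForestRegressor(random_state=0),
    X_missing[complete_rows], y[complete_rows], cv=5,
).mean()

# Option 2: impute missing values (feature mean) inside a pipeline.
impute_model = make_pipeline(
    SimpleImputer(strategy="mean"), RandomForestRegressor(random_state=0)
)
score_impute = cross_val_score(impute_model, X_missing, y, cv=5).mean()

print(f"drop rows: {score_drop:.3f}, impute: {score_impute:.3f}")
```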
This example shows how to use cross_val_predict to visualize prediction errors.
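A minimal sketch, assuming the diabetes dataset and a plain linear regressor (both are illustrative choices):

```python
import matplotlib.pyplot as plt
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)

# Each prediction comes from a model that never saw that sample during fitting.
y_pred = cross_val_predict(LinearRegression(), X, y, cv=10)

# Plot predicted versus true values; points far from the diagonal are errors.
fig, ax = plt.subplots()
ax.scatter(y, y_pred, edgecolors=(0, 0, 0))
ax.plot([y.min(), y.max()], [y.min(), y.max()], "k--", lw=2)
ax.set_xlabel("Measured")
ax.set_ylabel("Predicted")
plt.show()
```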
In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance.
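A minimal sketch using FeatureUnion to concatenate PCA components with univariately selected features before a classifier (the concrete transformers and dataset are assumptions for illustration):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Concatenate features produced by two different extraction methods.
combined = FeatureUnion([("pca", PCA(n_components=2)), ("kbest", SelectKBest(k=1))])

model = Pipeline([("features", combined), ("svm", SVC(kernel="linear"))])
model.fit(X, y)
print(model.score(X, y))
```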
This example shows the use of a multi-output estimator to complete images. The goal is to predict the lower half of a face given its upper half.
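A minimal sketch of the setup, assuming the Olivetti faces dataset and an extra-trees regressor as the multi-output estimator (the example compares several estimators; this picks one for brevity):

```python
from sklearn.datasets import fetch_olivetti_faces
from sklearn.ensemble import ExtraTreesRegressor

data = fetch_olivetti_faces()
images = data.images                      # shape (400, 64, 64)
n_pixels = images.shape[1] * images.shape[2]
faces = images.reshape((len(images), n_pixels))

# The upper half of each face is the input, the lower half is the multi-output target.
X = faces[:, : n_pixels // 2]
Y = faces[:, n_pixels // 2:]

# Train on the first 300 faces, then complete the remaining ones.
est = ExtraTreesRegressor(n_estimators=10, max_features=32, random_state=0)
est.fit(X[:300], Y[:300])
Y_pred = est.predict(X[300:])
print(Y_pred.shape)  # one predicted lower half per test face
```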
Datasets can often contain components that require different feature extraction and processing pipelines. This scenario might occur when the dataset consists of heterogeneous data types (e.g. raster images and text captions), or when the data is stored in a pandas DataFrame and different columns require different processing pipelines.
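A minimal sketch of the idea, assuming a small pandas DataFrame with a text column and a numeric column that get separate processing; ColumnTransformer is one way to express this, and the toy data here is purely illustrative:

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "text": ["cheap meds now", "meeting at noon", "win a prize", "lunch tomorrow"],
    "length": [14, 15, 11, 14],
})
y = [1, 0, 1, 0]

# Route each column to its own feature extraction pipeline.
preprocess = ColumnTransformer([
    ("tfidf", TfidfVectorizer(), "text"),     # text -> sparse tf-idf features
    ("scale", StandardScaler(), ["length"]),  # numeric -> standardized column
])

model = Pipeline([("features", preprocess), ("clf", LogisticRegression())])
model.fit(df, y)
print(model.predict(df))
```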
This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
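A minimal sketch, assuming a grid that compares PCA against univariate feature selection as the reduction step (the concrete grid values are illustrative):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("reduce_dim", PCA()),
    ("classify", LinearSVC(dual=False, max_iter=10000)),
])

# The reduction step itself is a hyperparameter: swap in different transformers.
param_grid = [
    {"reduce_dim": [PCA()], "reduce_dim__n_components": [2, 4, 8]},
    {"reduce_dim": [SelectKBest(chi2)], "reduce_dim__k": [2, 4, 8]},
]
grid = GridSearchCV(pipe, param_grid, cv=3)
grid.fit(X, y)
print(grid.best_params_)
```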
An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
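A minimal sketch on noisy synthetic data (the data-generating process here is an assumption, chosen to be roughly increasing):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
n = 100
x = np.arange(n)
# Noisy, roughly increasing target.
y = rng.randint(-50, 50, size=n) + 50.0 * np.log1p(x)

# Fit a non-decreasing step function that minimizes squared error.
ir = IsotonicRegression(out_of_bounds="clip")
y_iso = ir.fit_transform(x, y)

# A straight line fit for comparison.
lr = LinearRegression().fit(x.reshape(-1, 1), y)
print(y_iso[:5], lr.coef_)
```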
An example illustrating the approximation of the feature map of an RBF kernel. It shows how to use RBFSampler and Nystroem to approximate the feature map of an RBF kernel for classification with an SVM on the digits dataset.
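A minimal sketch, assuming the digits dataset and a linear SVM trained on the approximate feature maps (the gamma value and number of components are illustrative):

```python
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import Nystroem, RBFSampler
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]

# Explicitly map the data into an approximate RBF feature space, then use a linear SVM.
samplers = [
    ("RBFSampler", RBFSampler(gamma=0.2, n_components=300, random_state=0)),
    ("Nystroem", Nystroem(gamma=0.2, n_components=300, random_state=0)),
]
for name, sampler in samplers:
    model = make_pipeline(sampler, LinearSVC(dual=False))
    model.fit(X[:1000], y[:1000])
    print(name, model.score(X[1000:], y[1000:]))
```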