The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
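A minimal sketch of that idea (not the full example's code), assuming the digits dataset and illustrative component counts:

```python
# Chain PCA and LogisticRegression in a Pipeline and let GridSearchCV
# pick the number of PCA components (values below are illustrative).
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
pipe = Pipeline([("pca", PCA()), ("logistic", LogisticRegression(max_iter=1000))])
param_grid = {"pca__n_components": [5, 20, 40]}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_)
```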
This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, so please check via cross-validation.
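A minimal sketch of the imputation side of that comparison, using SimpleImputer on synthetic data (the dataset and hyperparameters here are assumptions, not the example's own):

```python
# Impute missing values with the column mean inside a pipeline instead of
# dropping incomplete samples, then score the pipeline by cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X = rng.rand(100, 5)
y = X @ rng.rand(5)
X[rng.rand(*X.shape) < 0.2] = np.nan  # knock out roughly 20% of the entries

model = make_pipeline(SimpleImputer(strategy="mean"),
                      RandomForestRegressor(n_estimators=50, random_state=0))
print(cross_val_score(model, X, y, cv=5).mean())
```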
This example shows the use of a multi-output estimator to complete images. The goal is to predict the lower half of a face given its upper half.
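A simplified sketch of that setup (the full example compares several multi-output estimators; only one is shown here, and fetching the faces requires a download):

```python
# Multi-output regression: predict the lower half of each Olivetti face
# (many target columns at once) from its flattened upper half.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.ensemble import ExtraTreesRegressor

faces = fetch_olivetti_faces()  # downloads the dataset on first use
X = faces.data                  # each row is a flattened 64x64 face
n_pixels = X.shape[1]
X_upper, X_lower = X[:, : n_pixels // 2], X[:, n_pixels // 2 :]

est = ExtraTreesRegressor(n_estimators=10, random_state=0)
est.fit(X_upper[:-10], X_lower[:-10])   # y has one column per lower-half pixel
completed = est.predict(X_upper[-10:])  # predicted lower halves for 10 faces
print(completed.shape)
```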
In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance.
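A minimal sketch of combining feature extraction methods with FeatureUnion (the dataset and component counts are illustrative):

```python
# Concatenate PCA components with univariately selected features, then feed
# the combined feature matrix to an SVM inside a single Pipeline.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
combined = FeatureUnion([("pca", PCA(n_components=2)),
                         ("kbest", SelectKBest(k=1))])
model = Pipeline([("features", combined), ("svm", SVC(kernel="linear"))])
model.fit(X, y)
print(model.score(X, y))
```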
This example shows how to use cross_val_predict to visualize prediction errors.
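A minimal sketch of that usage (the example plots predicted against true values; the dataset chosen here is an assumption):

```python
# Obtain out-of-fold predictions with cross_val_predict so they can be
# compared against the true targets.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)
predicted = cross_val_predict(LinearRegression(), X, y, cv=10)
print(predicted[:5], y[:5])
```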
Both kernel ridge regression (KRR) and SVR learn a non-linear function by employing the kernel trick, i.e., they learn a linear function in the space induced by the respective kernel, which corresponds to a non-linear function in the original space.
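A minimal sketch of fitting both models with the same RBF kernel (the data and hyperparameters are illustrative; the full example also grid-searches the hyperparameters and compares fit/predict times):

```python
# Fit KernelRidge and SVR on the same noisy 1-D regression problem.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = 5 * rng.rand(200, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(200)

krr = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X, y)
svr = SVR(kernel="rbf", C=10, gamma=0.5).fit(X, y)
print(krr.score(X, y), svr.score(X, y))
```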
An example illustrating the approximation of the feature map of an RBF kernel. It shows how to use RBFSampler and Nystroem to approximate the feature map of an RBF kernel for classification with an SVM.
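A minimal sketch of the approximation idea with RBFSampler (Nystroem is used analogously); the scaling and parameter values here are assumptions, not the example's exact settings:

```python
# Approximate the RBF kernel feature map with random Fourier features so a
# linear SVM can be trained in the transformed space.
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import RBFSampler
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]
approx = make_pipeline(RBFSampler(gamma=0.2, n_components=300, random_state=1),
                       LinearSVC(max_iter=10000))
approx.fit(X, y)
print(approx.score(X, y))
```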
This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
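A minimal sketch of grid-searching over the reduction step itself (the candidate reducers and dimensionalities below are illustrative):

```python
# A Pipeline whose "reduce_dim" step is itself a grid-search parameter,
# comparing PCA against univariate feature selection before a linear SVC.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
pipe = Pipeline([("reduce_dim", PCA()), ("classify", LinearSVC(max_iter=10000))])
param_grid = [
    {"reduce_dim": [PCA()], "reduce_dim__n_components": [8, 16, 32]},
    {"reduce_dim": [SelectKBest(chi2)], "reduce_dim__k": [8, 16, 32]},
]
grid = GridSearchCV(pipe, param_grid=param_grid, cv=3)
grid.fit(X, y)
print(grid.best_params_)
```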
An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
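A minimal sketch of that fit on generated data (the noise model is an assumption; the full example plots the isotonic fit against a linear fit):

```python
# Fit a non-decreasing (isotonic) approximation to noisy increasing data and
# compare it with an ordinary linear regression fit.
import numpy as np
from sklearn.isotonic import IsotonicRegression
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(0)
n = 100
x = np.arange(n)
y = rng.randint(-50, 50, size=n) + 50.0 * np.log1p(x)

iso = IsotonicRegression().fit(x, y)
lin = LinearRegression().fit(x[:, np.newaxis], y)  # LinearRegression needs 2-D X
print(iso.predict(x)[:5], lin.predict(x[:, np.newaxis])[:5])
```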