The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
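A minimal sketch of this setup, assuming the digits dataset as a stand-in: PCA and logistic regression are chained in a Pipeline, and GridSearchCV tunes the number of PCA components via the `pca__n_components` parameter name.

```python
# Chain PCA (dimensionality reduction) and logistic regression (prediction),
# letting GridSearchCV pick the PCA dimensionality. Digits data is a stand-in.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)

pipe = Pipeline([("pca", PCA()),
                 ("logreg", LogisticRegression(max_iter=2000))])

# step-name__parameter addressing reaches into the pipeline
param_grid = {"pca__n_components": [10, 20, 30]}
search = GridSearchCV(pipe, param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

The grid values here are illustrative; in practice the candidate dimensionalities depend on the dataset.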
This example shows that imputing the missing values can give better results than discarding the samples containing any missing value. Imputing does not always improve the predictions, so please check via cross-validation.
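A hedged sketch of the comparison on synthetic data (the exact scores are illustrative only): one cross-validated score is computed after dropping every sample with a missing entry, and another after mean-imputing with SimpleImputer.

```python
# Compare discarding incomplete samples vs imputing them (synthetic data).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.RandomState(0)
X, y = make_regression(n_samples=300, n_features=10, random_state=0)
X_missing = X.copy()
X_missing[rng.rand(*X.shape) < 0.1] = np.nan   # knock out ~10% of entries

# Strategy 1: discard any sample containing a missing value
keep = ~np.isnan(X_missing).any(axis=1)
score_drop = cross_val_score(RandomForestRegressor(random_state=0),
                             X_missing[keep], y[keep], cv=3).mean()

# Strategy 2: impute missing entries, keep all samples
pipe = make_pipeline(SimpleImputer(strategy="mean"),
                     RandomForestRegressor(random_state=0))
score_impute = cross_val_score(pipe, X_missing, y, cv=3).mean()
print(score_drop, score_impute)
```

As the blurb warns, which strategy wins depends on the data, which is why both are cross-validated.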
This example shows how to use cross_val_predict to visualize prediction errors.
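A minimal sketch, assuming the diabetes dataset: cross_val_predict returns out-of-fold predictions for every sample, and the residuals against the true targets are what the full example scatter-plots.

```python
# Out-of-fold predictions with cross_val_predict; residuals are the
# prediction errors the full example visualizes.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

X, y = load_diabetes(return_X_y=True)
y_pred = cross_val_predict(LinearRegression(), X, y, cv=5)
residuals = y - y_pred            # one error per sample, none seen at fit time
print(y_pred.shape)
```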
This example shows the use of a multi-output estimator to complete images. The goal is to predict the lower half of a face given its upper half.
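A sketch of the multi-output mechanism on a small synthetic stand-in for the face task (the array shapes here are illustrative, not the real image sizes): the estimator is fit on a 2-D target, one column per "lower half" pixel, and predicts all of them jointly.

```python
# Multi-output regression: "upper half" features predict many "lower half"
# pixels at once. Synthetic stand-in for the face-completion data.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.RandomState(0)
X = rng.rand(200, 16)             # stand-in for upper-half pixels
Y = X @ rng.rand(16, 8)           # stand-in for lower-half pixels (8 outputs)

est = ExtraTreesRegressor(n_estimators=10, random_state=0)
est.fit(X, Y)                     # Y is 2-D: one column per output
pred = est.predict(X[:1])
print(pred.shape)                 # one row, 8 predicted outputs
```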
In many real-world examples, there are many ways to extract features from a dataset. Often it is beneficial to combine several methods to obtain good performance.
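A minimal sketch of combining extraction methods, assuming the iris dataset: FeatureUnion concatenates PCA components with univariately selected features before a single classifier.

```python
# Concatenate two feature extraction methods with FeatureUnion.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.pipeline import FeatureUnion, Pipeline
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

combined = FeatureUnion([("pca", PCA(n_components=2)),
                         ("kbest", SelectKBest(k=1))])
pipe = Pipeline([("features", combined), ("svm", SVC())])
pipe.fit(X, y)

# 2 PCA components + 1 selected feature = 3 columns
print(pipe.named_steps["features"].transform(X).shape)
```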
An example illustrating the approximation of the feature map of an RBF kernel. It shows how to use RBFSampler and Nystroem to approximate the feature map of an RBF kernel for classification with an SVM on the digits dataset.
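A hedged sketch (training scores only, hyperparameters illustrative): each approximation maps the data into an explicit feature space where a linear SVM stands in for an exact RBF-kernel SVM.

```python
# Approximate an RBF kernel feature map with RBFSampler (random Fourier
# features) and Nystroem, then fit a linear SVM on the mapped features.
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import Nystroem, RBFSampler
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

X, y = load_digits(return_X_y=True)
X = X / 16.0                      # scale pixel values to [0, 1]

scores = {}
for sampler in (RBFSampler(gamma=0.2, n_components=300, random_state=0),
                Nystroem(gamma=0.2, n_components=300, random_state=0)):
    clf = make_pipeline(sampler, LinearSVC(max_iter=5000))
    clf.fit(X, y)
    scores[type(sampler).__name__] = clf.score(X, y)
print(scores)
```

More components give a better approximation of the exact kernel at higher cost, which is the trade-off the full example plots.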
An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
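A minimal sketch on noisy generated data: the fitted values form a non-decreasing step function by construction.

```python
# Isotonic regression on a noisy increasing trend.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.RandomState(0)
x = np.arange(50, dtype=float)
y = 50.0 * np.log1p(x) + rng.randn(50) * 10.0   # noisy increasing signal

iso = IsotonicRegression()
y_fit = iso.fit_transform(x, y)

# monotonicity is guaranteed by the fit
print(bool(np.all(np.diff(y_fit) >= 0)))
```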
Both kernel ridge regression (KRR) and SVR learn a non-linear function by employing the kernel trick, i.e., they learn a linear function in the space induced by the respective kernel, which corresponds to a non-linear function in the original space.
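A hedged sketch of the comparison on a noisy sine curve (hyperparameters illustrative, training scores only): both models use the same RBF kernel, differing in their loss (squared error for KRR, epsilon-insensitive for SVR).

```python
# Fit KernelRidge and SVR with the same RBF kernel on a noisy sine curve.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = 5 * rng.rand(200, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(200)

krr = KernelRidge(kernel="rbf", alpha=0.1, gamma=0.5).fit(X, y)
svr = SVR(kernel="rbf", C=10.0, gamma=0.5).fit(X, y)
print(krr.score(X, y), svr.score(X, y))
```

KRR has a closed-form fit while SVR yields a sparse solution; the full example compares their fit and prediction times as well.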
Datasets can often contain components that require different feature extraction and processing pipelines. This scenario might occur when:
- your dataset consists of heterogeneous data types (e.g. raster images and text captions), or
- your dataset is stored in a pandas DataFrame and different columns require different processing pipelines.
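A hedged sketch of the second scenario (the DataFrame columns and toy labels are invented for illustration): a ColumnTransformer routes the text column through TF-IDF and the numeric column through scaling before a single classifier.

```python
# Route heterogeneous DataFrame columns to different pipelines with
# ColumnTransformer. Column names and data here are illustrative.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "text": ["cheap pills", "meeting at noon", "win money now", "lunch plans"],
    "length": [11.0, 15.0, 13.0, 11.0],
})
y = [1, 0, 1, 0]

pre = ColumnTransformer([
    ("tfidf", TfidfVectorizer(), "text"),     # 1-D text column -> TF-IDF
    ("scale", StandardScaler(), ["length"]),  # 2-D numeric column -> scaling
])
clf = make_pipeline(pre, LogisticRegression())
clf.fit(df, y)
print(clf.predict(df))
```

Note the selector shapes: a bare string passes the column as 1-D text to the vectorizer, while a list of names passes a 2-D frame to the scaler.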