This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
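A minimal sketch of such a pipeline, assuming PCA as the reduction step and a linear-kernel SVC; the digits data and parameter grid are illustrative choices, not necessarily the example's exact setup:

from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

pipe = Pipeline([
    ("reduce_dim", PCA()),               # dimensionality reduction step
    ("classify", SVC(kernel="linear")),  # support vector classifier
])

# Tune the number of retained components through the pipeline.
grid = GridSearchCV(pipe, param_grid={"reduce_dim__n_components": [8, 16, 32]}, cv=3)
grid.fit(X, y)
print(grid.best_params_)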
This example employs several unsupervised learning techniques to extract the stock market structure from variations in historical quotes. The quantity used is the daily variation in quote price: quotes that are linked tend to cofluctuate during a day.
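One plausible reconstruction of the approach, with synthetic daily variations standing in for real historical quotes and all model choices illustrative: estimate the dependency structure with a sparse inverse covariance model, then cluster symbols with affinity propagation.

import numpy as np
from sklearn import cluster, covariance

rng = np.random.RandomState(0)
# Synthetic "daily variations" for 20 symbols over 250 days, built
# from 3 shared latent factors so that a cluster structure exists.
factors = rng.randn(3, 250)
variations = rng.randn(20, 3) @ factors + 0.5 * rng.randn(20, 250)

# Learn the graph structure from the standardized series...
X = variations.T / variations.std(axis=1)
edge_model = covariance.GraphicalLassoCV().fit(X)

# ...and group symbols using the estimated covariance as affinity.
_, labels = cluster.affinity_propagation(edge_model.covariance_, random_state=0)
print(labels)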
When performing classification one often wants to predict not only the class label, but also the associated probability. This probability gives some kind of confidence on the prediction.
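A minimal sketch of probability calibration, assuming CalibratedClassifierCV wrapped around a naive Bayes base model; the dataset and calibration method are illustrative:

from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

raw = GaussianNB().fit(X_train, y_train)
calibrated = CalibratedClassifierCV(GaussianNB(), method="sigmoid", cv=3).fit(X_train, y_train)

# Calibrated probabilities should better reflect actual confidence.
print(raw.predict_proba(X_test[:3]))
print(calibrated.predict_proba(X_test[:3]))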
Both kernel ridge regression (KRR) and SVR learn a non-linear function by employing the kernel trick, i.e., they learn a linear function in the space induced by the respective kernel, which corresponds to a non-linear function in the original space.
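A minimal sketch of the comparison, with both models sharing an RBF kernel on a noisy sine target; the hyperparameter values are illustrative:

import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = 5 * rng.rand(100, 1)
y = np.sin(X).ravel() + 0.1 * rng.randn(100)

# Same kernel, different loss: squared error (KRR) vs. epsilon-insensitive (SVR).
krr = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X, y)
svr = SVR(kernel="rbf", C=1.0, gamma=0.1).fit(X, y)

X_plot = np.linspace(0, 5, 5)[:, None]
print(krr.predict(X_plot))
print(svr.predict(X_plot))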
Datasets can often contain components that require different feature extraction and processing pipelines. This scenario might occur when your dataset consists of heterogeneous data types (e.g. raster images and text captions), or when it is stored in a Pandas DataFrame and different columns require different processing pipelines.
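A sketch of the idea using ColumnTransformer to route each column to its own pipeline; the columns, data, and models here are made up for illustration:

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "text": ["good product", "bad service", "great value", "poor quality"],
    "price": [10.0, 25.0, 8.0, 30.0],
})
y = [1, 0, 1, 0]

preprocess = ColumnTransformer([
    ("text", TfidfVectorizer(), "text"),   # text column -> TF-IDF features
    ("num", StandardScaler(), ["price"]),  # numeric column -> standardization
])
model = Pipeline([("prep", preprocess), ("clf", LogisticRegression())]).fit(df, y)
print(model.predict(df))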
This example illustrates the predicted probability of GPC for an isotropic and anisotropic RBF kernel on a two-dimensional version of the iris dataset.
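A minimal sketch: fit GaussianProcessClassifier twice on two iris features, once with an isotropic and once with an anisotropic RBF kernel; the initial length-scales are illustrative:

from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # two-dimensional version of the dataset

iso = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
# The anisotropic kernel learns a separate length-scale per feature.
aniso = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=[1.0, 1.0])).fit(X, y)

print(iso.kernel_)
print(aniso.kernel_)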
Compare randomized search and grid search for optimizing hyperparameters of a random forest. All parameters that influence the learning are searched simultaneously (except the number of estimators, which poses a time / quality tradeoff).
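A minimal sketch of the comparison; the parameter ranges, search budget, and dataset are illustrative assumptions:

from scipy.stats import randint
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = load_digits(return_X_y=True)
clf = RandomForestClassifier(n_estimators=20, random_state=0)

# Randomized search samples a fixed number of candidates from distributions...
random_search = RandomizedSearchCV(
    clf,
    param_distributions={"max_depth": randint(3, 15), "max_features": randint(1, 11)},
    n_iter=10, cv=3, random_state=0,
).fit(X, y)

# ...while grid search exhaustively evaluates every combination.
grid_search = GridSearchCV(
    clf,
    param_grid={"max_depth": [3, 8, 14], "max_features": [1, 5, 10]},
    cv=3,
).fit(X, y)

print(random_search.best_params_)
print(grid_search.best_params_)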
Sample usage of Nearest Centroid classification. It will plot the decision boundaries for each class.
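A minimal sketch of such usage, restricted to two iris features so the boundaries could be plotted; the shrink_threshold value is an illustrative extra:

from sklearn.datasets import load_iris
from sklearn.neighbors import NearestCentroid

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # keep two features so the decision boundaries are 2D

for shrinkage in [None, 0.2]:
    clf = NearestCentroid(shrink_threshold=shrinkage).fit(X, y)
    print(shrinkage, clf.score(X, y))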
Demonstrates the effect of different metrics on the hierarchical clustering. The example is engineered to show the effect of the choice of different metrics; it is applied to waveforms, which can be seen as high-dimensional vectors.
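A minimal sketch of the comparison, with simple synthetic blobs standing in for the example's waveform data:

import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=60, centers=3, random_state=0)

# Note: scikit-learn versions before 1.2 name this parameter "affinity".
for metric in ["euclidean", "manhattan", "cosine"]:
    labels = AgglomerativeClustering(
        n_clusters=3, metric=metric, linkage="average"
    ).fit_predict(X)
    print(metric, np.bincount(labels))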
Toy example of 1D regression using linear, polynomial and RBF kernels.
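A minimal sketch of the toy regression; the hyperparameter values are illustrative:

import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - rng.rand(8))  # perturb every fifth target

for kernel in ["linear", "poly", "rbf"]:
    svr = SVR(kernel=kernel, C=100, gamma="auto", degree=3).fit(X, y)
    print(kernel, svr.score(X, y))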