A 1D regression with a decision tree: the tree is used to fit a sine curve from noisy observations.
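A minimal sketch of such a 1D decision-tree regression, assuming a synthetic noisy sine curve (the dataset, depths, and noise level here are illustrative choices, not necessarily those of the original example):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Synthetic 1D dataset: a sine curve with noise on every 5th point.
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(80, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 0.5 * (0.5 - rng.rand(16))

# Two trees of different depths: the deeper tree fits finer detail
# (and, eventually, the noise).
shallow = DecisionTreeRegressor(max_depth=2).fit(X, y)
deep = DecisionTreeRegressor(max_depth=5).fit(X, y)

X_test = np.arange(0.0, 5.0, 0.01)[:, np.newaxis]
y_shallow = shallow.predict(X_test)
y_deep = deep.predict(X_test)
```

A tree of depth 2 can emit at most four distinct constant predictions, which is why the shallow fit looks like a coarse staircase over the sine curve.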
Computes a Bayesian Ridge Regression on a synthetic dataset.
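A short sketch of Bayesian Ridge on synthetic data; the sparse ground-truth weights and noise level are assumptions for illustration. One useful property shown here is that `predict` can also return a per-sample standard deviation:

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.RandomState(0)
n_samples, n_features = 100, 10
X = rng.randn(n_samples, n_features)

# Ground-truth weights: only the first three features are informative.
w = np.zeros(n_features)
w[:3] = [1.0, -2.0, 0.5]
y = X @ w + 0.1 * rng.randn(n_samples)

reg = BayesianRidge().fit(X, y)

# Posterior mean of the weights is in coef_; predict can also
# return the predictive standard deviation.
y_mean, y_std = reg.predict(X, return_std=True)
```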
When performing classification one often wants to predict not only the class label but also the associated probability, which gives some measure of confidence in the prediction.
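As a hedged sketch of this idea: some estimators (such as `LinearSVC`) expose no `predict_proba` at all, and wrapping them in `CalibratedClassifierCV` is one way to obtain class probabilities. The dataset and estimator choice here are assumptions for illustration:

```python
import numpy as np
from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=500, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# LinearSVC has no predict_proba; the calibration wrapper
# provides probabilities alongside the hard labels.
clf = CalibratedClassifierCV(LinearSVC())
clf.fit(X_tr, y_tr)
proba = clf.predict_proba(X_te)
```

Each row of `proba` sums to one, so the columns can be read directly as per-class confidences.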
Toy example of 1D regression using linear, polynomial and RBF kernels.
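A minimal sketch of the three-kernel comparison with `SVR`, assuming a noisy sine curve as the toy dataset (the `C`, `gamma`, and noise values are illustrative):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - rng.rand(8))  # corrupt every 5th target

# One SVR per kernel; fit each and collect its in-sample predictions.
models = [
    ("rbf", SVR(kernel="rbf", C=100, gamma=0.1)),
    ("linear", SVR(kernel="linear", C=100)),
    ("poly", SVR(kernel="poly", C=100, degree=3)),
]
fits = {name: svr.fit(X, y).predict(X) for name, svr in models}
```

On a non-monotone target like a sine curve the RBF kernel tracks the underlying function far better than a straight line can.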
For greyscale image data where pixel values can be interpreted as degrees of blackness on a white background, like handwritten digits, the Bernoulli Restricted Boltzmann machine model can perform effective non-linear feature extraction.
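A sketch of `BernoulliRBM` as a feature extractor on the digits dataset; the number of components, learning rate, and iteration count are illustrative assumptions:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import minmax_scale

X, _ = load_digits(return_X_y=True)
X = minmax_scale(X)  # RBM expects inputs in [0, 1] (degrees of blackness)

rbm = BernoulliRBM(n_components=64, learning_rate=0.05,
                   n_iter=10, random_state=0)

# fit_transform returns the hidden-unit activation probabilities,
# which serve as the extracted non-linear features.
features = rbm.fit_transform(X)
```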
An example illustrating the approximation of the feature map of an RBF kernel. It shows how to use an approximate feature map so that a linear classifier can stand in for an exact kernel method.
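A hedged sketch using `RBFSampler` (a Monte Carlo approximation of the RBF kernel feature map) followed by a linear `SGDClassifier` on the digits dataset; the `gamma`, component count, and train/test split are illustrative choices:

```python
from sklearn.datasets import load_digits
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import SGDClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0  # scale pixel values to [0, 1]

# Approximate RBF feature map + linear classifier: the transformed
# features let a linear model mimic a kernelized SVM.
clf = make_pipeline(
    RBFSampler(gamma=0.2, n_components=300, random_state=0),
    SGDClassifier(max_iter=1000, random_state=0),
)
clf.fit(X[:1000], y[:1000])
score = clf.score(X[1000:], y[1000:])
```

With more random components the approximation (and typically the accuracy) improves, at the cost of a wider feature matrix.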
This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
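A minimal sketch of such a pipeline; PCA is used here as one plausible reduction step (the original example may compare several), and the component count is an illustrative assumption:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Reduction step first, classifier second; cross-validation then
# scores the whole chain as a single estimator.
pipe = Pipeline([
    ("reduce_dim", PCA(n_components=30)),
    ("clf", SVC(gamma="scale")),
])
scores = cross_val_score(pipe, X, y, cv=5)
```

Because the two steps live in one `Pipeline`, a grid search can tune the reduction and the classifier jointly.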
This example illustrates the effect of the parameters gamma and C of the Radial Basis Function (RBF) kernel SVM. Intuitively, the gamma parameter defines how far the influence of a single training example reaches, with low values meaning "far" and high values meaning "close".
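The standard way to explore this interplay is a cross-validated grid over `C` and `gamma`. A small sketch on the iris dataset, with a much coarser grid than a real study would use:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Coarse illustrative grid; a real search would be finer and wider.
param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.001, 0.01, 0.1, 1]}
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
best = search.best_params_
```

High `gamma` with high `C` tends toward overfitting (each point only influences its close neighbourhood, and margin violations are heavily penalized), so the best cell usually sits somewhere in the middle of the grid.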
The following example illustrates the effect of scaling the regularization parameter when using Support Vector Machines for classification.
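One way to see this effect is to score a linear SVM over a range of `C` values; the dataset, grid, and `max_iter` below are illustrative assumptions, not the original example's setup:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Cross-validated accuracy as a function of C; the point of the
# example is how the best C shifts with the number of samples.
Cs = np.logspace(-2, 2, 5)
scores = [
    cross_val_score(LinearSVC(C=C, max_iter=5000), X, y, cv=3).mean()
    for C in Cs
]
```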
An illustration of isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
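A minimal sketch with `IsotonicRegression` on a noisy increasing trend (the generated data here is an illustrative assumption):

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.RandomState(0)
n = 50
x = np.arange(n)
y = rng.randint(-20, 20, size=n) + 10 * np.log1p(x)  # noisy rising trend

# The fitted values are the closest non-decreasing sequence to y
# in the least-squares sense.
ir = IsotonicRegression()
y_fit = ir.fit_transform(x, y)
```

The fit is piecewise constant: wherever the raw data dips, isotonic regression pools adjacent points into a flat segment rather than ever decreasing.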