Computes a Bayesian Ridge Regression on a synthetic dataset.
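A minimal sketch of fitting BayesianRidge on synthetic data; the feature count, weights, and noise level below are illustrative assumptions rather than the original example's values:

    import numpy as np
    from sklearn.linear_model import BayesianRidge

    rng = np.random.RandomState(0)
    X = rng.randn(100, 10)                    # synthetic design matrix
    true_w = np.zeros(10)
    true_w[:3] = [1.5, -2.0, 0.5]             # only a few informative weights
    y = X @ true_w + 0.1 * rng.randn(100)     # targets with Gaussian noise

    reg = BayesianRidge()
    reg.fit(X, y)
    print(reg.coef_)                          # posterior mean of the weights
    print(reg.alpha_, reg.lambda_)            # estimated noise / weight precisions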
Lasso and elastic net (L1 and L2 penalisation) implemented using coordinate descent. The coefficients can be forced to be positive.
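A short illustrative sketch (synthetic data, arbitrary regularisation values) of Lasso and ElasticNet, both solved by coordinate descent, with and without the positivity constraint:

    import numpy as np
    from sklearn.linear_model import Lasso, ElasticNet

    rng = np.random.RandomState(42)
    X = rng.randn(200, 20)
    y = X[:, 0] - 2 * X[:, 1] + 0.1 * rng.randn(200)

    lasso = Lasso(alpha=0.1).fit(X, y)
    enet = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)

    # force all coefficients to be non-negative
    lasso_pos = Lasso(alpha=0.1, positive=True).fit(X, y)

    print(lasso.coef_[:3], enet.coef_[:3], lasso_pos.coef_[:3])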
An example to illustrate multi-output regression with a decision tree.
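A hedged sketch of multi-output regression with DecisionTreeRegressor; the two noisy circle-coordinate targets driven by a single feature are an assumption modelled on the description above:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.RandomState(1)
    X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)              # single input feature
    y = np.column_stack([np.sin(X).ravel(), np.cos(X).ravel()])    # two outputs: a point on a circle
    y += 0.1 * rng.randn(*y.shape)                                  # noise on both outputs

    reg = DecisionTreeRegressor(max_depth=5).fit(X, y)
    pred = reg.predict(X[:5])
    print(pred.shape)   # (5, 2): one prediction per output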
This example is based on Section 5.4.3 of "Gaussian Processes for Machine Learning" [RW2006]. It illustrates complex kernel engineering and hyperparameter optimisation using gradient ascent on the log-marginal-likelihood.
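A rough sketch of the kind of composite kernel this refers to; the kernel structure and hyperparameters here are illustrative assumptions, and the real example fits atmospheric CO2 data rather than the synthetic series below:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import (
        RBF, ExpSineSquared, RationalQuadratic, WhiteKernel)

    rng = np.random.RandomState(0)
    X = np.linspace(0, 10, 200)[:, None]
    y = 0.5 * X.ravel() + np.sin(2 * np.pi * X.ravel()) + 0.1 * rng.randn(200)

    # long-term trend + seasonal component + medium-term irregularities + noise
    kernel = (RBF(length_scale=50.0)
              + RBF(length_scale=5.0) * ExpSineSquared(length_scale=1.0, periodicity=1.0)
              + RationalQuadratic(length_scale=1.0, alpha=1.0)
              + WhiteKernel(noise_level=0.1))

    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    print(gp.kernel_)                          # hyperparameters after maximising the log-marginal-likelihood
    print(gp.log_marginal_likelihood_value_)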
A simple one-dimensional regression example computed in two different ways: a noise-free case and a noisy case with a known noise level for each data point.
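A small sketch of both cases under assumed toy settings: the noise-free fit interpolates the observations exactly, while the noisy fit passes the known noise variance through the alpha parameter:

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    X = np.array([[1.0], [3.0], [5.0], [6.0], [7.0], [8.0]])
    y = (X * np.sin(X)).ravel()

    # noise-free case: observations are interpolated exactly
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0)).fit(X, y)

    # noisy case: known noise level added on the diagonal via alpha
    noise = 0.5
    y_noisy = y + np.random.RandomState(0).normal(0, noise, y.shape)
    gp_noisy = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                        alpha=noise ** 2).fit(X, y_noisy)

    x_pred = np.linspace(0, 10, 50)[:, None]
    mean, std = gp_noisy.predict(x_pred, return_std=True)
    print(mean[:3], std[:3])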
Plot the decision boundaries of a VotingClassifier for two features of the Iris dataset. Plot the class probabilities of the first sample in a toy dataset, predicted by three different classifiers and averaged by the VotingClassifier.
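A compact sketch of a soft-voting ensemble on two iris features; the three base estimators chosen here are an assumption, and plotting of the decision boundary itself is left out:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X = X[:, [0, 2]]                       # keep two features for a 2-D decision boundary

    clf = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("knn", KNeighborsClassifier(n_neighbors=7)),
                    ("svc", SVC(probability=True))],
        voting="soft")                     # average the predicted class probabilities
    clf.fit(X, y)

    print(clf.predict_proba(X[:1]))        # averaged class probabilities for the first sample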
Shown in the plot is how the logistic regression would, in this synthetic dataset, classify values as either 0 or 1, i.e. class one or two, using the logistic curve.
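A minimal sketch on synthetic one-dimensional data (the threshold at x = 0 is an assumption) showing how the fitted logistic curve maps the feature to a class probability:

    import numpy as np
    from scipy.special import expit
    from sklearn.linear_model import LogisticRegression

    rng = np.random.RandomState(0)
    X = rng.normal(size=100)[:, None]
    y = (X.ravel() > 0).astype(int)        # class 0 below zero, class 1 above
    X += 0.3 * rng.normal(size=X.shape)    # blur the boundary with noise

    clf = LogisticRegression().fit(X, y)

    x_grid = np.linspace(-3, 3, 7)[:, None]
    p = expit(x_grid * clf.coef_ + clf.intercept_).ravel()   # the logistic curve
    print(np.round(p, 3))                  # probability of class 1 along the grid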
Plot the decision surface of a decision tree trained on pairs of features of the iris dataset.
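A rough sketch for one pair of features (sepal length and petal length; the pair and tree depth are assumptions), where the grid of predictions is what would be drawn as the decision surface:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    iris = load_iris()
    pair = [0, 2]                                    # sepal length, petal length
    X, y = iris.data[:, pair], iris.target

    clf = DecisionTreeClassifier(max_depth=4).fit(X, y)

    # evaluate the classifier on a grid covering the two features
    xx, yy = np.meshgrid(np.linspace(X[:, 0].min(), X[:, 0].max(), 100),
                         np.linspace(X[:, 1].min(), X[:, 1].max(), 100))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
    print(Z.shape)                                   # 100 x 100 class labels forming the decision surface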
Plot the decision surfaces of forests of randomized trees trained on pairs of features of the iris dataset.
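A short sketch comparing, under assumed settings, a single tree against two randomized-tree ensembles on one feature pair; cross-validated accuracies are printed instead of surfaces to keep it self-contained:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import ExtraTreesClassifier, RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    iris = load_iris()
    X, y = iris.data[:, [0, 1]], iris.target         # one pair of features

    for name, clf in [("decision tree", DecisionTreeClassifier()),
                      ("random forest", RandomForestClassifier(n_estimators=30)),
                      ("extra trees", ExtraTreesClassifier(n_estimators=30))]:
        score = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{name}: {score:.3f}")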
This example compares the timing of Birch (with and without the global clustering step) and MiniBatchKMeans on a synthetic dataset having 100,000 samples and 2 features, generated using make_blobs.
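A hedged timing sketch (with a smaller sample count than the 100,000 in the description, to keep it quick): Birch with n_clusters=None skips the final global clustering step, while Birch with a fixed n_clusters and MiniBatchKMeans both return a fixed number of clusters:

    from time import time
    from sklearn.cluster import Birch, MiniBatchKMeans
    from sklearn.datasets import make_blobs

    X, _ = make_blobs(n_samples=25_000, n_features=2, centers=100, random_state=0)

    for name, model in [("Birch, no global clustering", Birch(n_clusters=None)),
                        ("Birch, global clustering", Birch(n_clusters=100)),
                        ("MiniBatchKMeans", MiniBatchKMeans(n_clusters=100, n_init=3,
                                                            random_state=0))]:
        t0 = time()
        model.fit(X)
        print(f"{name}: {time() - t0:.2f} s")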