A simple one-dimensional regression example computed in two different ways: a noise-free case and a noisy case with a known noise level per data point.
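A minimal sketch of the noise-free setup, not the gallery script itself; the training data and kernel settings below are illustrative assumptions.

    # 1-D Gaussian process regression with an RBF kernel on noise-free data
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    X = np.linspace(0, 10, 8).reshape(-1, 1)             # training inputs
    y = np.squeeze(X * np.sin(X))                        # noise-free targets

    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), n_restarts_optimizer=5)
    gpr.fit(X, y)

    X_test = np.linspace(0, 10, 100).reshape(-1, 1)
    mean, std = gpr.predict(X_test, return_std=True)     # predictive mean and uncertainty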
Compare randomized search and grid search for optimizing hyperparameters of a random forest. All parameters that influence the learning are searched simultaneously (except the number of estimators, which poses a time / quality tradeoff).
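A minimal sketch of the comparison; the dataset and the parameter ranges are illustrative assumptions, not the exact search space used in the example.

    # Randomized search samples candidates from distributions; grid search
    # exhaustively tries every combination on a discrete grid.
    from scipy.stats import randint
    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

    X, y = load_digits(return_X_y=True)
    clf = RandomForestClassifier(n_estimators=20, random_state=0)

    random_search = RandomizedSearchCV(
        clf,
        param_distributions={"max_depth": [3, None], "min_samples_split": randint(2, 11)},
        n_iter=10,
        random_state=0,
    ).fit(X, y)

    grid_search = GridSearchCV(
        clf,
        param_grid={"max_depth": [3, None], "min_samples_split": [2, 5, 10]},
    ).fit(X, y)

    print(random_search.best_params_, grid_search.best_params_)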
Demonstrates the resolution of a regression problem using k-Nearest Neighbors, interpolating the target with both barycenter and constant weights.
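A minimal sketch, assuming KNeighborsRegressor's 'uniform' and 'distance' options stand in for the constant and barycenter weights; the synthetic data is an illustrative assumption.

    # k-NN regression with uniform (constant) and distance-based weights
    import numpy as np
    from sklearn.neighbors import KNeighborsRegressor

    rng = np.random.RandomState(0)
    X = np.sort(5 * rng.rand(40, 1), axis=0)
    y = np.sin(X).ravel()
    T = np.linspace(0, 5, 500).reshape(-1, 1)

    for weights in ("uniform", "distance"):
        knn = KNeighborsRegressor(n_neighbors=5, weights=weights)
        y_pred = knn.fit(X, y).predict(T)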
An example to illustrate multi-output regression with a decision tree. The decision tree is used to predict simultaneously the noisy x and y observations of a circle given a single underlying feature.
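A minimal sketch of the setup, with the noisy-circle data generated under illustrative assumptions.

    # One DecisionTreeRegressor predicting a 2-D target from a single feature
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.RandomState(1)
    X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)
    y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
    y += 0.5 - rng.rand(*y.shape)                        # noise on both outputs

    regr = DecisionTreeRegressor(max_depth=5).fit(X, y)
    y_pred = regr.predict(X)                             # shape (100, 2): both outputs at once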
This example illustrates the effect of the parameters gamma and C of the Radial Basis Function (RBF) kernel SVM. Intuitively, the gamma parameter defines how far the influence of a single training example reaches, with low values meaning 'far' and high values meaning 'close'.
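A minimal sketch, assuming the iris dataset restricted to two features; the C and gamma values below are illustrative, not the grid used in the example.

    # Fit an RBF-kernel SVC over a small grid of C and gamma values
    from sklearn.datasets import load_iris
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X = X[:, :2]                                         # two features, so boundaries can be plotted

    for C in (0.1, 1.0, 10.0):
        for gamma in (0.01, 0.1, 1.0):
            clf = SVC(kernel="rbf", C=C, gamma=gamma).fit(X, y)
            # clf.decision_function / clf.predict can then be drawn on a mesh grid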
This example illustrates the predicted probability of GPC for an isotropic and anisotropic RBF kernel on a two-dimensional version of the iris dataset.
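A minimal sketch of the two kernels, assuming the first two iris features as the two-dimensional input and unit initial length scales.

    # GPC with an isotropic and an anisotropic RBF kernel
    from sklearn.datasets import load_iris
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF

    X, y = load_iris(return_X_y=True)
    X = X[:, :2]

    iso = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0)).fit(X, y)
    aniso = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=[1.0, 1.0])).fit(X, y)

    proba = aniso.predict_proba(X)                       # per-class predicted probabilities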
This example builds a swiss roll dataset and runs hierarchical clustering on the points' positions. For more information, see the Hierarchical clustering section of the user guide.
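A minimal sketch, assuming Ward linkage and a k-nearest-neighbors connectivity graph; the cluster and neighbor counts are illustrative.

    # Agglomerative clustering on swiss-roll points, with and without
    # a connectivity constraint
    from sklearn.cluster import AgglomerativeClustering
    from sklearn.datasets import make_swiss_roll
    from sklearn.neighbors import kneighbors_graph

    X, _ = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)

    unstructured = AgglomerativeClustering(n_clusters=6, linkage="ward").fit(X)

    connectivity = kneighbors_graph(X, n_neighbors=10, include_self=False)
    structured = AgglomerativeClustering(
        n_clusters=6, linkage="ward", connectivity=connectivity
    ).fit(X)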
An illustration of Swiss Roll reduction with locally linear embedding.
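A minimal sketch; the neighbor count is an illustrative assumption.

    # Unroll the 3-D swiss roll into 2-D with locally linear embedding
    from sklearn.datasets import make_swiss_roll
    from sklearn.manifold import LocallyLinearEmbedding

    X, color = make_swiss_roll(n_samples=1500, random_state=0)
    lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2)
    X_2d = lle.fit_transform(X)                          # 2-D embedding of the 3-D roll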
Demonstrates the effect of different metrics on hierarchical clustering. The example is engineered to show the effect of the choice of metric on the resulting clusters.
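A minimal sketch with placeholder data standing in for the high-dimensional signals used in the example; the metrics shown are illustrative.

    # Average-linkage agglomerative clustering under several distance metrics
    import numpy as np
    from sklearn.cluster import AgglomerativeClustering

    rng = np.random.RandomState(0)
    X = rng.rand(60, 30)                                 # placeholder high-dimensional data

    for metric in ("euclidean", "manhattan", "cosine"):
        # older scikit-learn releases name this parameter "affinity"
        model = AgglomerativeClustering(n_clusters=3, linkage="average", metric=metric)
        labels = model.fit_predict(X)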
Sample usage of Nearest Centroid classification. It will plot the decision boundaries for each class.
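A minimal sketch, assuming the first two iris features so the decision regions are two-dimensional.

    # Nearest-centroid classification on two features
    from sklearn.datasets import load_iris
    from sklearn.neighbors import NearestCentroid

    X, y = load_iris(return_X_y=True)
    X = X[:, :2]

    clf = NearestCentroid().fit(X, y)
    labels = clf.predict(X)                              # predictions used to draw the regions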