Probabilistic PCA and Factor Analysis are probabilistic models. The consequence is that the likelihood of new data can be used for model selection and covariance estimation.
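As an illustrative sketch (the random data and component range below are placeholders, not the data from the example), the held-out log-likelihood returned by each estimator's score method can be compared across candidate dimensionalities:

    import numpy as np
    from sklearn.decomposition import PCA, FactorAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.RandomState(0)
    X = rng.randn(200, 10)  # placeholder data

    # Mean held-out log-likelihood for each candidate number of components
    for n in range(1, 6):
        print(n,
              cross_val_score(PCA(n_components=n), X).mean(),
              cross_val_score(FactorAnalysis(n_components=n), X).mean())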
Performs a pixel-wise Vector Quantization (VQ) of an image of the summer palace (China), reducing the number of colors required to show the image from 96,615 unique colors to 64, while preserving the overall appearance quality.
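A minimal sketch of the idea, assuming a KMeans codebook of 64 colors fitted on a small pixel subsample (the subsample size is illustrative):

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import load_sample_image
    from sklearn.utils import shuffle

    # Load the sample image and flatten it to an (n_pixels, 3) array in [0, 1]
    china = load_sample_image("china.jpg")
    pixels = np.reshape(china, (-1, 3)) / 255.0

    # Fit the color codebook on a random subsample, then quantize every pixel
    kmeans = KMeans(n_clusters=64, random_state=0).fit(
        shuffle(pixels, random_state=0)[:1000])
    labels = kmeans.predict(pixels)
    quantized = kmeans.cluster_centers_[labels].reshape(china.shape)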
Plot the decision boundaries of a VotingClassifier for two features of the Iris dataset. Plot the class probabilities of the first sample in a toy dataset predicted by three different classifiers and averaged by the VotingClassifier.
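A small sketch of a soft VotingClassifier on two Iris features (the member estimators and their settings are illustrative):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)
    X = X[:, [0, 2]]  # two features so the decision boundary is 2D

    # Soft voting averages the predicted class probabilities of the members
    clf = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("rf", RandomForestClassifier(random_state=0)),
                    ("svc", SVC(probability=True, random_state=0))],
        voting="soft",
    ).fit(X, y)
    print(clf.predict_proba(X[:1]))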
Demonstration of several covariance types for Gaussian mixture models. See Gaussian mixture models for more information on the estimator.
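A minimal sketch comparing the covariance_type options on the Iris data (the number of components and the BIC comparison are illustrative):

    from sklearn.datasets import load_iris
    from sklearn.mixture import GaussianMixture

    X, _ = load_iris(return_X_y=True)

    # covariance_type controls the shape allowed for each mixture component
    for cov_type in ("spherical", "diag", "tied", "full"):
        gmm = GaussianMixture(n_components=3, covariance_type=cov_type,
                              random_state=0).fit(X)
        print(cov_type, gmm.bic(X))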
An illustration of the isotonic regression on generated data. The isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data.
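A short sketch on synthetic data (the noise model below is illustrative):

    import numpy as np
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.RandomState(0)
    x = np.arange(50)
    y = rng.randint(-20, 20, size=50) + 10 * np.log1p(x)  # noisy increasing trend

    # Fit the non-decreasing step function that minimizes squared error
    y_fit = IsotonicRegression().fit_transform(x, y)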
This example constructs a pipeline that does dimensionality reduction followed by prediction with a support vector classifier.
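A minimal sketch of such a pipeline on the digits data, with a small illustrative parameter grid:

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    pipe = Pipeline([("reduce_dim", PCA()), ("clf", SVC())])

    # Search jointly over the number of components and the SVC regularization
    grid = GridSearchCV(pipe,
                        {"reduce_dim__n_components": [8, 16, 32], "clf__C": [1, 10]},
                        cv=3)
    grid.fit(X, y)
    print(grid.best_params_)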
This example is based on Section 5.4.3 of "Gaussian Processes for Machine Learning" [RW2006]. It illustrates an example of complex kernel engineering and hyperparameter optimization using gradient ascent on the log-marginal-likelihood.
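A sketch of a composite kernel in the spirit of that example; the length scales and term weights below are illustrative placeholders, not the values from the book:

    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import (
        RBF, ExpSineSquared, RationalQuadratic, WhiteKernel)

    # Composite kernel: long-term trend + seasonal periodicity
    # + medium-term irregularities + noise
    kernel = (50.0 ** 2 * RBF(length_scale=50.0)
              + 2.0 ** 2 * RBF(length_scale=100.0)
                * ExpSineSquared(length_scale=1.0, periodicity=1.0)
              + 0.5 ** 2 * RationalQuadratic(length_scale=1.0, alpha=1.0)
              + WhiteKernel(noise_level=0.1))

    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    # gp.fit(X_train, y_train) tunes the hyperparameters by maximizing
    # the log-marginal-likelihood (X_train, y_train are placeholders here)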
For greyscale image data where pixel values can be interpreted as degrees of blackness on a white background, like handwritten digit recognition, the Bernoulli Restricted Boltzmann machine model (BernoulliRBM) can perform effective non-linear feature extraction.
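A minimal sketch assuming the digits data scaled to [0, 1] (the RBM hyperparameters are illustrative):

    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.neural_network import BernoulliRBM
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import minmax_scale

    X, y = load_digits(return_X_y=True)
    X = minmax_scale(X)  # RBM expects values in [0, 1], read as pixel "blackness"

    # Unsupervised RBM features feeding a linear classifier
    rbm_logistic = Pipeline([
        ("rbm", BernoulliRBM(n_components=64, learning_rate=0.06,
                             n_iter=10, random_state=0)),
        ("logistic", LogisticRegression(max_iter=1000)),
    ]).fit(X, y)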
Lasso and elastic net (L1 and L2 penalisation) implemented using coordinate descent. The coefficients can be forced to be positive.
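A short sketch on synthetic regression data (the regularization values are illustrative):

    from sklearn.datasets import make_regression
    from sklearn.linear_model import ElasticNet, Lasso

    X, y = make_regression(n_samples=100, n_features=20, noise=5.0, random_state=0)

    # Coordinate descent solvers; positive=True constrains all coefficients to be >= 0
    lasso = Lasso(alpha=0.1, positive=True).fit(X, y)
    enet = ElasticNet(alpha=0.1, l1_ratio=0.7, positive=True).fit(X, y)
    print((lasso.coef_ >= 0).all(), (enet.coef_ >= 0).all())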
Plot the decision surface of a decision tree trained on pairs of features of the iris dataset. See decision tree for more information on the estimator.
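A minimal sketch computing the decision surface for one illustrative feature pair:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    iris = load_iris()
    X, y = iris.data[:, [0, 1]], iris.target  # one pair of features

    clf = DecisionTreeClassifier().fit(X, y)

    # Evaluate the tree on a grid of points to obtain the decision surface
    xx, yy = np.meshgrid(np.linspace(X[:, 0].min(), X[:, 0].max(), 200),
                         np.linspace(X[:, 1].min(), X[:, 1].max(), 200))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)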