class sklearn.covariance.EmpiricalCovariance(store_precision=True, assume_centered=False)
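A minimal usage sketch (the data array X below is illustrative): fit on the samples and read the estimated covariance and precision matrices.

>>> import numpy as np
>>> from sklearn.covariance import EmpiricalCovariance
>>> X = np.random.RandomState(0).randn(100, 3)   # illustrative data, 100 samples x 3 features
>>> cov = EmpiricalCovariance().fit(X)
>>> emp_cov = cov.covariance_                     # (3, 3) maximum-likelihood covariance estimate
>>> prec = cov.precision_                         # stored because store_precision=True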
class sklearn.gaussian_process.GaussianProcessRegressor(kernel=None, alpha=1e-10, optimizer='fmin_l_bfgs_b', n_restarts_optimizer=0, normalize_y=False, copy_X_train=True, random_state=None)
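A short sketch, assuming an RBF kernel and toy 1D data; the kernel choice and training points here are illustrative, not part of the signature above.

>>> import numpy as np
>>> from sklearn.gaussian_process import GaussianProcessRegressor
>>> from sklearn.gaussian_process.kernels import RBF
>>> X = np.linspace(0, 5, 20).reshape(-1, 1)          # illustrative 1D inputs
>>> y = np.sin(X).ravel()
>>> gpr = GaussianProcessRegressor(kernel=RBF(), alpha=1e-10).fit(X, y)
>>> y_mean, y_std = gpr.predict(X, return_std=True)   # posterior mean and standard deviation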
class sklearn.preprocessing.MaxAbsScaler(copy=True)
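A brief sketch with a small illustrative matrix: each feature is scaled by its maximum absolute value, so sparsity and sign are preserved.

>>> import numpy as np
>>> from sklearn.preprocessing import MaxAbsScaler
>>> X = np.array([[1., -1., 2.], [2., 0., 0.], [0., 1., -1.]])
>>> scaler = MaxAbsScaler().fit(X)
>>> X_scaled = scaler.transform(X)   # every column now lies in [-1, 1]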
class sklearn.model_selection.LeavePOut(p)
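A small sketch, using four illustrative samples: LeavePOut enumerates every test set of exactly p samples.

>>> import numpy as np
>>> from sklearn.model_selection import LeavePOut
>>> X = np.arange(8).reshape(4, 2)     # 4 illustrative samples
>>> lpo = LeavePOut(p=2)
>>> splits = list(lpo.split(X))        # C(4, 2) = 6 train/test splits, each test set of size 2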
These images show how similar features are merged together using feature agglomeration.
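A brief sketch of the underlying API, assuming the digits images as in that example; the number of clusters is illustrative.

>>> from sklearn import datasets, cluster
>>> digits = datasets.load_digits()
>>> X = digits.data                                    # (n_samples, 64) pixel features
>>> agglo = cluster.FeatureAgglomeration(n_clusters=32)
>>> X_reduced = agglo.fit_transform(X)                 # similar pixels merged into 32 features
>>> X_restored = agglo.inverse_transform(X_reduced)    # approximate reconstruction of the images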
class sklearn.mixture.BayesianGaussianMixture(n_components=1, covariance_type='full', tol=0.001, reg_covar=1e-06, max_iter=100, n_init=1, init_params='kmeans', weight_concentration_prior_type='dirichlet_process', weight_concentration_prior=None, mean_precision_prior=None, mean_prior=None, degrees_of_freedom_prior=None, covariance_prior=None, random_state=None, warm_start=False, verbose=0, verbose_interval=10)
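A minimal sketch on two illustrative Gaussian blobs; the variational fit prunes unneeded components by driving their weights toward zero.

>>> import numpy as np
>>> from sklearn.mixture import BayesianGaussianMixture
>>> rng = np.random.RandomState(0)
>>> X = np.concatenate([rng.randn(100, 2), rng.randn(100, 2) + 5])   # two illustrative clusters
>>> bgm = BayesianGaussianMixture(n_components=5, max_iter=200).fit(X)
>>> labels = bgm.predict(X)          # surplus components end up with near-zero weights_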
class sklearn.linear_model.Lars(fit_intercept=True, verbose=False, normalize=True, precompute='auto', n_nonzero_coefs=500, eps=2.2204460492503131e-16, copy_X=True, fit_path=True, positive=False)
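A short sketch with a tiny illustrative regression problem, capping the number of active coefficients.

>>> import numpy as np
>>> from sklearn.linear_model import Lars
>>> X = np.array([[-1., 1.], [0., 0.], [1., 1.]])
>>> y = np.array([-1.1111, 0., -1.1111])
>>> lars = Lars(n_nonzero_coefs=1).fit(X, y)
>>> coef = lars.coef_                 # at most n_nonzero_coefs entries are non-zero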
A decision tree is boosted using the AdaBoost.R2 [1] algorithm on a 1D sinusoidal dataset with a small amount of Gaussian noise. 299 boosts (300 decision trees) are compared with a single decision tree regressor.
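A condensed sketch of that comparison; the dataset size, tree depth, and noise scale below are illustrative rather than the example's exact settings.

>>> import numpy as np
>>> from sklearn.tree import DecisionTreeRegressor
>>> from sklearn.ensemble import AdaBoostRegressor
>>> rng = np.random.RandomState(1)
>>> X = np.sort(5 * rng.rand(80, 1), axis=0)
>>> y = np.sin(X).ravel() + rng.normal(0, 0.1, X.shape[0])   # 1D sine with Gaussian noise
>>> single_tree = DecisionTreeRegressor(max_depth=4).fit(X, y)
>>> boosted = AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
...                             n_estimators=300, random_state=rng).fit(X, y)
>>> y_single, y_boosted = single_tree.predict(X), boosted.predict(X)   # boosted fit captures more detail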
class sklearn.base.TransformerMixin
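A minimal sketch of a custom transformer that inherits fit_transform from the mixin; the LogShift class and its behaviour are purely illustrative.

>>> import numpy as np
>>> from sklearn.base import BaseEstimator, TransformerMixin
>>> class LogShift(BaseEstimator, TransformerMixin):
...     """Illustrative transformer computing log(x + shift)."""
...     def __init__(self, shift=1.0):
...         self.shift = shift
...     def fit(self, X, y=None):
...         return self               # stateless; nothing to learn
...     def transform(self, X):
...         return np.log(np.asarray(X) + self.shift)
>>> X = np.array([[0., 1.], [2., 3.]])
>>> X_new = LogShift().fit_transform(X)   # fit_transform is supplied by TransformerMixin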
When the amount of contamination is known, this example illustrates three different ways of performing novelty and outlier detection.
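A hedged sketch, assuming the three estimators compared are EllipticEnvelope, OneClassSVM, and IsolationForest; the data and contamination level below are illustrative.

>>> import numpy as np
>>> from sklearn.covariance import EllipticEnvelope
>>> from sklearn.svm import OneClassSVM
>>> from sklearn.ensemble import IsolationForest
>>> rng = np.random.RandomState(42)
>>> X = np.concatenate([rng.randn(200, 2), rng.uniform(-6, 6, (20, 2))])  # inliers + outliers
>>> contamination = 20 / 220.0                 # known fraction of outliers
>>> detectors = {
...     "Robust covariance": EllipticEnvelope(contamination=contamination),
...     "One-Class SVM": OneClassSVM(nu=contamination, gamma=0.1),
...     "Isolation Forest": IsolationForest(contamination=contamination, random_state=rng),
... }
>>> preds = {name: det.fit(X).predict(X) for name, det in detectors.items()}  # +1 inlier, -1 outlier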