class sklearn.preprocessing.OneHotEncoder(n_values='auto', categorical_features='all', dtype=<class 'numpy.float64'>, sparse=True, handle_unknown='error')
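A minimal usage sketch for the legacy integer-input API shown above (the toy matrix is illustrative; later scikit-learn releases renamed several of these parameters):

import numpy as np
from sklearn.preprocessing import OneHotEncoder

# Two categorical features already encoded as non-negative integers.
X = np.array([[0, 1],
              [1, 0],
              [2, 1]])

enc = OneHotEncoder(sparse=False, handle_unknown='error')
X_onehot = enc.fit_transform(X)   # shape (3, 5): 3 categories + 2 categories
print(X_onehot)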
sklearn.datasets.make_sparse_uncorrelated(n_samples=100, n_features=10, random_state=None)
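A short sketch of the generator (the random_state value is illustrative):

from sklearn.datasets import make_sparse_uncorrelated

# Regression problem in which only the first 4 of the 10 features
# contribute to the target; the remaining features are uncorrelated noise.
X, y = make_sparse_uncorrelated(n_samples=100, n_features=10, random_state=0)
print(X.shape, y.shape)   # (100, 10) (100,)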
Plot the decision surfaces of forests of randomized trees trained on pairs of features of the iris dataset.
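A condensed sketch of the idea behind that example, fitting a forest on one illustrative feature pair and evaluating it over a grid to obtain the decision surface (the estimator settings and chosen pair are assumptions, not the exact ones used in the gallery example):

import numpy as np
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

iris = load_iris()
X = iris.data[:, [0, 1]]          # one pair of features (sepal length, sepal width)
y = iris.target

# Forest of randomized trees fitted on the selected feature pair.
clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X, y)

# Predicting over a grid of points in the plane gives the decision surface.
xx, yy = np.meshgrid(np.linspace(X[:, 0].min(), X[:, 0].max(), 100),
                     np.linspace(X[:, 1].min(), X[:, 1].max(), 100))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)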
sklearn.datasets.load_boston(return_X_y=False)
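A brief usage sketch of the loader as documented here (note that this dataset loader was removed from later scikit-learn releases):

from sklearn.datasets import load_boston

# return_X_y=False (the default) returns a Bunch with .data, .target,
# .feature_names and DESCR; return_X_y=True returns the (X, y) pair directly.
boston = load_boston()
print(boston.data.shape, boston.target.shape)   # (506, 13) (506,)
X, y = load_boston(return_X_y=True)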
class sklearn.cluster.KMeans(n_clusters=8, init='k-means++', n_init=10, max_iter=300, tol=0.0001, precompute_distances='auto', verbose=0, random_state=None, copy_x=True, n_jobs=1, algorithm='auto')
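A minimal usage sketch on toy data (the points and n_clusters value are illustrative):

import numpy as np
from sklearn.cluster import KMeans

X = np.array([[1, 2], [1, 4], [1, 0],
              [10, 2], [10, 4], [10, 0]])

# n_init=10 restarts with different centroid seeds and keeps the run
# with the lowest inertia.
km = KMeans(n_clusters=2, random_state=0).fit(X)
print(km.labels_)           # e.g. [1 1 1 0 0 0]
print(km.cluster_centers_)  # e.g. [[10., 2.], [1., 2.]]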
Demonstrates the effect of different metrics on the hierarchical clustering. The example is engineered to show the effect of the choice of different metrics.
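A minimal sketch of the same idea, using random data in place of the waveform signals of the actual example (the metrics and linkage choice are illustrative; `affinity` is the parameter name used by this scikit-learn version):

import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.RandomState(0)
X = rng.rand(30, 8)

# With a non-Ward linkage the distance metric can be varied freely.
for metric in ('euclidean', 'manhattan', 'cosine'):
    model = AgglomerativeClustering(n_clusters=3, linkage='average', affinity=metric)
    labels = model.fit_predict(X)
    print(metric, np.bincount(labels))  # cluster sizes under each metric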
class sklearn.ensemble.BaggingRegressor(base_estimator=None, n_estimators=10, max_samples=1.0, max_features=1.0, bootstrap=True, bootstrap_features=False, oob_score=False, warm_start=False, n_jobs=1, random_state=None, verbose=0)
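A minimal usage sketch (the synthetic regression data is illustrative; base_estimator=None defaults to a decision tree regressor, made explicit here):

from sklearn.datasets import make_regression
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=200, n_features=4, noise=10.0, random_state=0)

# Each of the 10 estimators is fitted on a bootstrap sample of the
# training set (bootstrap=True); predictions are averaged.
reg = BaggingRegressor(base_estimator=DecisionTreeRegressor(), n_estimators=10,
                       random_state=0).fit(X, y)
print(reg.predict(X[:3]))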
sklearn.metrics.confusion_matrix(y_true, y_pred, labels=None, sample_weight=None)
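A short usage sketch (the label vectors are illustrative):

from sklearn.metrics import confusion_matrix

y_true = [2, 0, 2, 2, 0, 1]
y_pred = [0, 0, 2, 2, 0, 2]

# Rows are true labels, columns are predicted labels; the `labels`
# argument fixes the row/column ordering.
print(confusion_matrix(y_true, y_pred, labels=[0, 1, 2]))
# [[2 0 0]
#  [0 0 1]
#  [1 0 2]]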
class sklearn.cluster.FeatureAgglomeration(n_clusters=2, affinity='euclidean', memory=Memory(cachedir=None), connectivity=None, compute_full_tree='auto', linkage='ward', pooling_func=<function mean>)
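A minimal usage sketch on the iris data (the dataset choice is illustrative):

from sklearn.cluster import FeatureAgglomeration
from sklearn.datasets import load_iris

X = load_iris().data                      # shape (150, 4)

# Agglomerates similar *features* (columns) rather than samples:
# the 4 iris measurements are merged down to 2 pooled features,
# each pooled with the default pooling_func (the mean).
agglo = FeatureAgglomeration(n_clusters=2).fit(X)
X_reduced = agglo.transform(X)
print(X_reduced.shape)                    # (150, 2)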