class sklearn.ensemble.ExtraTreesRegressor(n_estimators=10, criterion='mse', max_depth=None, min_samples_split=2, min_samples_leaf=1, ...)
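As an illustration (not part of the reference entry above), a minimal sketch of fitting an ExtraTreesRegressor; the synthetic data and parameter choices are assumptions for demonstration:

    import numpy as np
    from sklearn.ensemble import ExtraTreesRegressor

    rng = np.random.RandomState(0)
    X = rng.rand(200, 4)                       # 200 samples, 4 features
    y = X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

    reg = ExtraTreesRegressor(n_estimators=10, random_state=0)
    reg.fit(X, y)
    print(reg.predict(X[:3]))                  # predictions for the first rows
    print(reg.feature_importances_)            # per-feature importance scores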
A plot that compares the various convex loss functions supported by sklearn.linear_model.SGDClassifier.
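A minimal sketch of such a comparison, plotting a few convex surrogate losses against the signed margin y * f(x); the particular losses shown here are an illustrative assumption, not an exhaustive list:

    import numpy as np
    import matplotlib.pyplot as plt

    z = np.linspace(-3, 3, 300)                # signed margin y * f(x)
    plt.plot(z, np.where(z < 0, 1.0, 0.0), label="zero-one (reference)")
    plt.plot(z, np.maximum(0, 1 - z), label="hinge")
    plt.plot(z, np.log2(1 + np.exp(-z)), label="log")
    plt.plot(z, np.maximum(0, 1 - z) ** 2, label="squared hinge")
    plt.ylim(0, 6)
    plt.xlabel("decision value y * f(x)")
    plt.ylabel("loss")
    plt.legend()
    plt.show()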
Making sure that each feature has approximately the same scale can be a crucial preprocessing step. However, when data contains outliers, scalers based on the mean and variance (such as StandardScaler) can be misled, and a scaler that is robust to outliers is often preferable.
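A hedged illustration of this point: StandardScaler centers and scales with the mean and variance, which a single outlier distorts, while RobustScaler uses the median and interquartile range. The toy data below is an assumption for demonstration:

    import numpy as np
    from sklearn.preprocessing import StandardScaler, RobustScaler

    X = np.array([[1.0], [2.0], [3.0], [4.0], [100.0]])  # one extreme outlier
    print(StandardScaler().fit_transform(X).ravel())  # inliers squashed together
    print(RobustScaler().fit_transform(X).ravel())    # inliers keep their spread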
Find the optimal separating hyperplane using an SVC for classes that are unbalanced. We first find the separating plane with a plain SVC, then plot (dashed) the separating hyperplane with automatic correction for the unbalanced classes.
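A minimal sketch of the two fits just described: a plain SVC, then one with class_weight='balanced' to correct for the imbalance. The synthetic 10:1 data is an illustrative assumption:

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.RandomState(0)
    X = np.r_[rng.randn(1000, 2) - [2, 2], rng.randn(100, 2) + [2, 2]]
    y = np.r_[np.zeros(1000), np.ones(100)]    # 10:1 class imbalance

    plain = SVC(kernel='linear').fit(X, y)
    weighted = SVC(kernel='linear', class_weight='balanced').fit(X, y)
    print(plain.coef_, plain.intercept_)        # plain separating hyperplane
    print(weighted.coef_, weighted.intercept_)  # shifted toward the minority class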
class sklearn.decomposition.MiniBatchSparsePCA(n_components=None, alpha=1, ridge_alpha=0.01, n_iter=100, callback=None, batch_size=3, ...)
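A minimal usage sketch; the digits dataset and component count are illustrative assumptions, not part of the signature above:

    from sklearn.datasets import load_digits
    from sklearn.decomposition import MiniBatchSparsePCA

    X = load_digits().data                      # (1797, 64) image features
    spca = MiniBatchSparsePCA(n_components=8, alpha=1, batch_size=3,
                              random_state=0)
    X_reduced = spca.fit_transform(X)
    print(X_reduced.shape)                      # (1797, 8)
    print((spca.components_ == 0).mean())       # fraction of exactly-zero loadings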
sklearn.preprocessing.scale(X, axis=0, with_mean=True, with_std=True, copy=True)
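A short usage sketch of scale(): column-wise centering to zero mean and scaling to unit variance. The toy array is an illustrative assumption:

    import numpy as np
    from sklearn.preprocessing import scale

    X = np.array([[1.0, -1.0, 2.0],
                  [2.0,  0.0, 0.0],
                  [0.0,  1.0, -1.0]])
    X_scaled = scale(X)                         # axis=0: scale each column
    print(X_scaled.mean(axis=0))                # ~0 per feature
    print(X_scaled.std(axis=0))                 # 1 per feature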
sklearn.model_selection.train_test_split(*arrays, **options)
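A minimal usage sketch: split parallel arrays into matching train and test subsets. The split size and random_state are illustrative choices:

    import numpy as np
    from sklearn.model_selection import train_test_split

    X, y = np.arange(20).reshape(10, 2), np.arange(10)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=42)
    print(X_train.shape, X_test.shape)          # (7, 2) (3, 2)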
sklearn.metrics.pairwise.paired_euclidean_distances(X, Y)
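A short sketch of the paired (row-by-row) distance: unlike euclidean_distances, this returns one value per (X[i], Y[i]) pair. The arrays are illustrative assumptions:

    import numpy as np
    from sklearn.metrics.pairwise import paired_euclidean_distances

    X = np.array([[0.0, 0.0], [1.0, 1.0]])
    Y = np.array([[3.0, 4.0], [1.0, 1.0]])
    print(paired_euclidean_distances(X, Y))     # [5. 0.]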
The PCA does an unsupervised dimensionality reduction, while the logistic regression does the prediction. We use a GridSearchCV to set the dimensionality of the PCA.
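A minimal sketch of that pipeline; the digits dataset and the searched grid values are illustrative assumptions:

    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline

    X, y = load_digits(return_X_y=True)
    pipe = Pipeline([('pca', PCA()),
                     ('logistic', LogisticRegression(max_iter=1000))])
    param_grid = {'pca__n_components': [10, 20, 40]}  # dimensionality to search
    search = GridSearchCV(pipe, param_grid, cv=3)
    search.fit(X, y)
    print(search.best_params_)                  # best PCA dimensionality found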
sklearn.linear_model.lasso_path(X, y, eps=0.001, n_alphas=100, alphas=None, precompute='auto', Xy=None, copy_X=True, coef_init=None, ...)
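A short usage sketch: compute Lasso coefficients along a grid of regularization strengths. The diabetes dataset is an illustrative assumption:

    from sklearn.datasets import load_diabetes
    from sklearn.linear_model import lasso_path

    X, y = load_diabetes(return_X_y=True)
    alphas, coefs, _ = lasso_path(X, y, eps=0.001, n_alphas=100)
    print(alphas.shape)                         # (100,) decreasing alpha grid
    print(coefs.shape)                          # (n_features, 100) coefficient paths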