When working with covariance estimation, the usual approach is to use a maximum likelihood estimator, such as sklearn.covariance.EmpiricalCovariance.
sklearn.datasets.make_biclusters(shape, n_clusters, noise=0.0, minval=10, maxval=100, shuffle=True, random_state=None)
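A minimal sketch of how make_biclusters might be called, using the parameters from the signature above (the shape and cluster count chosen here are illustrative, not from the source):

```python
from sklearn.datasets import make_biclusters

# Generate a 300x300 data matrix containing 5 constant-valued block
# biclusters, corrupted by Gaussian noise with standard deviation 5.
data, rows, cols = make_biclusters(
    shape=(300, 300), n_clusters=5, noise=5.0, shuffle=False, random_state=0
)

print(data.shape)  # (300, 300)
print(rows.shape)  # (5, 300): one boolean row-membership mask per bicluster
print(cols.shape)  # (5, 300): one boolean column-membership mask per bicluster
```

The returned rows and cols arrays indicate which rows and columns of the data matrix belong to each bicluster.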
An example showing covariance estimation with Mahalanobis distances on Gaussian distributed data. For Gaussian distributed data, the distance of an observation to the mode of the distribution can be computed using its Mahalanobis distance.
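As a sketch of the idea above, the snippet below fits the maximum likelihood (empirical) covariance estimator and uses its mahalanobis method to score samples; the synthetic data and parameters are illustrative assumptions, not from the source:

```python
import numpy as np
from sklearn.covariance import EmpiricalCovariance

# Illustrative Gaussian data (mean and covariance chosen arbitrarily).
rng = np.random.RandomState(0)
X = rng.multivariate_normal(mean=[0.0, 0.0],
                            cov=[[1.0, 0.5], [0.5, 1.0]], size=500)

# Fit the maximum likelihood covariance estimator.
cov = EmpiricalCovariance().fit(X)

# Squared Mahalanobis distance of each sample to the fitted mean.
d2 = cov.mahalanobis(X)
print(d2.shape)  # (500,)
```

Outliers tend to receive much larger Mahalanobis distances than inliers, which is what makes this useful for outlier detection.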
sklearn.metrics.pairwise.polynomial_kernel(X, Y=None, degree=3, gamma=None, coef0=1)
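A short usage sketch of polynomial_kernel, which computes K(x, y) = (gamma * <x, y> + coef0) ** degree; the input matrix and gamma value here are illustrative assumptions:

```python
import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

X = np.array([[0.0, 1.0],
              [1.0, 1.0]])

# K(x, y) = (gamma * <x, y> + coef0) ** degree.
# With gamma=0.5, coef0=1, degree=3:
#   <x1, x1> = 1 -> (0.5*1 + 1)**3 = 3.375
#   <x2, x2> = 2 -> (0.5*2 + 1)**3 = 8.0
K = polynomial_kernel(X, degree=3, gamma=0.5, coef0=1)
print(K)  # [[3.375 3.375]
          #  [3.375 8.   ]]
```

When gamma is None it defaults to 1 / n_features.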
Warning: All classifiers in scikit-learn do multiclass classification out-of-the-box.
sklearn.metrics.pairwise.additive_chi2_kernel(X, Y=None)
sklearn.metrics.pairwise.distance_metrics()
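A combined sketch of the two pairwise utilities above: additive_chi2_kernel computes k(x, y) = -sum_i (x_i - y_i)^2 / (x_i + y_i) for non-negative inputs such as histograms, and distance_metrics() returns the mapping from metric names to pairwise distance functions (the sample histograms are illustrative):

```python
import numpy as np
from sklearn.metrics.pairwise import additive_chi2_kernel, distance_metrics

# Two illustrative 2-bin histograms (rows must be non-negative).
X = np.array([[0.50, 0.50],
              [0.25, 0.75]])

# k(x, y) = -sum_i (x_i - y_i)^2 / (x_i + y_i); zero on the diagonal.
K = additive_chi2_kernel(X)
print(K[0, 1])  # -(0.25**2/0.75 + 0.25**2/1.25) = -0.1333...

# distance_metrics() maps valid metric names to their distance functions.
metrics = distance_metrics()
print("euclidean" in metrics)  # True
```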
sklearn.metrics.mutual_info_score(labels_true, labels_pred, contingency=None)
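A minimal usage sketch for mutual_info_score, which measures the agreement of two clusterings (in nats) and is invariant to label permutation; the toy labelings are illustrative:

```python
import math
from sklearn.metrics import mutual_info_score

# Identical partitions up to a relabeling: two balanced clusters of size 2.
labels_true = [0, 0, 1, 1]
labels_pred = [1, 1, 0, 0]

mi = mutual_info_score(labels_true, labels_pred)
print(mi)  # ln(2) ~= 0.6931: perfect agreement despite swapped labels
assert math.isclose(mi, math.log(2))
```

Because the score is permutation-invariant, swapping the predicted label names does not change the result.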
Plot the decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are represented by the dashed lines.
class sklearn.linear_model.MultiTaskLassoCV(eps=0.001, n_alphas=100, alphas=None, fit_intercept=True, normalize=False, max_iter=1000, …)
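A usage sketch for MultiTaskLassoCV on synthetic multi-output data; the data-generating weights and cv setting are illustrative assumptions. The multi-task penalty zeroes coefficients jointly across tasks, and cross-validation selects the regularization strength alpha:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLassoCV

rng = np.random.RandomState(0)
X = rng.randn(100, 5)

# Two regression tasks sharing the same sparsity pattern:
# features 1 and 3 are irrelevant for both tasks.
W = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [2.0, 3.0],
              [0.0, 0.0],
              [0.5, -1.0]])
Y = X @ W + 0.01 * rng.randn(100, 2)

# Cross-validation over n_alphas candidate alphas picks alpha_.
model = MultiTaskLassoCV(cv=3, n_alphas=50).fit(X, Y)
print(model.coef_.shape)  # (2, 5): one coefficient row per task
```

Unlike fitting two independent Lasso models, the mixed-norm penalty forces the same features to be selected for every task.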