In this plot you can see the training scores and validation scores of an SVM for different values of the kernel parameter gamma. For very low values of gamma, you can see that both the training score and the validation score are low; this is called underfitting.
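As a sketch of how the score arrays behind such a plot can be produced (the digits dataset and the gamma grid below are illustrative assumptions, not necessarily the figure's actual setup), validation_curve returns one row of scores per parameter value:

>>> import numpy as np
>>> from sklearn.model_selection import validation_curve
>>> from sklearn.svm import SVC
>>> from sklearn.datasets import load_digits
>>> X, y = load_digits(return_X_y=True)
>>> param_range = np.logspace(-6, -1, 5)       # candidate gamma values
>>> train_scores, valid_scores = validation_curve(
...     SVC(), X, y, param_name="gamma", param_range=param_range, cv=5)
>>> train_scores.shape, valid_scores.shape     # one row per gamma, one column per CV fold
((5, 5), (5, 5))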
sklearn.metrics.auc(x, y, reorder=False)
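A minimal usage sketch: auc integrates a curve given its x and y coordinates with the trapezoidal rule, most commonly the ROC curve returned by roc_curve (toy labels and scores below):

>>> import numpy as np
>>> from sklearn import metrics
>>> y = np.array([1, 1, 2, 2])
>>> scores = np.array([0.1, 0.4, 0.35, 0.8])
>>> fpr, tpr, thresholds = metrics.roc_curve(y, scores, pos_label=2)
>>> metrics.auc(fpr, tpr)                      # area under the ROC curve
0.75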
sklearn.datasets.load_svmlight_file(f, n_features=None, dtype=<class 'numpy.float64'>, multilabel=False, zero_based='auto', query_id=False)
sklearn.datasets.dump_svmlight_file(X, y, f, zero_based=True, comment=None, query_id=None, multilabel=False)
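A small round-trip sketch for the two svmlight functions (a toy two-sample array and an in-memory buffer stand in for a real file); load_svmlight_file returns the features as a scipy.sparse CSR matrix:

>>> from io import BytesIO
>>> import numpy as np
>>> from sklearn.datasets import dump_svmlight_file, load_svmlight_file
>>> X = np.array([[0.0, 2.5], [1.0, 0.0]])
>>> y = np.array([0, 1])
>>> buf = BytesIO()
>>> dump_svmlight_file(X, y, buf, zero_based=True)   # write in svmlight / libsvm format
>>> _ = buf.seek(0)                                  # rewind before reading back
>>> X2, y2 = load_svmlight_file(buf, n_features=2, zero_based=True)
>>> X2.shape                                         # X2 is a sparse CSR matrix
(2, 2)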
class sklearn.semi_supervised.LabelPropagation(kernel='rbf', gamma=20, n_neighbors=7, alpha=1, max_iter=30, tol=0.001, n_jobs=1)
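A sketch on the iris data with roughly 30% of the labels hidden (the dataset and mask are illustrative assumptions); unlabeled samples are marked with -1, as the semi-supervised API expects:

>>> import numpy as np
>>> from sklearn.semi_supervised import LabelPropagation
>>> from sklearn.datasets import load_iris
>>> iris = load_iris()
>>> rng = np.random.RandomState(42)
>>> labels = np.copy(iris.target)
>>> labels[rng.rand(len(labels)) < 0.3] = -1         # -1 marks unlabeled samples
>>> model = LabelPropagation(kernel='rbf', gamma=20).fit(iris.data, labels)
>>> model.transduction_.shape                        # labels inferred for every sample
(150,)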
class sklearn.linear_model.LassoCV(eps=0.001, n_alphas=100, alphas=None, fit_intercept=True, normalize=False, precompute='auto', max_iter=1000, tol=0.0001, copy_X=True, cv=None, verbose=False, n_jobs=1, positive=False, random_state=None, selection='cyclic')
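A minimal sketch on the diabetes data (the dataset choice is an assumption; any regression set works): LassoCV fits the Lasso path over a grid of alphas and keeps the one with the best cross-validated score.

>>> from sklearn.linear_model import LassoCV
>>> from sklearn.datasets import load_diabetes
>>> X, y = load_diabetes(return_X_y=True)
>>> reg = LassoCV(cv=5).fit(X, y)
>>> best_alpha = reg.alpha_                    # regularization strength picked by CV
>>> n_nonzero = (reg.coef_ != 0).sum()         # sparsity of the selected model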
class sklearn.base.TransformerMixin
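TransformerMixin supplies fit_transform for any class that defines fit and transform. A minimal custom transformer (the LogTransformer name is purely illustrative):

>>> import numpy as np
>>> from sklearn.base import BaseEstimator, TransformerMixin
>>> class LogTransformer(BaseEstimator, TransformerMixin):
...     """Element-wise log(1 + x); fit learns nothing."""
...     def fit(self, X, y=None):
...         return self
...     def transform(self, X):
...         return np.log1p(X)
...
>>> X = np.array([[0.0, 1.0], [3.0, 7.0]])
>>> Xt = LogTransformer().fit_transform(X)     # fit_transform comes from the mixin
>>> Xt.shape
(2, 2)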
An example comparing the effect of reconstructing noisy fragments of a raccoon face image using firstly online Dictionary Learning and various transform methods.
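A condensed sketch of the dictionary-learning step in that example (a random array stands in for the raccoon face image, and the patch size and component count are arbitrary assumptions):

>>> import numpy as np
>>> from sklearn.decomposition import MiniBatchDictionaryLearning
>>> from sklearn.feature_extraction.image import extract_patches_2d
>>> rng = np.random.RandomState(0)
>>> image = rng.rand(32, 32)                   # stand-in for the raccoon face
>>> patches = extract_patches_2d(image, (7, 7))
>>> data = patches.reshape(patches.shape[0], -1)
>>> data = data - data.mean(axis=0)            # center each patch dimension
>>> dico = MiniBatchDictionaryLearning(n_components=50, alpha=1, n_iter=100)
>>> V = dico.fit(data).components_             # learned dictionary atoms
>>> V.shape
(50, 49)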
class sklearn.decomposition.SparsePCA(n_components=None, alpha=1, ridge_alpha=0.01, max_iter=1000, tol=1e-08, method='lars', n_jobs=1, U_init=None, V_init=None, verbose=False, random_state=None)
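A short sketch on random data (the data shape and component count are arbitrary assumptions); components_ holds the sparse loadings:

>>> import numpy as np
>>> from sklearn.decomposition import SparsePCA
>>> rng = np.random.RandomState(0)
>>> X = rng.randn(100, 20)
>>> spca = SparsePCA(n_components=5, alpha=1, random_state=0)
>>> X_new = spca.fit_transform(X)              # codes for each sample
>>> X_new.shape
(100, 5)
>>> spca.components_.shape                     # sparse principal components (loadings)
(5, 20)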