On the left side the learning curve of a naive Bayes classifier is shown for the digits dataset. Note that the training score and the cross-validation score are both not very good at the end.
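A minimal sketch of computing such a learning curve numerically with learning_curve; the cv=5 folds and the training-size grid are illustrative choices, not part of the original example.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import learning_curve
from sklearn.naive_bayes import GaussianNB

X, y = load_digits(return_X_y=True)

# Score the naive Bayes model over increasing training-set sizes.
train_sizes, train_scores, test_scores = learning_curve(
    GaussianNB(), X, y, cv=5, train_sizes=np.linspace(0.1, 1.0, 5))

print("train:", train_scores.mean(axis=1))
print("cv:   ", test_scores.mean(axis=1))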
sklearn.datasets.make_checkerboard(shape, n_clusters, noise=0.0, minval=10, maxval=100, shuffle=True, random_state=None)
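A usage sketch assuming the signature above; the matrix shape, the (4, 3) cluster grid, and the noise level are arbitrary example values.

from sklearn.datasets import make_checkerboard

# Generate a 300x300 matrix containing a 4x3 checkerboard of biclusters.
data, rows, columns = make_checkerboard(
    shape=(300, 300), n_clusters=(4, 3), noise=10, random_state=0)

print(data.shape)     # (300, 300)
print(rows.shape)     # one boolean row indicator per bicluster
print(columns.shape)  # one boolean column indicator per bicluster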
class sklearn.decomposition.LatentDirichletAllocation(n_topics=10, doc_topic_prior=None, topic_word_prior=None, learning_method=None, ...)
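A small sketch of fitting a topic model on a toy corpus; the documents and n_topics=2 are invented for illustration. The n_topics parameter matches the signature shown here; later scikit-learn releases renamed it to n_components.

from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

docs = [
    "apples and oranges are fruit",
    "oranges and lemons are citrus fruit",
    "dogs and cats are pets",
    "cats chase dogs",
]

# Bag-of-words counts, then a 2-topic model with online variational updates.
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_topics=2, learning_method='online',
                                random_state=0)
doc_topics = lda.fit_transform(counts)
print(doc_topics.round(2))  # per-document topic mixture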
class sklearn.discriminant_analysis.LinearDiscriminantAnalysis(solver='svd', shrinkage=None, priors=None, n_components=None, store_covariance=False, tol=0.0001)
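A minimal usage sketch; the iris dataset is an arbitrary choice for demonstration.

from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

# Default SVD solver; shrinkage is only supported by 'lsqr' and 'eigen'.
clf = LinearDiscriminantAnalysis(solver='svd')
clf.fit(X, y)
print(clf.predict(X[:5]))
print(clf.score(X, y))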
sklearn.metrics.pairwise.cosine_similarity(X, Y=None, dense_output=True)
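A short sketch with made-up vectors showing the two calling conventions.

import numpy as np
from sklearn.metrics.pairwise import cosine_similarity

X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# With Y=None, similarity is computed between the rows of X itself.
print(cosine_similarity(X))          # 2x2 matrix, diagonal == 1
print(cosine_similarity(X, X[:1]))   # rows of X vs. the first row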
This example illustrates how sigmoid calibration changes predicted probabilities for a 3-class classification problem. Illustrated is the standard 2-simplex, where the three corners correspond to the three classes.
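A reduced sketch of the same idea using CalibratedClassifierCV; the blob data and the GaussianNB base estimator are illustrative stand-ins, not the example's actual setup.

from sklearn.calibration import CalibratedClassifierCV
from sklearn.datasets import make_blobs
from sklearn.naive_bayes import GaussianNB

# Toy 3-class data; any probabilistic classifier could be calibrated.
X, y = make_blobs(n_samples=300, centers=3, random_state=42)

raw = GaussianNB().fit(X, y)
calibrated = CalibratedClassifierCV(GaussianNB(), method='sigmoid', cv=3)
calibrated.fit(X, y)

print(raw.predict_proba(X[:3]).round(2))         # uncalibrated probabilities
print(calibrated.predict_proba(X[:3]).round(2))  # sigmoid-calibrated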
class sklearn.cluster.Birch(threshold=0.5, branching_factor=50, n_clusters=3, compute_labels=True, copy=True)
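A minimal usage sketch with synthetic blob data; the dataset is an assumption for illustration.

from sklearn.cluster import Birch
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=500, centers=3, random_state=0)

# CF-tree parameters follow the defaults shown above; n_clusters=3
# triggers a final global clustering step over the subclusters.
brc = Birch(threshold=0.5, branching_factor=50, n_clusters=3)
labels = brc.fit_predict(X)
print(labels[:10])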
class sklearn.tree.DecisionTreeClassifier(criterion='gini', splitter='best', max_depth=None, min_samples_split=2, min_samples_leaf=1, ...)
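A small usage sketch; iris data, the max_depth=3 cap, and 5-fold scoring are illustrative choices.

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Gini impurity and best-first splitting are the defaults; depth
# is capped here only to keep the tree small.
clf = DecisionTreeClassifier(criterion='gini', max_depth=3, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())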
This example shows how a classifier is optimized by cross-validation, which is done using the sklearn.model_selection.GridSearchCV object on a development set that comprises only half of the available labeled data.
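A condensed sketch of that workflow; the SVC estimator, the parameter grid, and the 50/50 split are illustrative and may differ from the full example.

from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)

# Tune hyperparameters on the development half, then report on the
# held-out half that the search never saw.
X_dev, X_test, y_dev, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

params = {'C': [1, 10, 100], 'gamma': [0.001, 0.0001]}
search = GridSearchCV(SVC(kernel='rbf'), params, cv=5)
search.fit(X_dev, y_dev)

print(search.best_params_)
print(search.score(X_test, y_test))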
sklearn.metrics.adjusted_rand_score(labels_true, labels_pred)
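A short sketch with made-up labelings showing that the score ignores label permutations.

from sklearn.metrics import adjusted_rand_score

labels_true = [0, 0, 1, 1, 2, 2]
labels_pred = [1, 1, 0, 0, 2, 2]  # same grouping, permuted label names

# ARI is permutation-invariant and chance-corrected: identical
# partitions score 1.0, random ones score near 0.0.
print(adjusted_rand_score(labels_true, labels_pred))  # 1.0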