neighbors.DistanceMetric

class sklearn.neighbors.DistanceMetric

This class provides a uniform interface to fast distance metric functions. The various metrics can be accessed via the get_metric class method and the metric string identifier (see below). For example, to use the Euclidean distance:

>>> dist = DistanceMetric.get_metric('euclidean')
>>> X = [[0, 1, 2], [3, 4, 5]]
>>> dist.pairwise(X)
array([[ 0.        ,  5.19615242],
       [ 5.19615242,  0.        ]])
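For contrast, a minimal sketch querying a second built-in metric; the array values are illustrative, and 'manhattan' is another of the supported metric strings:

import numpy as np
from sklearn.neighbors import DistanceMetric

X = np.array([[0, 1, 2], [3, 4, 5]])
manhattan = DistanceMetric.get_metric('manhattan')
# L1 distance between the two rows: |0-3| + |1-4| + |2-5| = 9
print(manhattan.pairwise(X))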

Nearest Neighbors regression

Demonstrates the resolution of a regression problem using k-Nearest Neighbors, interpolating the target with both barycenter and constant weights.

print(__doc__)

# Author: Alexandre Gramfort <alexandre.gramfort@inria.fr>
#         Fabian Pedregosa <fabian.pedregosa@inria.fr>
#
# License: BSD 3 clause (C) INRIA

# Generate sample data
import numpy as np
import matplotlib.pyplot as plt
from sklearn import neighbors

np.random.seed(0)
X = np.sort(5 * np.random.rand(40, 1), axis=0)
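The excerpt stops at the data-generation step. As a condensed sketch of where the example goes from there (the noisy sine target and the uniform/distance weight comparison follow the published example, but this is a reconstruction, not the original listing):

import numpy as np
from sklearn import neighbors

np.random.seed(0)
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 1 * (0.5 - np.random.rand(8))   # add noise to every 5th target

T = np.linspace(0, 5, 500)[:, np.newaxis]  # dense test inputs

for weights in ('uniform', 'distance'):
    # 'uniform' = constant weights; 'distance' = barycenter-style weighting
    knn = neighbors.KNeighborsRegressor(n_neighbors=5, weights=weights)
    y_pred = knn.fit(X, y).predict(T)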

Nearest Neighbors Classification

Sample usage of Nearest Neighbors classification. It will plot the decision boundaries for each class.

print(__doc__)

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
from sklearn import neighbors, datasets

n_neighbors = 15

# import some data to play with
iris = datasets.load_iris()
X = iris.data[:, :2]  # we only take the first two features. We could
                      # avoid this ugly slicing by using a two-dim dataset
y = iris.target
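The excerpt ends before the classifier is fitted; a condensed sketch of that step (the uniform/distance loop mirrors the published example, while the score printout is added here only as a sanity check):

from sklearn import neighbors, datasets

n_neighbors = 15
iris = datasets.load_iris()
X = iris.data[:, :2]
y = iris.target

for weights in ('uniform', 'distance'):
    clf = neighbors.KNeighborsClassifier(n_neighbors=n_neighbors,
                                         weights=weights)
    clf.fit(X, y)
    print(weights, clf.score(X, y))   # training accuracy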

Nearest Centroid Classification

Sample usage of Nearest Centroid classification. It will plot the decision boundaries for each class.

Out:

None 0.813333333333
0.2 0.82

print(__doc__)

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
from sklearn import datasets
from sklearn.neighbors import NearestCentroid

n_neighbors = 15

# import some data to play with
iris = datasets.load_iris()
X = iris.data[:, :2]  # we only take the first two features. We could
                      # avoid this ugly slicing by using a two-dim dataset
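Each output line is a shrink_threshold value followed by the resulting training accuracy. A condensed sketch of the loop that would produce them (a reconstruction matching the output shown, not the original listing):

from sklearn import datasets
from sklearn.neighbors import NearestCentroid

iris = datasets.load_iris()
X = iris.data[:, :2]
y = iris.target

for shrinkage in (None, 0.2):
    clf = NearestCentroid(shrink_threshold=shrinkage)
    clf.fit(X, y)
    print(shrinkage, clf.score(X, y))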

naive_bayes.MultinomialNB()

class sklearn.naive_bayes.MultinomialNB(alpha=1.0, fit_prior=True, class_prior=None)

Naive Bayes classifier for multinomial models. The multinomial Naive Bayes classifier is suitable for classification with discrete features (e.g., word counts for text classification). The multinomial distribution normally requires integer feature counts; however, in practice, fractional counts such as tf-idf may also work. Read more in the User Guide.

Parameters:

alpha : float, optional (default=1.0)
    Additive (Laplace/Lidstone) smoothing parameter (0 for no smoothing).
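A short usage sketch along the lines of the class docstring example, with random integer counts standing in for word counts:

import numpy as np
from sklearn.naive_bayes import MultinomialNB

rng = np.random.RandomState(1)
X = rng.randint(5, size=(6, 100))   # 6 "documents", 100 count features
y = np.array([1, 2, 3, 4, 5, 6])

clf = MultinomialNB(alpha=1.0)
clf.fit(X, y)
print(clf.predict(X[2:3]))          # recovers the third sample's class: [3]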

naive_bayes.GaussianNB()

class sklearn.naive_bayes.GaussianNB(priors=None)

Gaussian Naive Bayes (GaussianNB). Can perform online updates to model parameters via the partial_fit method. For details on the algorithm used to update feature means and variances online, see the Stanford CS tech report STAN-CS-79-773 by Chan, Golub, and LeVeque: http://i.stanford.edu/pub/cstr/reports/cs/tr/79/773/CS-TR-79-773.pdf Read more in the User Guide.

Parameters:

priors : array-like, shape (n_classes,)
    Prior probabilities of the classes.
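A compact sketch of both fitting modes; the toy data follows the class docstring example:

import numpy as np
from sklearn.naive_bayes import GaussianNB

X = np.array([[-1, -1], [-2, -1], [-3, -2], [1, 1], [2, 1], [3, 2]])
y = np.array([1, 1, 1, 2, 2, 2])

clf = GaussianNB()
clf.fit(X, y)
print(clf.predict([[-0.8, -1]]))     # -> [1]

# Online updating: classes must be passed on the first partial_fit call
clf_pf = GaussianNB()
clf_pf.partial_fit(X, y, np.unique(y))
print(clf_pf.predict([[-0.8, -1]]))  # -> [1]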

naive_bayes.BernoulliNB()

class sklearn.naive_bayes.BernoulliNB(alpha=1.0, binarize=0.0, fit_prior=True, class_prior=None)

Naive Bayes classifier for multivariate Bernoulli models. Like MultinomialNB, this classifier is suitable for discrete data. The difference is that while MultinomialNB works with occurrence counts, BernoulliNB is designed for binary/boolean features. Read more in the User Guide.

Parameters:

alpha : float, optional (default=1.0)
    Additive (Laplace/Lidstone) smoothing parameter (0 for no smoothing).
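A short sketch in the spirit of the class docstring example; the random binary features are illustrative. With binarize=0.0, any feature value above zero is mapped to 1 before fitting:

import numpy as np
from sklearn.naive_bayes import BernoulliNB

rng = np.random.RandomState(1)
X = rng.randint(2, size=(6, 100))    # 6 samples of 100 binary features
y = np.array([1, 2, 3, 4, 4, 5])

clf = BernoulliNB(alpha=1.0, binarize=0.0)
clf.fit(X, y)
print(clf.predict(X[2:3]))           # very likely [3], the third sample's class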

multioutput.MultiOutputRegressor()

class sklearn.multioutput.MultiOutputRegressor(estimator, n_jobs=1)

Multi target regression. This strategy consists of fitting one regressor per target. It is a simple strategy for extending regressors that do not natively support multi-target regression.

Parameters:

estimator : estimator object
    An estimator object implementing fit and predict.
n_jobs : int, optional, default=1
    The number of jobs to run in parallel for fit. If -1, then the number of jobs is set to the number of cores.
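An illustrative sketch; the synthetic targets and the choice of GradientBoostingRegressor as the wrapped estimator are assumptions for the demo, not taken from the text above:

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.RandomState(0)
X = rng.rand(100, 4)
Y = np.column_stack([X.sum(axis=1), X[:, 0] - X[:, 1]])  # two targets

# One GradientBoostingRegressor is fitted per target column
reg = MultiOutputRegressor(GradientBoostingRegressor(random_state=0))
reg.fit(X, Y)
print(reg.predict(X[:2]).shape)      # (2, 2): one prediction per target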

multioutput.MultiOutputClassifier()

class sklearn.multioutput.MultiOutputClassifier(estimator, n_jobs=1)

Multi target classification. This strategy consists of fitting one classifier per target. It is a simple strategy for extending classifiers that do not natively support multi-target classification.

Parameters:

estimator : estimator object
    An estimator object implementing fit, score and predict_proba.
n_jobs : int, optional, default=1
    The number of jobs to use for the computation. If -1 all CPUs are used. If 1 is given, no parallel computing code is used at all, which is useful for debugging.
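An illustrative sketch; the derived second label and the RandomForestClassifier base estimator are assumptions for the demo:

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.multioutput import MultiOutputClassifier

X, y1 = make_classification(n_samples=100, n_informative=5, random_state=0)
y2 = (y1 + 1) % 2                    # a second, derived binary target
Y = np.column_stack([y1, y2])

# One RandomForestClassifier is fitted per target column
clf = MultiOutputClassifier(RandomForestClassifier(random_state=0), n_jobs=1)
clf.fit(X, Y)
print(clf.predict(X[:3]))            # one column of labels per target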

multiclass.OutputCodeClassifier()

class sklearn.multiclass.OutputCodeClassifier(estimator, code_size=1.5, random_state=None, n_jobs=1)

(Error-Correcting) Output-Code multiclass strategy. Output-code based strategies consist in representing each class with a binary code (an array of 0s and 1s). At fitting time, one binary classifier per bit in the code book is fitted. At prediction time, the classifiers are used to project new points into the class space, and the class closest to the points is chosen. The main advantage of these strategies is that the number of classifiers used can be controlled by the user, either for compressing the model (0 < code_size < 1) or for making the model more robust to errors (code_size > 1).
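A minimal sketch on the iris data; using LinearSVC as the base binary estimator is an illustrative choice:

from sklearn.datasets import load_iris
from sklearn.multiclass import OutputCodeClassifier
from sklearn.svm import LinearSVC

iris = load_iris()
# code_size=1.5 means the code book has 1.5 * n_classes bits per class
clf = OutputCodeClassifier(LinearSVC(random_state=0),
                           code_size=1.5, random_state=0)
clf.fit(iris.data, iris.target)
print(clf.predict(iris.data[:5]))    # predicted classes for the first 5 rows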