gaussian_process.kernels.ConstantKernel()

class sklearn.gaussian_process.kernels.ConstantKernel(constant_value=1.0, constant_value_bounds=(1e-05, 100000.0)) [source]

Constant kernel. Can be used as part of a product-kernel, where it scales the magnitude of the other factor (kernel), or as part of a sum-kernel, where it modifies the mean of the Gaussian process.

k(x_1, x_2) = constant_value for all x_1, x_2

New in version 0.18.

Parameters:
constant_value : float, default: 1.0
    The constant value which defines the covariance: k(x_1, x_2) = constant_value
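A minimal sketch of the product-kernel use described above; the data and hyperparameter values are illustrative choices, not taken from the docs:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

X = np.linspace(0, 5, 20).reshape(-1, 1)
y = np.sin(X).ravel()

# In a product kernel, ConstantKernel scales the magnitude of the RBF factor.
kernel = ConstantKernel(constant_value=1.0) * RBF(length_scale=1.0)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)
print(gpr.kernel_)  # fitted kernel; the optimized constant appears as a scale factor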

gaussian_process.kernels.CompoundKernel()

class sklearn.gaussian_process.kernels.CompoundKernel(kernels) [source]

Kernel which is composed of a set of other kernels.

New in version 0.18.

Methods:
clone_with_theta(theta)  Returns a clone of self with given hyperparameters theta.
diag(X)  Returns the diagonal of the kernel k(X, X).
get_params([deep])  Get parameters of this kernel.
is_stationary()  Returns whether the kernel is stationary.
set_params(**params)  Set the parameters of this kernel.

__init__(kernels) [source]

bounds  Returns the log-transformed bounds on the theta.
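A short sketch of the methods listed above; the member kernels are arbitrary choices for illustration (CompoundKernel is mostly constructed internally, e.g. by multi-class GPC):

import numpy as np
from sklearn.gaussian_process.kernels import CompoundKernel, RBF, WhiteKernel

X = np.random.RandomState(0).rand(5, 2)

kernel = CompoundKernel([RBF(length_scale=1.0), WhiteKernel(noise_level=0.5)])
print(kernel.is_stationary())  # True: both member kernels are stationary
print(kernel.diag(X).shape)    # one diagonal per member kernel: (5, 2)
print(kernel.theta)            # log-transformed hyperparameters of all members, concatenated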

gaussian_process.GaussianProcessRegressor()

class sklearn.gaussian_process.GaussianProcessRegressor(kernel=None, alpha=1e-10, optimizer='fmin_l_bfgs_b', n_restarts_optimizer=0, normalize_y=False, copy_X_train=True, random_state=None) [source]

Gaussian process regression (GPR). The implementation is based on Algorithm 2.1 of Gaussian Processes for Machine Learning (GPML) by Rasmussen and Williams.

In addition to the standard scikit-learn estimator API, GaussianProcessRegressor:

- allows prediction without prior fitting (based on the GP prior)
- provides an additional method sample_y(X), which evaluates samples drawn from the GPR (prior or posterior) at given inputs
- exposes a method log_marginal_likelihood(theta), which can be used externally for other ways of selecting hyperparameters, e.g., via Markov chain Monte Carlo
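A minimal usage sketch covering the additions listed above, on made-up 1-D data:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.RandomState(0)
X = rng.uniform(0, 5, 20).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 20)

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                               n_restarts_optimizer=5, random_state=0)
gpr.fit(X, y)
y_mean, y_std = gpr.predict(X, return_std=True)  # pointwise predictive mean and std
samples = gpr.sample_y(X, n_samples=3)           # draws from the posterior at X
lml = gpr.log_marginal_likelihood()              # LML of the fitted hyperparameters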

gaussian_process.GaussianProcessClassifier()

class sklearn.gaussian_process.GaussianProcessClassifier(kernel=None, optimizer='fmin_l_bfgs_b', n_restarts_optimizer=0, max_iter_predict=100, warm_start=False, copy_X_train=True, random_state=None, multi_class='one_vs_rest', n_jobs=1) [source]

Gaussian process classification (GPC) based on the Laplace approximation. The implementation is based on Algorithms 3.1, 3.2, and 5.1 of Gaussian Processes for Machine Learning (GPML) by Rasmussen and Williams. Internally, the Laplace approximation is used for approximating the non-Gaussian posterior by a Gaussian.
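A minimal usage sketch; the kernel choice here is an arbitrary default for illustration:

from sklearn.datasets import load_iris
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

X, y = load_iris(return_X_y=True)
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0).fit(X, y)
print(gpc.predict_proba(X[:3]))       # class probabilities via the Laplace approximation
print(gpc.log_marginal_likelihood())  # mean LML over the one-vs-rest estimators here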

gaussian_process.GaussianProcess()

Warning: DEPRECATED

class sklearn.gaussian_process.GaussianProcess(*args, **kwargs) [source]

The legacy Gaussian Process model class.

Deprecated since version 0.18: This class will be removed in 0.20. Use GaussianProcessRegressor instead.

Read more in the User Guide.

Parameters:
regr : string or callable, optional
    A regression function returning an array of outputs of the linear regression functional basis. The number of observations n_samples should be greater than the size p of this basis.
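A hedged migration sketch: the mapping below is an assumption for illustration, not an official equivalence; in particular, treating alpha as the counterpart of the legacy nugget is only approximate:

from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

# Assumed rough analogue of the legacy squared-exponential correlation model:
# an RBF kernel with a constant amplitude factor; alpha plays a role similar
# to the legacy nugget (regularization on the covariance diagonal).
kernel = ConstantKernel(1.0) * RBF(length_scale=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=1e-10)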

Gaussian Processes regression

A simple one-dimensional regression example computed in two different ways: a noise-free case, and a noisy case with a known noise level per datapoint. In both cases, the kernel's parameters are estimated using the maximum likelihood principle. The figures illustrate the interpolating property of the Gaussian Process model as well as its probabilistic nature in the form of a pointwise 95% confidence interval. Note that the parameter alpha is applied as a Tikhonov regularization of the assumed covariance between the training points.
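A sketch of the noisy case under assumed data; the target function and noise levels are illustrative, but passing an array as alpha (one entry per training point) is the documented way to encode a known per-datapoint noise level:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

rng = np.random.RandomState(1)
X = np.atleast_2d(np.linspace(0.1, 9.9, 20)).T
dy = rng.uniform(0.5, 1.5, X.shape[0])           # known noise level per datapoint
y = X.ravel() * np.sin(X.ravel()) + rng.normal(0, dy)

kernel = ConstantKernel(1.0) * RBF(length_scale=10.0)
gpr = GaussianProcessRegressor(kernel=kernel, alpha=dy ** 2,
                               n_restarts_optimizer=10).fit(X, y)
y_pred, sigma = gpr.predict(X, return_std=True)  # 95% band: y_pred +/- 1.96 * sigma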

Gaussian process regression with noise-level estimation

This example illustrates that GPR with a sum-kernel including a WhiteKernel can estimate the noise level of data. An illustration of the log-marginal-likelihood (LML) landscape shows that there exist two local maxima of LML. The first corresponds to a model with a high noise level and a large length scale, which explains all variations in the data by noise. The second one has a smaller noise level and a shorter length scale, which explains most of the variation by the noise-free functional relationship.
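A minimal sketch of the sum-kernel construction on made-up data; after fitting, the WhiteKernel's optimized noise_level is the estimated noise variance:

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.RandomState(0)
X = rng.uniform(0, 5, 25).reshape(-1, 1)
y = 0.5 * np.sin(3 * X.ravel()) + rng.normal(0, 0.3, X.shape[0])

kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)
print(gpr.kernel_)  # fitted sum-kernel; WhiteKernel's noise_level estimates the noise
print(gpr.log_marginal_likelihood(gpr.kernel_.theta))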

Gaussian process regression on Mauna Loa CO2 data.

This example is based on Section 5.4.3 of "Gaussian Processes for Machine Learning" [RW2006]. It illustrates an example of complex kernel engineering and hyperparameter optimization using gradient ascent on the log-marginal-likelihood. The data consists of the monthly average atmospheric CO2 concentrations (in parts per million by volume, ppmv) collected at the Mauna Loa Observatory in Hawaii between 1958 and 1997. The objective is to model the CO2 concentration as a function of the time t.
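A sketch of the kind of composite kernel the example engineers, one additive term per data property; the numeric values below are illustrative placeholders, not the tuned values from the example, since the example optimizes them via the LML:

from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)

long_term = 50.0 ** 2 * RBF(length_scale=50.0)            # smooth rising trend
seasonal = 2.0 ** 2 * RBF(length_scale=100.0) \
    * ExpSineSquared(length_scale=1.0, periodicity=1.0)   # yearly oscillation
irregular = 0.5 ** 2 * RationalQuadratic(alpha=1.0, length_scale=1.0)
noise = 0.1 ** 2 * RBF(length_scale=0.1) + WhiteKernel(noise_level=0.04)

kernel = long_term + seasonal + irregular + noise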

Gaussian process classification on iris dataset

This example illustrates the predicted probability of GPC for an isotropic and an anisotropic RBF kernel on a two-dimensional version of the iris dataset. The anisotropic RBF kernel obtains a slightly higher log-marginal-likelihood by assigning different length-scales to the two feature dimensions.

print(__doc__)

import numpy as np
import matplotlib.pyplot as plt

from sklearn import datasets
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF
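A condensed sketch of the isotropic-versus-anisotropic comparison; the length-scale initializations are illustrative:

import numpy as np
from sklearn import datasets
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

iris = datasets.load_iris()
X = iris.data[:, :2]  # two-dimensional version of the dataset
y = np.array(iris.target, dtype=int)

# Isotropic: one shared length-scale; anisotropic: one length-scale per feature.
gpc_iso = GaussianProcessClassifier(kernel=1.0 * RBF([1.0])).fit(X, y)
gpc_aniso = GaussianProcessClassifier(kernel=1.0 * RBF([1.0, 1.0])).fit(X, y)
print(gpc_iso.log_marginal_likelihood())
print(gpc_aniso.log_marginal_likelihood())  # typically slightly higher here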

Gaussian Mixture Model Sine Curve

This example demonstrates the behavior of Gaussian mixture models fit on data that was not sampled from a mixture of Gaussian random variables. The dataset is formed by 100 points loosely spaced following a noisy sine curve. There is therefore no ground-truth value for the number of Gaussian components. The first model is a classical Gaussian Mixture Model with 10 components fit with the Expectation-Maximization algorithm. The second model is a Bayesian Gaussian Mixture Model with a Dirichlet process prior fit with variational inference.
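A minimal sketch of the two-model comparison on assumed sine-curve data; the exact data generation differs from the example:

import numpy as np
from sklearn.mixture import GaussianMixture, BayesianGaussianMixture

# Noisy sine-curve data: not generated by any Gaussian mixture.
rng = np.random.RandomState(0)
t = 4 * np.pi * rng.rand(100)
X = np.column_stack([t, np.sin(t) + 0.2 * rng.randn(100)])

gmm = GaussianMixture(n_components=10, covariance_type='full',
                      random_state=0).fit(X)  # EM keeps all 10 components active
bgmm = BayesianGaussianMixture(
    n_components=10, covariance_type='full',
    weight_concentration_prior_type='dirichlet_process',
    random_state=0).fit(X)                    # the DP prior prunes unneeded components
print(gmm.weights_.round(2))
print(bgmm.weights_.round(2))                 # many weights shrink toward zero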