cluster.FeatureAgglomeration()
  • References/Python/scikit-learn/API Reference/cluster

class sklearn.cluster.FeatureAgglomeration(n_clusters=2, affinity='euclidean', memory=Memory(cachedir=None), connectivity=None, ...)

2025-01-10 15:47:30
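A minimal usage sketch, assuming scikit-learn and NumPy are available; the random toy data and n_clusters=3 are illustrative choices, not part of the reference entry.

import numpy as np
from sklearn.cluster import FeatureAgglomeration

X = np.random.RandomState(0).rand(50, 10)   # 50 samples, 10 features
agglo = FeatureAgglomeration(n_clusters=3)  # merge the 10 features into 3 clusters
X_reduced = agglo.fit_transform(X)          # shape (50, 3)
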
gaussian_process.kernels.WhiteKernel()
  • References/Python/scikit-learn/API Reference/gaussian_process

class sklearn.gaussian_process.kernels.WhiteKernel(noise_level=1.0, noise_level_bounds=(1e-05, 100000.0))

2025-01-10 15:47:30
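A short sketch of the typical use: WhiteKernel models i.i.d. observation noise and is usually added to another kernel inside a GaussianProcessRegressor. The RBF kernel and the toy sine data below are illustrative assumptions.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

X = np.linspace(0, 10, 20).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.RandomState(0).randn(20)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)  # noise_level is tuned during fit
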
sklearn.datasets.load_svmlight_files()
  • References/Python/scikit-learn/API Reference/datasets

sklearn.datasets.load_svmlight_files(files, n_features=None, dtype=<class 'numpy.float64'>, multilabel=False, zero_based='auto', query_id=False)

2025-01-10 15:47:30
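A hedged sketch: the function loads several svmlight/libsvm files into a shared feature space and returns the (X, y) pairs in file order. The file names below are placeholders, not real paths.

from sklearn.datasets import load_svmlight_files

# "train.txt" and "test.txt" are placeholder svmlight-format files.
X_train, y_train, X_test, y_test = load_svmlight_files(["train.txt", "test.txt"])
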
naive_bayes.GaussianNB()
  • References/Python/scikit-learn/API Reference/naive_bayes

class sklearn.naive_bayes.GaussianNB(priors=None)

2025-01-10 15:47:30
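A minimal sketch on the bundled iris dataset, chosen here only for illustration:

from sklearn.datasets import load_iris
from sklearn.naive_bayes import GaussianNB

iris = load_iris()
X, y = iris.data, iris.target
clf = GaussianNB().fit(X, y)   # estimates a per-class mean and variance for each feature
print(clf.predict(X[:2]))
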
dummy.DummyRegressor()
  • References/Python/scikit-learn/API Reference/dummy

class sklearn.dummy.DummyRegressor(strategy='mean', constant=None, quantile=None)

2025-01-10 15:47:30
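A small sketch: DummyRegressor is a baseline that ignores the features and, with the default strategy='mean', always predicts the mean of the training targets. The toy arrays are illustrative.

import numpy as np
from sklearn.dummy import DummyRegressor

X = np.zeros((4, 1))                 # features are ignored by the dummy model
y = np.array([1.0, 2.0, 3.0, 4.0])
baseline = DummyRegressor(strategy='mean').fit(X, y)
print(baseline.predict(X))           # [2.5 2.5 2.5 2.5]
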
sklearn.metrics.pairwise.laplacian_kernel()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.pairwise.laplacian_kernel(X, Y=None, gamma=None)

2025-01-10 15:47:30
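A sketch of the computation, K(x, y) = exp(-gamma * ||x - y||_1), where gamma defaults to 1 / n_features when left as None; the sample matrices are illustrative.

import numpy as np
from sklearn.metrics.pairwise import laplacian_kernel

X = np.array([[0.0, 1.0], [1.0, 0.0]])
Y = np.array([[1.0, 1.0]])
K = laplacian_kernel(X, Y)   # shape (2, 1); K[i, j] = exp(-gamma * L1 distance)
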
Gaussian Mixture Model Selection
  • References/Python/scikit-learn/Examples/Gaussian Mixture Models

This example shows that model selection can be performed with Gaussian Mixture Models using information-theoretic criteria (BIC). Model selection concerns both the covariance type and the number of components in the model.

2025-01-10 15:47:30
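A condensed sketch of the idea, not the full example (which also compares covariance types and plots the results): fit mixtures with different component counts and keep the one with the lowest BIC. The GaussianMixture class and random toy data are assumptions here.

import numpy as np
from sklearn.mixture import GaussianMixture

X = np.random.RandomState(0).randn(300, 2)
candidates = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 6)]
best = min(candidates, key=lambda gmm: gmm.bic(X))   # lower BIC is better
print(best.n_components)
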
sklearn.metrics.r2_score()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.r2_score(y_true, y_pred, sample_weight=None, multioutput=None)

2025-01-10 15:47:30
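A quick sketch: R^2 (the coefficient of determination) is 1.0 for a perfect fit and 0.0 for a model that always predicts the mean of y_true. The values below are toy data.

from sklearn.metrics import r2_score

y_true = [3.0, -0.5, 2.0, 7.0]
y_pred = [2.5, 0.0, 2.0, 8.0]
print(r2_score(y_true, y_pred))   # about 0.949
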
Bayesian Ridge Regression
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Computes a Bayesian Ridge Regression on a synthetic dataset. See Bayesian Ridge Regression for more information on the regressor.

2025-01-10 15:47:30
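A minimal sketch of the estimator the example is built around, fit on a small synthetic linear problem; the data-generation details here are illustrative assumptions, and the real example adds comparisons and plots.

import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.RandomState(0)
X = rng.randn(100, 5)
w = np.array([1.0, 0.0, -2.0, 0.5, 0.0])   # true weights, some exactly zero
y = X.dot(w) + 0.1 * rng.randn(100)

reg = BayesianRidge().fit(X, y)
print(reg.coef_)                            # shrunk estimates of w
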
preprocessing.KernelCenterer
  • References/Python/scikit-learn/API Reference/preprocessing

class sklearn.preprocessing.KernelCenterer

2025-01-10 15:47:30
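A short sketch: KernelCenterer centers a precomputed Gram matrix, which is equivalent to centering the implicitly mapped data in feature space (as done before kernel PCA). The linear kernel and random data are illustrative.

import numpy as np
from sklearn.metrics.pairwise import linear_kernel
from sklearn.preprocessing import KernelCenterer

X = np.random.RandomState(0).rand(5, 3)
K = linear_kernel(X)                           # Gram matrix of X with itself
K_centered = KernelCenterer().fit_transform(K)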