Sparse inverse covariance estimation
  • References/Python/scikit-learn/Examples/Covariance estimation

Using the GraphLasso estimator to learn a covariance and sparse precision from a small number of samples. To estimate a probabilistic model (e.g. a Gaussian model), estimating the precision matrix, that is the inverse covariance matrix, is as important as estimating the covariance matrix.
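
A minimal sketch of the estimator in use, assuming the pre-0.20 import name (later releases rename it GraphicalLasso) and a toy Gaussian sample:

    import numpy as np
    from sklearn.covariance import GraphLasso  # renamed GraphicalLasso in 0.20+

    rng = np.random.RandomState(0)
    X = rng.multivariate_normal(mean=np.zeros(5), cov=np.eye(5), size=60)

    model = GraphLasso(alpha=0.05)  # alpha controls sparsity of the precision
    model.fit(X)
    cov = model.covariance_    # estimated covariance matrix
    prec = model.precision_    # estimated sparse inverse covariance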

Pipeline Anova SVM
  • References/Python/scikit-learn/Examples/Feature Selection

Simple usage of Pipeline that successively runs a univariate feature selection with ANOVA and then a C-SVM on the selected features.
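
A minimal sketch of such a pipeline on synthetic data (the step names and k=5 are illustrative choices, not the example's exact settings):

    from sklearn.datasets import make_classification
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.pipeline import Pipeline
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=100, n_features=20, random_state=0)

    # ANOVA F-test keeps the k best features, then a C-SVM is fit on them
    anova_svm = Pipeline([
        ('anova', SelectKBest(f_classif, k=5)),
        ('svc', SVC(kernel='linear', C=1.0)),
    ])
    anova_svm.fit(X, y)
    print(anova_svm.score(X, y))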

sklearn.cluster.ward_tree()
  • References/Python/scikit-learn/API Reference/cluster

sklearn.cluster.ward_tree(X, connectivity=None, n_clusters=None, return_distance=False)
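
A minimal sketch of a call without connectivity constraints; with return_distance left at False, the four return values assumed here follow this signature's documentation:

    import numpy as np
    from sklearn.cluster import ward_tree

    X = np.random.RandomState(0).rand(10, 3)

    # Unstructured Ward tree: parents is None when connectivity is None
    children, n_components, n_leaves, parents = ward_tree(X)
    print(children.shape)  # (n_samples - 1, 2): node pairs merged at each step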

sklearn.datasets.fetch_covtype()
  • References/Python/scikit-learn/API Reference/datasets

sklearn.datasets.fetch_covtype(data_home=None, download_if_missing=True, random_state=None, shuffle=False)
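
A minimal sketch of fetching the forest covertypes dataset (the first call downloads it to data_home, so it needs network access):

    from sklearn.datasets import fetch_covtype

    covtype = fetch_covtype(download_if_missing=True, shuffle=False)
    print(covtype.data.shape)    # (581012, 54) feature matrix
    print(covtype.target.shape)  # (581012,) cover-type labels 1..7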

model_selection.LeaveOneOut
  • References/Python/scikit-learn/API Reference/model_selection

class sklearn.model_selection.LeaveOneOut
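
A minimal sketch of the splitter; the class takes no constructor arguments and yields one test sample per split:

    import numpy as np
    from sklearn.model_selection import LeaveOneOut

    X = np.array([[1, 2], [3, 4], [5, 6], [7, 8]])
    loo = LeaveOneOut()
    print(loo.get_n_splits(X))  # one split per sample: 4

    for train_index, test_index in loo.split(X):
        print(train_index, test_index)  # each test fold holds exactly one sample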

kernel_approximation.RBFSampler()
  • References/Python/scikit-learn/API Reference/kernel_approximation

class sklearn.kernel_approximation.RBFSampler(gamma=1.0, n_components=100, random_state=None)
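
A minimal sketch pairing the approximate RBF feature map with a linear classifier (SGDClassifier here is an illustrative choice):

    from sklearn.kernel_approximation import RBFSampler
    from sklearn.linear_model import SGDClassifier

    X = [[0, 0], [1, 1], [1, 0], [0, 1]]
    y = [0, 0, 1, 1]

    # Monte Carlo approximation of an RBF-kernel feature map
    rbf_feature = RBFSampler(gamma=1.0, n_components=100, random_state=1)
    X_features = rbf_feature.fit_transform(X)

    clf = SGDClassifier()
    clf.fit(X_features, y)
    print(clf.score(X_features, y))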

gaussian_process.kernels.PairwiseKernel()
  • References/Python/scikit-learn/API Reference/gaussian_process

class sklearn.gaussian_process.kernels.PairwiseKernel(gamma=1.0, gamma_bounds=(1e-05, 100000.0), metric='linear', pairwise_kernels_kwargs=None)
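
A minimal sketch of evaluating the kernel directly; it is a thin wrapper around the kernels in sklearn.metrics.pairwise (the 'rbf' metric here is an illustrative choice):

    import numpy as np
    from sklearn.gaussian_process.kernels import PairwiseKernel

    X = np.linspace(0, 5, 20).reshape(-1, 1)

    kernel = PairwiseKernel(gamma=1.0, metric='rbf')
    K = kernel(X)      # (20, 20) Gram matrix computed via pairwise_kernels
    print(K.shape)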

SVM with custom kernel
  • References/Python/scikit-learn/Examples/Support Vector Machines

Simple usage of Support Vector Machines to classify a sample. It will plot the decision surface and the support vectors.
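
A minimal sketch in the spirit of the example: SVC accepts a Python callable as the kernel (the weighting matrix M is illustrative):

    import numpy as np
    from sklearn import datasets, svm

    def my_kernel(X, Y):
        """A simple weighted linear kernel: k(X, Y) = X M Y^T."""
        M = np.array([[2.0, 0.0], [0.0, 1.0]])
        return np.dot(np.dot(X, M), Y.T)

    iris = datasets.load_iris()
    X = iris.data[:, :2]  # first two features, for a 2-D decision surface
    y = iris.target

    clf = svm.SVC(kernel=my_kernel).fit(X, y)
    print(clf.score(X, y))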

3.4. Model persistence
  • References/Python/scikit-learn/Guide

After training a scikit-learn model, it is desirable to have a way to persist the model for future use without having to retrain. The following section gives you an example of how to persist a model with pickle.
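
A minimal sketch with the standard-library pickle module (joblib.dump/joblib.load is the commonly suggested alternative for models holding large NumPy arrays):

    import pickle
    from sklearn import datasets, svm

    X, y = datasets.load_iris(return_X_y=True)
    clf = svm.SVC().fit(X, y)

    # Serialize the trained model, then restore it without retraining
    s = pickle.dumps(clf)
    clf2 = pickle.loads(s)
    print(clf2.predict(X[:3]))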

Plot Ridge coefficients as a function of the L2 regularization
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Ridge Regression is the estimator used in this example. Each color in the left plot represents one different dimension of the coefficient vector, and this is displayed as a function of the regularization parameter. The right plot shows how exact the solution is. This example illustrates how a well defined solution is found by Ridge regression and how regularization affects the coefficients and their values. The plot on the right shows how the difference of the coefficients from the estimator changes as a function of regularization.
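
A minimal sketch of the underlying loop: fit Ridge over a grid of regularization strengths and record both the coefficients and their distance from the known generating weights (the data-generation parameters here are illustrative):

    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Ridge

    # coef=True also returns the true generating coefficients w
    X, y, w = make_regression(n_samples=10, n_features=10, coef=True,
                              random_state=1, bias=3.5)

    alphas = np.logspace(-6, 6, 200)
    coefs, errors = [], []
    for alpha in alphas:
        clf = Ridge(alpha=alpha)
        clf.fit(X, y)
        coefs.append(clf.coef_)                       # left plot: coefficient paths
        errors.append(np.mean((clf.coef_ - w) ** 2))  # right plot: distance from w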
