Demonstration of k-means assumptions
  • References/Python/scikit-learn/Examples/Clustering

This example is meant to illustrate situations where k-means will produce unintuitive and possibly unexpected clusters. In the first three plots, the input data does not conform to some implicit assumption that k-means makes, and undesirable clusters are produced as a result.
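A minimal sketch of one such situation, assuming scikit-learn is installed; the shear matrix here is illustrative, not the exact transform used in the example. Anisotropic (elongated) blobs violate k-means' implicit assumption of isotropic, similarly sized clusters:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

# Isotropic blobs, then a shear that makes the clusters anisotropic.
X, _ = make_blobs(n_samples=300, centers=3, random_state=170)
X_aniso = X @ np.array([[0.6, -0.6], [-0.4, 0.8]])  # illustrative shear

# k-means still returns 3 clusters, but their boundaries cut across
# the elongated blobs rather than following them.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_aniso)
print(labels.shape)
```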

2025-01-10 15:47:30
1.4. Support Vector Machines
  • References/Python/scikit-learn/Guide

Support vector machines (SVMs) are a set of supervised learning methods used for classification, regression and outliers detection.
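A minimal classification sketch, assuming scikit-learn is installed; the dataset and parameters are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Toy binary classification problem.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)

# SVC is the SVM classifier; kernel="rbf" and C=1.0 are the defaults,
# spelled out here for clarity.
clf = SVC(kernel="rbf", C=1.0).fit(X, y)
pred = clf.predict(X[:5])
print(pred.shape)
```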

Plotting Validation Curves
  • References/Python/scikit-learn/Examples/Model Selection

In this plot you can see the training scores and validation scores of an SVM for different values of the kernel parameter gamma. For very low values of gamma, you can see that both the training score and the validation score are low. This is called underfitting.
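The scores behind such a plot can be computed with `validation_curve`; a sketch assuming scikit-learn is installed, using a digits subset and a gamma range chosen for illustration:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import validation_curve
from sklearn.svm import SVC

# Small subset keeps the 5 x 3 = 15 SVC fits quick.
X, y = load_digits(return_X_y=True)
X, y = X[:500], y[:500]

param_range = np.logspace(-6, -1, 5)  # gamma values to sweep
train_scores, valid_scores = validation_curve(
    SVC(), X, y, param_name="gamma", param_range=param_range, cv=3)

# One row per gamma value, one column per cross-validation fold.
print(train_scores.shape, valid_scores.shape)
```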

Normal and Shrinkage Linear Discriminant Analysis for classification
  • References/Python/scikit-learn/Examples/Classification

Shows how covariance shrinkage improves classification when training samples are scarce relative to the number of features.
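A minimal sketch of shrinkage LDA, assuming scikit-learn is installed; note that `shrinkage` is only supported with the `lsqr` and `eigen` solvers. The dataset shape is illustrative:

```python
from sklearn.datasets import make_blobs
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Few samples relative to features: the empirical covariance estimate
# is poorly conditioned, which is the regime where shrinkage helps.
X, y = make_blobs(n_samples=40, centers=2, n_features=20, random_state=0)

lda = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").fit(X, y)
print(lda.score(X, y))
```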

sklearn.datasets.make_friedman3()
  • References/Python/scikit-learn/API Reference/datasets

sklearn.datasets.make_friedman3(n_samples=100, noise=0.0, random_state=None)
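A short usage sketch; the generator returns a feature matrix with 4 columns and a target computed from them (per the docs, y is an arctangent of a ratio of the inputs, plus Gaussian noise scaled by `noise`):

```python
from sklearn.datasets import make_friedman3

# Default-style call: 100 samples, noiseless, reproducible.
X, y = make_friedman3(n_samples=100, noise=0.0, random_state=0)
print(X.shape, y.shape)
```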

Sparse recovery
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Given a small number of observations, we want to recover which features of X are relevant to explain y. For this, sparse linear models can outperform standard statistical tests if the true model is sparse.
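A minimal sparse-recovery sketch using the Lasso, assuming scikit-learn is installed; the problem size, signal strength, and `alpha` are illustrative choices, not the example's actual settings:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
n_samples, n_features = 50, 200          # far fewer observations than features
X = rng.randn(n_samples, n_features)

true_coef = np.zeros(n_features)
true_coef[:5] = 5.0                      # only the first 5 features are relevant
y = X @ true_coef + 0.01 * rng.randn(n_samples)

# The L1 penalty drives irrelevant coefficients exactly to zero.
lasso = Lasso(alpha=0.1).fit(X, y)
support = np.flatnonzero(lasso.coef_)
print(support)
```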

HuberRegressor vs Ridge on dataset with strong outliers
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions in ridge are strongly influenced by the outliers present in the dataset.
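A minimal comparison sketch, assuming scikit-learn is installed; the synthetic data (slope 3, four large outliers at the high end) is illustrative, not the example's dataset:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, Ridge

# Clean linear trend y = 3x, plus a few large outliers at the high end.
X = np.linspace(0, 10, 40).reshape(-1, 1)
y = 3.0 * X.ravel()
y[-4:] += 100.0

# Huber loss bounds each residual's influence; squared loss does not.
huber = HuberRegressor().fit(X, y)
ridge = Ridge().fit(X, y)

# Huber's slope stays near the true 3; ridge's is pulled up by the outliers.
print(huber.coef_[0], ridge.coef_[0])
```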

Receiver Operating Characteristic
  • References/Python/scikit-learn/Examples/Model Selection

Example of Receiver Operating Characteristic (ROC) metric to evaluate classifier output quality. ROC curves typically feature true positive rate on the Y axis, and false positive rate on the X axis.
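A minimal sketch of computing a ROC curve and its area, assuming scikit-learn is installed; the four-point toy labels and scores are illustrative:

```python
import numpy as np
from sklearn.metrics import auc, roc_curve

y_true = np.array([0, 0, 1, 1])            # ground-truth labels
y_score = np.array([0.1, 0.4, 0.35, 0.8])  # classifier decision scores

# roc_curve returns false positive rates, true positive rates,
# and the score thresholds at which they were computed.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
roc_auc = auc(fpr, tpr)
print(roc_auc)  # 3 of 4 positive/negative pairs are ranked correctly: 0.75
```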

covariance.LedoitWolf()
  • References/Python/scikit-learn/API Reference/covariance

class sklearn.covariance.LedoitWolf(store_precision=True, assume_centered=False, block_size=1000)
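A short usage sketch of the Ledoit-Wolf shrunk covariance estimator; the random Gaussian data is illustrative:

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.RandomState(0)
X = rng.randn(50, 5)                       # 50 samples, 5 features

# Fitting estimates a shrunk covariance matrix; with store_precision=True
# (the default) the inverse is kept as well.
lw = LedoitWolf(store_precision=True).fit(X)
print(lw.covariance_.shape, lw.shrinkage_)
```

The `shrinkage_` attribute is the estimated shrinkage coefficient, a value in [0, 1] blending the empirical covariance toward a scaled identity.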

1.5. Stochastic Gradient Descent
  • References/Python/scikit-learn/Guide

Stochastic Gradient Descent (SGD) is a simple yet very efficient approach to discriminative learning of linear classifiers under convex loss functions such as (linear) Support Vector Machines and Logistic Regression.
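A minimal sketch with `SGDClassifier`, assuming a recent scikit-learn; the dataset and hyperparameters are illustrative:

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=200, random_state=0)

# loss="hinge" trains a linear SVM; loss="log_loss" would train
# logistic regression with the same SGD machinery.
clf = SGDClassifier(loss="hinge", max_iter=1000, tol=1e-3,
                    random_state=0).fit(X, y)
print(clf.score(X, y))
```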
