Plot Ridge coefficients as a function of the regularization strength

Shows the effect of collinearity on the coefficients of an estimator.

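A minimal sketch of the idea, assuming scikit-learn's Ridge estimator: fit over a range of alpha values on a deliberately ill-conditioned (collinear) design and plot how the coefficients evolve. The Hilbert-matrix data and alpha range are illustrative choices, not necessarily the gallery's exact values.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import Ridge

# 10x10 Hilbert matrix: a classically ill-conditioned (collinear) design.
X = 1.0 / (np.arange(1, 11) + np.arange(0, 10)[:, np.newaxis])
y = np.ones(10)

alphas = np.logspace(-10, -2, 200)
coefs = []
for alpha in alphas:
    ridge = Ridge(alpha=alpha, fit_intercept=False)
    ridge.fit(X, y)
    coefs.append(ridge.coef_)

# Each curve is one coefficient as alpha grows; strong regularization
# shrinks them all toward zero, taming the collinearity-driven swings.
plt.plot(alphas, coefs)
plt.xscale("log")
plt.xlabel("alpha")
plt.ylabel("coefficients")
plt.title("Ridge coefficients as a function of the regularization")
plt.show()
```
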
SGD: Maximum margin separating hyperplane

Plot the maximum-margin separating hyperplane in a two-class separable dataset, using a linear Support Vector Machine classifier trained with SGD.

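A hedged sketch of how such a plot can be produced: SGDClassifier with hinge loss behaves like a linear SVM, and its decision function is drawn over a grid. The blob parameters below are illustrative, not the example's exact settings.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.linear_model import SGDClassifier

# Two well-separated blobs; hinge loss makes SGDClassifier a linear SVM.
X, y = make_blobs(n_samples=50, centers=2, random_state=0, cluster_std=0.60)

clf = SGDClassifier(loss="hinge", alpha=0.01, max_iter=200)
clf.fit(X, y)

# Evaluate the decision function on a grid; the 0-level is the hyperplane,
# the -1/+1 levels mark the margins.
xx = np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 50)
yy = np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 50)
X1, X2 = np.meshgrid(xx, yy)
Z = clf.decision_function(np.c_[X1.ravel(), X2.ravel()]).reshape(X1.shape)

plt.contour(X1, X2, Z, levels=[-1, 0, 1],
            linestyles=["dashed", "solid", "dashed"])
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
plt.show()
```
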
Plot multi-class SGD on the iris dataset

Plot the decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are represented by the dashed lines.

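A minimal sketch under the same setup: train SGDClassifier on the first two iris features (so the surface can be drawn in 2-D), shade the predicted class over a grid, and draw each OVA hyperplane from coef_ and intercept_. Hyperparameters here are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.linear_model import SGDClassifier

# Use only the first two iris features so the decision surface is 2-D.
iris = load_iris()
X, y = iris.data[:, :2], iris.target

clf = SGDClassifier(alpha=0.001, max_iter=100).fit(X, y)

# Decision surface: predict the class over a grid covering the data.
xx, yy = np.meshgrid(np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
                     np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)

plt.contourf(xx, yy, Z, alpha=0.3)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")

# Each row of coef_ / intercept_ is one OVA hyperplane: w . x + b = 0.
xs = np.linspace(X[:, 0].min(), X[:, 0].max(), 10)
for w, b in zip(clf.coef_, clf.intercept_):
    plt.plot(xs, -(w[0] * xs + b) / w[1], "k--")
plt.show()
```
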
Path with L1-penalized Logistic Regression

Computes the regularization path on the iris dataset.

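A sketch of how such a path can be computed: sweep C from the smallest value that yields any nonzero coefficient (via sklearn.svm.l1_min_c) and refit an L1-penalized LogisticRegression at each step. Restricting iris to two classes keeps it a single binary path; the grid of C values is illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.svm import l1_min_c

# Two-class subset of iris (drop class 2) for a single binary path.
iris = load_iris()
X, y = iris.data, iris.target
X, y = X[y != 2], y[y != 2]

# Smallest C with a nonzero coefficient, then sweep upward on a log grid.
cs = l1_min_c(X, y, loss="log") * np.logspace(0, 3, 16)

coefs = []
for c in cs:
    clf = LogisticRegression(C=c, penalty="l1", solver="liblinear", tol=1e-6)
    clf.fit(X, y)
    coefs.append(clf.coef_.ravel().copy())

# Each curve shows one coefficient entering the model as C relaxes the L1 penalty.
plt.plot(np.log10(cs), coefs, marker="o")
plt.xlabel("log(C)")
plt.ylabel("coefficients")
plt.title("Logistic Regression path")
plt.show()
```
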
HuberRegressor vs Ridge on dataset with strong outliers

Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions of ridge are strongly influenced by the outliers present in the dataset.

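A minimal sketch of the comparison: build a 1-D regression problem, inject a handful of strong outliers by hand, and fit both estimators. The outlier placement and noise levels are illustrative assumptions.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_regression
from sklearn.linear_model import HuberRegressor, Ridge

# 1-D regression data plus four strong outliers far above the trend.
rng = np.random.RandomState(0)
X, y = make_regression(n_samples=20, n_features=1, noise=4.0, random_state=0)
X_outliers = rng.normal(0, 0.5, size=(4, 1))
y_outliers = rng.normal(0, 2.0, size=4) + 60
X = np.vstack((X, X_outliers))
y = np.concatenate((y, y_outliers))

huber = HuberRegressor().fit(X, y)
ridge = Ridge(alpha=0.0).fit(X, y)

# Huber's robust loss down-weights the outliers; ridge is pulled toward them.
line_x = np.linspace(X.min(), X.max(), 7).reshape(-1, 1)
plt.scatter(X, y, edgecolor="k")
plt.plot(line_x, huber.predict(line_x), label="HuberRegressor")
plt.plot(line_x, ridge.predict(line_x), label="Ridge")
plt.legend()
plt.show()
```
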
Sparse recovery

Given a small number of observations, we want to recover which features of X are relevant to explain y.

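A simplified sketch of the setting, using a plain Lasso rather than the example's full pipeline: with fewer samples than features, the L1 penalty drives irrelevant coefficients exactly to zero, so the recovered support can be read off the fitted coefficients. All sizes and the alpha value are illustrative.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Underdetermined problem: n_samples < n_features, only 5 features matter.
rng = np.random.RandomState(42)
n_samples, n_features, n_relevant = 40, 100, 5
X = rng.randn(n_samples, n_features)
true_coef = np.zeros(n_features)
true_coef[:n_relevant] = rng.uniform(1, 3, n_relevant)
y = X @ true_coef + 0.1 * rng.randn(n_samples)

# L1 regularization zeroes out irrelevant coefficients.
lasso = Lasso(alpha=0.1).fit(X, y)
print("true support:     ", np.flatnonzero(true_coef))
print("recovered support:", np.flatnonzero(lasso.coef_))
```
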
Theil-Sen Regression

Computes a Theil-Sen regression on a synthetic dataset.

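A hedged sketch: fit TheilSenRegressor and ordinary least squares on a synthetic line whose tail is corrupted by outliers, illustrating the robustness of the median-of-slopes approach. The data-generation details are assumptions, not the gallery's exact dataset.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression, TheilSenRegressor

# Synthetic line y = 3x + 2 with Gaussian noise and a corrupted tail.
rng = np.random.RandomState(0)
X = rng.uniform(0, 10, 100).reshape(-1, 1)
y = 3 * X.ravel() + 2 + rng.normal(0, 0.5, 100)
y[-20:] += 30  # outliers

ols = LinearRegression().fit(X, y)
theilsen = TheilSenRegressor(random_state=0).fit(X, y)

# Theil-Sen stays close to the true line; OLS is dragged by the outliers.
line_x = np.linspace(0, 10, 10).reshape(-1, 1)
plt.scatter(X, y, s=10)
plt.plot(line_x, ols.predict(line_x), label="OLS")
plt.plot(line_x, theilsen.predict(line_x), label="Theil-Sen")
plt.legend()
plt.show()
```
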
Ordinary Least Squares and Ridge Regression Variance

Because there are only a few points in each dimension, and linear regression uses a straight line to follow these points as well as it can, noise in the observations causes large variance in the fitted line.

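A minimal sketch of the variance comparison: refit each estimator on several noisy copies of a tiny two-point training set and overlay the resulting prediction lines. The OLS lines scatter widely; the ridge lines bunch together. The point coordinates and noise scale are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression, Ridge

# Tiny training set; refitting on noisy copies exposes estimator variance.
X_train = np.array([[0.5], [1.0]])
y_train = np.array([0.5, 1.0])
X_test = np.array([[0.0], [2.0]])
rng = np.random.RandomState(0)

fig, axes = plt.subplots(1, 2, figsize=(8, 4), sharey=True)
for ax, model, name in zip(axes,
                           (LinearRegression(), Ridge(alpha=0.1)),
                           ("OLS", "Ridge")):
    for _ in range(10):
        noisy_X = X_train + rng.normal(0, 0.1, X_train.shape)
        model.fit(noisy_X, y_train)
        ax.plot(X_test, model.predict(X_test), color="gray", alpha=0.5)
        ax.scatter(noisy_X, y_train, s=10)
    ax.set_title(name)
plt.show()
```
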
Plot multinomial and One-vs-Rest Logistic Regression

Plot the decision surface of multinomial and One-vs-Rest Logistic Regression. The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers are represented by the dashed lines.

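A sketch of the comparison: fit LogisticRegression twice on three synthetic blobs, once per multi-class strategy, and draw each fit's decision surface and per-class hyperplanes. Note the multi_class parameter matches scikit-learn as of this snapshot's era; recent releases deprecate it. The blob parameters are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

# Three slightly overlapping blobs.
X, y = make_blobs(n_samples=300, centers=3, random_state=40, cluster_std=1.5)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, multi_class in zip(axes, ("multinomial", "ovr")):
    clf = LogisticRegression(solver="lbfgs", multi_class=multi_class).fit(X, y)

    # Shade the predicted class over a grid.
    xx, yy = np.meshgrid(
        np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 200),
        np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 200))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
    ax.contourf(xx, yy, Z, alpha=0.3)
    ax.scatter(X[:, 0], X[:, 1], c=y, s=10, edgecolor="k")

    # One dashed line per class: the hyperplane w . x + b = 0.
    xs = np.linspace(X[:, 0].min(), X[:, 0].max(), 10)
    for w, b in zip(clf.coef_, clf.intercept_):
        ax.plot(xs, -(w[0] * xs + b) / w[1], "k--")
    ax.set_title(multi_class)
plt.show()
```
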
L1 Penalty and Sparsity in Logistic Regression

Comparison of the sparsity (percentage of zero coefficients) of solutions when L1 and L2 penalties are used for different values of C. We can see that large values of C give more freedom to the model, while smaller values constrain it more.

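A minimal sketch of the measurement: binarize the digits dataset (small vs. large digits), fit L1- and L2-penalized LogisticRegression at several values of C, and report the fraction of exactly-zero coefficients. The choice of C values is illustrative.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Binary problem: digits 0-4 vs. 5-9, with standardized features.
X, y = load_digits(return_X_y=True)
X = StandardScaler().fit_transform(X)
y = (y > 4).astype(int)

for C in (100, 1, 0.01):
    clf_l1 = LogisticRegression(C=C, penalty="l1", solver="liblinear").fit(X, y)
    clf_l2 = LogisticRegression(C=C, penalty="l2", solver="liblinear").fit(X, y)
    # Sparsity = percentage of coefficients that are exactly zero.
    sparsity_l1 = np.mean(clf_l1.coef_ == 0) * 100
    sparsity_l2 = np.mean(clf_l2.coef_ == 0) * 100
    print(f"C={C}: L1 sparsity {sparsity_l1:.1f}%, L2 sparsity {sparsity_l2:.1f}%")
```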