SGD: Penalties

Plot the contours of the three penalties (L1, L2, and Elastic Net) supported by sklearn.linear_model.stochastic_gradient.

2017-01-15 04:25:28
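
A minimal sketch of what such a plot involves, using matplotlib directly; the grid range and the l1_ratio value of 0.5 are assumptions, not taken from the gallery script.

import numpy as np
import matplotlib.pyplot as plt

xs = np.linspace(-1.5, 1.5, 300)
X, Y = np.meshgrid(xs, xs)

l1 = np.abs(X) + np.abs(Y)            # L1 penalty
l2 = X ** 2 + Y ** 2                  # squared L2 penalty
rho = 0.5                             # assumed l1_ratio for the Elastic Net mix
enet = rho * l1 + (1 - rho) * l2      # Elastic Net blends L1 and L2

# Draw the contour where each penalty equals 1.
for Z in (l1, l2, enet):
    plt.contour(X, Y, Z, levels=[1.0])
plt.gca().set_aspect("equal")
plt.title("Unit contours of the three penalties")
plt.show()
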
HuberRegressor vs Ridge on dataset with strong outliers

Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions of ridge are strongly influenced by the outliers in the dataset, while the Huber regressor is much less affected because it applies a linear loss to them.

2017-01-15 04:22:50
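
A minimal sketch of the comparison on synthetic data of my own (the gallery example uses a different dataset and plots the fitted lines); the true slope of 2 and the injected outlier offset are assumptions.

import numpy as np
from sklearn.linear_model import HuberRegressor, Ridge

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = 2.0 * X.ravel() + rng.normal(scale=0.5, size=40)
y[:4] += 25.0                                   # inject a few strong outliers

ridge = Ridge(alpha=1.0).fit(X, y)
huber = HuberRegressor(epsilon=1.35).fit(X, y)  # epsilon controls outlier sensitivity

print("Ridge slope:", ridge.coef_[0])           # pulled toward the outliers
print("Huber slope:", huber.coef_[0])           # stays close to the true slope of 2
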
Theil-Sen Regression

Computes a Theil-Sen regression on a synthetic dataset. See the Theil-Sen estimator entry in the user guide for more information on the regressor.

2017-01-15 04:27:15
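
A minimal sketch on an assumed synthetic dataset, contrasting TheilSenRegressor with ordinary least squares; the corruption pattern below is my own choice, not the gallery's.

import numpy as np
from sklearn.linear_model import LinearRegression, TheilSenRegressor

rng = np.random.RandomState(42)
X = rng.normal(size=(100, 1))
y = 3.0 * X.ravel() + rng.normal(scale=0.3, size=100)
y[::10] += 15.0                                  # corrupt 10% of the targets

ols = LinearRegression().fit(X, y)
theil_sen = TheilSenRegressor(random_state=42).fit(X, y)

print("OLS slope:      ", ols.coef_[0])
print("Theil-Sen slope:", theil_sen.coef_[0])    # robust median-based estimate
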
Path with L1-Logistic Regression

Computes the regularization path of an L1-penalized logistic regression on the IRIS dataset.

2017-01-15 04:24:53
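
A minimal sketch of computing such a path by simply refitting an L1-penalized LogisticRegression over a grid of C values; the C range and the restriction to two classes are assumptions, and the original script may use a dedicated path helper instead.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X, y = X[y != 2], y[y != 2]                     # keep two classes for a single path

cs = np.logspace(-2, 2, 20)
coefs = []
for c in cs:
    clf = LogisticRegression(penalty="l1", C=c, solver="liblinear")
    coefs.append(clf.fit(X, y).coef_.ravel())

coefs = np.array(coefs)                         # shape: (n_Cs, n_features)
print(coefs)                                    # coefficients enter one by one as C grows
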
SGD: Maximum margin separating hyperplane

Plot the maximum margin separating hyperplane within a two-class separable dataset, using a linear Support Vector Machine classifier trained with SGD.

2017-01-15 04:25:28
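
A minimal sketch of the setup: a hinge-loss SGDClassifier fit on an assumed separable blob dataset, from which the separating line can be read off coef_ and intercept_.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.linear_model import SGDClassifier

X, y = make_blobs(n_samples=50, centers=2, cluster_std=0.6, random_state=0)

clf = SGDClassifier(loss="hinge", alpha=0.01, max_iter=1000, random_state=0)
clf.fit(X, y)

w, b = clf.coef_[0], clf.intercept_[0]
print("decision boundary: %.2f*x1 + %.2f*x2 + %.2f = 0" % (w[0], w[1], b))
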
Ordinary Least Squares and Ridge Regression Variance

Due to the few points in each dimension and the straight line that linear regression uses to follow these points as well as it can, noise in the observations causes large variance in the estimated coefficients: the slope of the fitted line can vary considerably from one noisy draw to the next. Ridge regression reduces this variance by shrinking the coefficients, at the cost of a small bias.

2017-01-15 04:24:50
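
A minimal sketch of the variance effect: refit both estimators on many noisy draws of a tiny two-point design (the design points and noise scale are assumptions) and compare the spread of the slopes.

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.RandomState(0)
X_train = np.array([[0.5], [1.0]])              # very few points per dimension

slopes = {"ols": [], "ridge": []}
for _ in range(200):
    y_train = 0.5 * X_train.ravel() + rng.normal(scale=0.1, size=2)
    slopes["ols"].append(LinearRegression().fit(X_train, y_train).coef_[0])
    slopes["ridge"].append(Ridge(alpha=0.1).fit(X_train, y_train).coef_[0])

for name, s in slopes.items():
    print(name, "slope std: %.3f" % np.std(s))  # ridge shows the smaller spread
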
Plot multi-class SGD on the iris dataset

Plot the decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are represented by the dashed lines.

2017-01-15 04:24:58
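
A minimal sketch, assuming the first two iris features as in a typical 2-D decision-surface plot; with the default one-versus-all scheme, coef_ holds one hyperplane per class.

from sklearn.datasets import load_iris
from sklearn.linear_model import SGDClassifier

X, y = load_iris(return_X_y=True)
X = X[:, :2]                                    # sepal length and sepal width only

clf = SGDClassifier(alpha=0.001, max_iter=1000, random_state=0).fit(X, y)

print(clf.coef_.shape)                          # (3, 2): one OVA hyperplane per class
print(clf.intercept_)                           # one intercept per class
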
Plot multinomial and One-vs-Rest Logistic Regression

Plot the decision surfaces of multinomial and One-vs-Rest Logistic Regression. The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers are represented by the dashed lines.

2017-01-15 04:24:58
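
A minimal sketch comparing the two formulations on an assumed blob dataset, using the multi_class argument of LogisticRegression (present in scikit-learn of this era; later versions deprecate it).

from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

X, y = make_blobs(n_samples=300, centers=3, random_state=42)

for multi_class in ("multinomial", "ovr"):
    # "multinomial" fits a single joint model; "ovr" fits three independent
    # one-vs-rest classifiers, one per class.
    clf = LogisticRegression(solver="lbfgs", multi_class=multi_class, max_iter=200)
    clf.fit(X, y)
    print(multi_class, "training accuracy: %.3f" % clf.score(X, y))
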
Sparse recovery

Given a small number of observations, we want to recover which features of X are relevant to explain y. For this, sparse linear models can outperform standard statistical tests when the true model is sparse, i.e. when only a small fraction of the features are relevant.

2017-01-15 04:27:00
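
A minimal sketch of the recovery setting with assumed sizes (30 observations, 100 features, 5 relevant); LassoCV is used here as one possible sparse model, which is not necessarily the method of the original example.

import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.RandomState(0)
n_samples, n_features, n_relevant = 30, 100, 5

X = rng.normal(size=(n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:n_relevant] = rng.uniform(1, 3, size=n_relevant)
y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)

lasso = LassoCV(cv=5).fit(X, y)
recovered = np.flatnonzero(lasso.coef_)          # indices of nonzero coefficients
print("truly relevant:", np.arange(n_relevant))
print("recovered:     ", recovered)
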
Logistic Regression 3-class Classifier

Shown below are the decision boundaries of a logistic regression classifier on the first two dimensions (sepal length and width) of the iris dataset.

2017-01-15 04:23:52
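
A minimal sketch of how such a decision-boundary figure is typically built: fit the classifier on the first two iris features and predict over a mesh grid; the C value and the grid step are assumptions.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X = X[:, :2]                                    # sepal length and sepal width

clf = LogisticRegression(C=1e5).fit(X, y)       # large C: very light regularization

# Predict over a mesh covering the feature range; reshaping Z gives the class
# label at every grid point, ready for a filled-contour plot.
xx, yy = np.meshgrid(np.arange(X[:, 0].min() - 1, X[:, 0].max() + 1, 0.02),
                     np.arange(X[:, 1].min() - 1, X[:, 1].max() + 1, 0.02))
Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
print(Z.shape)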