Plot the contours of the three penalties (L2, L1, and Elastic-Net). All of the above are supported by sklearn.linear_model.SGDClassifier and SGDRegressor via the penalty parameter.
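A minimal sketch of what the penalty choice means in practice, assuming synthetic data from make_classification and an arbitrary regularization strength (the example itself only plots the penalty contours):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

# Synthetic data (assumed; not part of the original example).
X, y = make_classification(n_samples=200, n_features=20, random_state=0)

models = {}
for penalty in ("l2", "l1", "elasticnet"):
    # alpha=0.01 is an arbitrary strength chosen for illustration
    clf = SGDClassifier(penalty=penalty, alpha=0.01, random_state=0)
    models[penalty] = clf.fit(X, y)

# L1 and Elastic-Net can drive coefficients exactly to zero; L2 only shrinks.
zeros = {p: int((m.coef_ == 0).sum()) for p, m in models.items()}
```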
Plot the maximum-margin separating hyperplane on a two-class separable dataset, using a linear Support Vector Machine classifier trained with SGD.
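A sketch of the fit behind that plot, assuming two well-separated blobs stand in for the separable dataset; the hinge loss makes SGDClassifier behave like a linear SVM:

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import SGDClassifier

# Assumed data: two well-separated clusters.
X, y = make_blobs(n_samples=50, centers=2, cluster_std=0.6, random_state=0)

# hinge loss + L2 penalty approximates a linear SVM trained by SGD
clf = SGDClassifier(loss="hinge", alpha=0.01, max_iter=200, random_state=0)
clf.fit(X, y)

# The separating hyperplane is w . x + b = 0
w, b = clf.coef_[0], clf.intercept_[0]
accuracy = clf.score(X, y)
```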
Plot the decision surface of multinomial and one-vs-rest Logistic Regression. The hyperplanes corresponding to the three one-vs-rest (OvR) classifiers are shown as dashed lines.
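The two strategies can be sketched as follows, assuming a three-blob toy dataset; an explicit OneVsRestClassifier wrapper is used here for the OvR fit, since a multi-class LogisticRegression fits a multinomial (softmax) model by default:

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Assumed data: three Gaussian blobs in 2-D.
X, y = make_blobs(n_samples=300, centers=3, random_state=42)

# Multinomial (softmax) fit vs. an explicit one-vs-rest reduction.
multinomial = LogisticRegression().fit(X, y)
ovr = OneVsRestClassifier(LogisticRegression()).fit(X, y)

# Both expose one hyperplane per class for a 3-class problem.
n_hyperplanes = multinomial.coef_.shape[0]
```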
Because there are few points in each dimension, and linear regression fits a straight line that follows these points as closely as it can, noise in the observations induces large variance in the fit.
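That variance can be made concrete with a small resampling sketch (assumed data: three points on a unit-slope line plus Gaussian noise; not the example's own setup):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = np.array([[0.0], [1.0], [2.0]])  # only three points per fit

# Refit the line on 100 noisy resamples and watch the slope vary.
slopes = []
for _ in range(100):
    y = X.ravel() + rng.normal(scale=0.5, size=3)  # true slope is 1
    slopes.append(LinearRegression().fit(X, y).coef_[0])

slope_std = float(np.std(slopes))  # spread of the estimated slope
```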
Computes a Theil-Sen regression on a synthetic dataset.
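A minimal sketch of the robustness Theil-Sen provides, assuming a synthetic line with a fraction of grossly corrupted targets (data and constants are illustrative, not the example's own):

```python
import numpy as np
from sklearn.linear_model import TheilSenRegressor

# Assumed data: y = 2x + 1 with small noise, then 10% gross outliers.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2.0 * X.ravel() + 1.0 + rng.normal(scale=0.3, size=100)
y[:10] += 30.0  # corrupt 10% of the targets

# Theil-Sen's median-of-slopes estimate resists the corrupted points.
theilsen = TheilSenRegressor(random_state=0).fit(X, y)
ts_slope = theilsen.coef_[0]
```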
Given a small number of observations, we want to recover which features of X are relevant to explain y. For this task, sparse linear models such as the Lasso are well suited, since the L1 penalty selects a few relevant features by zeroing the rest.
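A sketch of that recovery, assuming a synthetic design where only 5 of 50 features carry signal (the dimensions, alpha, and data here are illustrative assumptions):

```python
import numpy as np
from sklearn.linear_model import Lasso

# Assumed setup: y depends on only the first 5 of 50 features.
rng = np.random.default_rng(0)
n_samples, n_features, n_relevant = 60, 50, 5
X = rng.normal(size=(n_samples, n_features))
true_coef = np.zeros(n_features)
true_coef[:n_relevant] = rng.uniform(1.0, 3.0, size=n_relevant)
y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)

# The L1 penalty zeroes most irrelevant coefficients.
lasso = Lasso(alpha=0.1).fit(X, y)
recovered = np.flatnonzero(lasso.coef_)  # indices of nonzero coefficients
```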
Plot the decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are shown as dashed lines.
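The one-versus-all structure is visible directly in the fitted model: SGDClassifier trains one binary classifier per class, so coef_ holds one hyperplane per class (hyperparameters below are illustrative):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import SGDClassifier

X, y = load_iris(return_X_y=True)

# Multi-class SGD fits one binary (one-versus-all) classifier per class.
clf = SGDClassifier(alpha=0.001, max_iter=100, random_state=0).fit(X, y)

# One row of coef_ per class, one column per feature.
n_classes, n_features = clf.coef_.shape
```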
Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the ridge predictions are strongly influenced by the outliers, while the Huber regressor is much less affected.
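A sketch of that contrast, assuming a clean line of slope 3 with a few extreme outliers appended (the data and thresholds are illustrative, not the example's own):

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, Ridge

# Assumed data: y = 3x plus small noise, then four gross outliers.
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50).reshape(-1, 1)
y = 3.0 * X.ravel() + rng.normal(scale=0.5, size=50)
y[-4:] = -50.0  # four gross outliers at the high end of X

# Huber's loss grows only linearly for large residuals, capping their pull;
# ridge's squared loss lets the outliers drag the whole line down.
huber = HuberRegressor().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

huber_slope, ridge_slope = huber.coef_[0], ridge.coef_[0]
```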
Computes the regularization path of L1-penalized Logistic Regression on the IRIS dataset. (Author: Alexandre Gramfort <alexandre.gramfort@inria.fr>.)
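The path idea can be sketched by refitting over a grid of C values and recording the coefficients (the grid, solver, and binary sub-problem below are illustrative assumptions, not the example's exact setup):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X, y = X[y != 2], y[y != 2]  # reduce to a binary problem for illustration

# Refit an L1-penalized logistic regression along a grid of C values.
cs = np.logspace(-2, 2, 10)
path = []
for c in cs:
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=c)
    path.append(clf.fit(X, y).coef_.ravel().copy())
path = np.array(path)

# Small C (strong penalty) zeroes at least as many coefficients as large C.
n_zero_small = int((path[0] == 0).sum())
n_zero_large = int((path[-1] == 0).sum())
```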
Comparison of the sparsity (percentage of zero coefficients) of solutions when the L1 and L2 penalties are used for different values of C. We can see that large values of C give the model more freedom, while smaller values of C constrain it more; with the L1 penalty, stronger constraint means sparser solutions.
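A sketch of the comparison at a single C, assuming the digits dataset and the liblinear solver (both illustrative choices):

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)

# Same C, different penalty: only L1 produces exact zeros at scale.
l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
l2 = LogisticRegression(penalty="l2", solver="liblinear", C=0.1).fit(X, y)

# Sparsity = percentage of zero coefficients.
sparsity_l1 = float(np.mean(l1.coef_ == 0) * 100)
sparsity_l2 = float(np.mean(l2.coef_ == 0) * 100)
```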