Estimates Lasso and Elastic-Net regression models on a manually generated sparse signal corrupted with additive noise. Estimated coefficients are compared with the ground truth.
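A minimal sketch of this setup (not the full gallery example), assuming a synthetic sparse signal and illustrative alpha values:

    import numpy as np
    from sklearn.linear_model import Lasso, ElasticNet

    rng = np.random.RandomState(42)
    n_samples, n_features = 50, 200

    # Sparse ground-truth coefficients: only a few entries are non-zero.
    coef = np.zeros(n_features)
    coef[rng.choice(n_features, 10, replace=False)] = rng.randn(10)

    X = rng.randn(n_samples, n_features)
    y = X @ coef + 0.01 * rng.randn(n_samples)  # additive noise

    lasso = Lasso(alpha=0.1).fit(X, y)
    enet = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)

    print("Lasso non-zeros:", np.sum(lasso.coef_ != 0))
    print("ElasticNet non-zeros:", np.sum(enet.coef_ != 0))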
Plot the maximum margin separating hyperplane within a two-class separable dataset using a linear Support Vector Machines classifier trained using SGD.
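A minimal sketch of the same idea, assuming a separable two-blob dataset generated with make_blobs and hinge loss (a linear SVM trained by SGD); the plotting of the margin is omitted:

    from sklearn.datasets import make_blobs
    from sklearn.linear_model import SGDClassifier

    X, y = make_blobs(n_samples=50, centers=2, random_state=0, cluster_std=0.6)

    clf = SGDClassifier(loss="hinge", alpha=0.01, max_iter=200, random_state=0)
    clf.fit(X, y)

    # Separating hyperplane: w . x + b = 0
    w, b = clf.coef_[0], clf.intercept_[0]
    print("weights:", w, "intercept:", b)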
Computes the regularization path of an L1-penalized Logistic Regression on the IRIS dataset.
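A minimal sketch of computing such a path, assuming a binary subset of the iris data and an illustrative grid of C values rather than the grid used in the gallery example:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    X, y = X[y != 2], y[y != 2]  # keep two classes for a simple binary path

    cs = np.logspace(-2, 2, 10)
    coefs = []
    for c in cs:
        clf = LogisticRegression(penalty="l1", solver="liblinear", C=c)
        clf.fit(X, y)
        coefs.append(clf.coef_.ravel().copy())

    coefs = np.array(coefs)  # one row of coefficients per value of C
    print(coefs.shape)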
Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions in ridge are strongly influenced by the outliers present in the dataset.
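A minimal sketch of the comparison, assuming synthetic one-dimensional data with a few injected outliers and default Huber settings:

    import numpy as np
    from sklearn.linear_model import HuberRegressor, Ridge

    rng = np.random.RandomState(0)
    X = rng.uniform(0, 10, size=(100, 1))
    y = 3.0 * X.ravel() + rng.normal(scale=0.5, size=100)

    # Inject a few strong outliers.
    y[:5] += 50

    huber = HuberRegressor().fit(X, y)
    ridge = Ridge(alpha=1.0).fit(X, y)
    print("Huber slope:", huber.coef_[0], "Ridge slope:", ridge.coef_[0])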
Plot decision surface of multinomial and One-vs-Rest Logistic Regression. The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers are represented by the dashed lines.
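A minimal sketch of fitting both variants, assuming a three-class blob dataset; the decision-surface plotting from the gallery example is omitted:

    from sklearn.datasets import make_blobs
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier

    X, y = make_blobs(n_samples=300, centers=3, random_state=42)

    multi = LogisticRegression().fit(X, y)                     # multinomial formulation
    ovr = OneVsRestClassifier(LogisticRegression()).fit(X, y)  # explicit one-vs-rest

    print("multinomial accuracy:", multi.score(X, y))
    print("one-vs-rest accuracy:", ovr.score(X, y))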
Computes a Theil-Sen Regression on a synthetic dataset. See the Theil-Sen estimator documentation for more information on the regressor.
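A minimal sketch, assuming synthetic data with corrupted targets, contrasting the robust Theil-Sen fit with ordinary least squares:

    import numpy as np
    from sklearn.linear_model import TheilSenRegressor, LinearRegression

    rng = np.random.RandomState(0)
    X = rng.uniform(-5, 5, size=(200, 1))
    y = 2.0 * X.ravel() + 1.0 + rng.normal(scale=0.3, size=200)
    y[:20] += 30  # corrupt a fraction of the targets

    ts = TheilSenRegressor(random_state=0).fit(X, y)
    ols = LinearRegression().fit(X, y)
    print("Theil-Sen slope:", ts.coef_[0], "OLS slope:", ols.coef_[0])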
Due to the few points in each dimension and the straight line that linear regression uses to follow these points as well as it can, noise on the observations will cause great variance, as shown in the first plot.
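A minimal sketch of that variance effect, assuming repeated draws of two noisy training points and an illustrative ridge penalty: the OLS slope varies much more across draws than the ridge slope.

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge

    rng = np.random.RandomState(0)
    X_train = np.array([[0.0], [1.0]])

    ols_slopes, ridge_slopes = [], []
    for _ in range(50):
        y_train = X_train.ravel() + rng.normal(scale=0.2, size=2)  # noisy targets
        ols_slopes.append(LinearRegression().fit(X_train, y_train).coef_[0])
        ridge_slopes.append(Ridge(alpha=0.5).fit(X_train, y_train).coef_[0])

    print("OLS slope std:  ", np.std(ols_slopes))
    print("Ridge slope std:", np.std(ridge_slopes))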
Plot decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are represented by the dashed lines.
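A minimal sketch of the fit, assuming only the first two iris features so a 2D decision surface could be drawn; the plotting itself is omitted:

    from sklearn.datasets import load_iris
    from sklearn.linear_model import SGDClassifier

    X, y = load_iris(return_X_y=True)
    X = X[:, :2]  # keep two features for a 2D decision surface

    clf = SGDClassifier(alpha=0.001, max_iter=100, random_state=0).fit(X, y)
    print("coef_ shape:", clf.coef_.shape)  # one (w, b) pair per class
    print("training accuracy:", clf.score(X, y))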
Given a small number of observations, we want to recover which features of X are relevant to explain y. For this, sparse linear models can outperform standard statistical tests when the true model is sparse, i.e. when only a small fraction of the features are relevant.
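A minimal sketch of the recovery task (not the gallery example itself), assuming a synthetic design with a known sparse support and a cross-validated Lasso as the sparse linear model:

    import numpy as np
    from sklearn.linear_model import LassoCV

    rng = np.random.RandomState(0)
    n_samples, n_features, n_relevant = 40, 100, 5

    true_support = rng.choice(n_features, n_relevant, replace=False)
    coef = np.zeros(n_features)
    coef[true_support] = rng.uniform(1, 3, size=n_relevant)

    X = rng.randn(n_samples, n_features)
    y = X @ coef + 0.1 * rng.randn(n_samples)

    lasso = LassoCV(cv=5).fit(X, y)
    recovered = np.flatnonzero(lasso.coef_)  # non-zero coefficients = selected features
    print("true support:     ", np.sort(true_support))
    print("recovered support:", recovered)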
Comparison of the sparsity (percentage of zero coefficients) of solutions when L1 and L2 penalties are used for different values of C. We can see that large values of C give more freedom to the model, while smaller values of C constrain it more and, with the L1 penalty, yield sparser coefficients.
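A minimal sketch of the comparison, assuming the digits dataset binarized into two classes and an illustrative set of C values:

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    X, y = load_digits(return_X_y=True)
    X = StandardScaler().fit_transform(X)
    y = (y > 4).astype(int)  # binarize into a two-class problem

    for C in (0.01, 0.1, 1.0):
        for penalty in ("l1", "l2"):
            clf = LogisticRegression(C=C, penalty=penalty, solver="liblinear")
            clf.fit(X, y)
            sparsity = np.mean(clf.coef_ == 0) * 100
            print(f"C={C}, {penalty}: {sparsity:.1f}% zero coefficients")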