Lasso and Elastic Net for Sparse Signals
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Estimates Lasso and Elastic-Net regression models on a manually generated sparse signal corrupted with additive noise. Estimated coefficients are compared with the ground-truth.
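A minimal sketch of this setup; the dimensions, alpha values, and noise level are illustrative choices, not taken from the example:

```python
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

# Sparse ground-truth signal: only the first 10 of 100 features matter.
rng = np.random.RandomState(42)
n_samples, n_features = 50, 100
X = rng.randn(n_samples, n_features)
true_coef = np.zeros(n_features)
true_coef[:10] = rng.randn(10)
y = X @ true_coef + 0.01 * rng.randn(n_samples)  # additive noise

lasso = Lasso(alpha=0.1).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)
print("Lasso non-zero coefficients:", np.sum(lasso.coef_ != 0))
print("ElasticNet non-zero coefficients:", np.sum(enet.coef_ != 0))
```

Both penalties drive most coefficients exactly to zero, which is what makes the comparison against the sparse ground truth meaningful.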

2025-01-10 15:47:30
SGD: Maximum margin separating hyperplane

Plot the maximum margin separating hyperplane within a two-class separable dataset using a linear Support Vector Machine classifier trained using SGD.
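A minimal sketch of the training step, assuming an illustrative two-blob dataset and hyperparameters (not those of the example):

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import SGDClassifier

# Two well-separated blobs stand in for the separable dataset.
X, y = make_blobs(n_samples=50, centers=2, cluster_std=0.6, random_state=0)

# Hinge loss makes SGDClassifier a linear SVM trained with SGD.
clf = SGDClassifier(loss="hinge", alpha=0.01, max_iter=200, random_state=0)
clf.fit(X, y)

# The separating hyperplane is w . x + b = 0.
w, b = clf.coef_[0], clf.intercept_[0]
print("training accuracy:", clf.score(X, y))
```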

Path with L1-Logistic Regression

Computes the regularization path on the IRIS dataset.
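A minimal sketch of computing such a path, restricted to a binary subproblem of iris; the C grid is an illustrative choice:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Binary subproblem: keep only classes 0 and 1.
X, y = load_iris(return_X_y=True)
X, y = X[y != 2], y[y != 2]

# Refit an L1-penalized model along a grid of C values and stack the
# coefficient vectors into a path.
coef_path = []
for C in np.logspace(-2, 2, 5):
    clf = LogisticRegression(penalty="l1", C=C, solver="liblinear")
    clf.fit(X, y)
    coef_path.append(clf.coef_.ravel().copy())
coef_path = np.array(coef_path)
print(coef_path)
```

Stronger regularization (small C) zeroes out coefficients; they re-enter the model as C grows.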

HuberRegressor vs Ridge on dataset with strong outliers

Fit Ridge and HuberRegressor on a dataset with outliers. The example shows that the predictions of ridge are strongly influenced by the outliers present in the dataset.
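A minimal sketch of the comparison; the true slope of 3, the outlier magnitude, and alpha are illustrative, not taken from the example:

```python
import numpy as np
from sklearn.linear_model import HuberRegressor, Ridge

# Linear data (slope 3) with a handful of gross outliers injected.
rng = np.random.RandomState(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = 3 * X.ravel() + rng.normal(0, 0.1, 40)
y[:4] += 30  # four strong outliers

huber = HuberRegressor().fit(X, y)  # default epsilon=1.35
ridge = Ridge(alpha=1.0).fit(X, y)
print("Huber slope:", huber.coef_[0])
print("Ridge slope:", ridge.coef_[0])
```

The Huber loss downweights large residuals, so its slope stays close to the true value while the squared loss used by ridge is pulled toward the outliers.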

Plot multinomial and One-vs-Rest Logistic Regression

Plot decision surface of multinomial and One-vs-Rest Logistic Regression. The hyperplanes corresponding to the three One-vs-Rest (OVR) classifiers are represented by the dashed lines.
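A minimal sketch of fitting both variants on a three-class problem, assuming an illustrative blob dataset; the OVR side is expressed with `OneVsRestClassifier` here:

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier

# Three Gaussian blobs stand in for the three-class dataset.
X, y = make_blobs(n_samples=300, centers=3, random_state=42)

# Multinomial (softmax) model vs. three independent binary classifiers.
multinomial = LogisticRegression(max_iter=1000).fit(X, y)
ovr = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, y)
print("multinomial accuracy:", multinomial.score(X, y))
print("one-vs-rest accuracy:", ovr.score(X, y))
```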

Theil-Sen Regression

Computes a Theil-Sen Regression on a synthetic dataset.
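A minimal sketch on synthetic data with corrupted targets; the slope, intercept, and corruption fraction are illustrative:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, TheilSenRegressor

# y = 2x + 1 with a fifth of the targets replaced by garbage.
rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(100, 1))
y = 2 * X.ravel() + 1 + rng.normal(0, 0.5, 100)
y[:20] = 30 + rng.normal(0, 0.5, 20)  # corrupted targets

theil_sen = TheilSenRegressor(random_state=0).fit(X, y)
ols = LinearRegression().fit(X, y)
print("Theil-Sen slope:", theil_sen.coef_[0])
print("OLS slope:", ols.coef_[0])
```

Theil-Sen's median-based estimate tolerates this corruption level, while ordinary least squares is pulled off the true slope.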

Ordinary Least Squares and Ridge Regression Variance

Because each fit uses only a few points in each dimension, and linear regression follows those points with a straight line as well as it can, noise on the observations causes great variance in the fitted line.
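This variance effect can be sketched by refitting both models on many noisy two-point training sets and comparing the spread of the estimated slopes; the noise level and alpha are illustrative choices:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.RandomState(0)
X_train = np.array([[0.0], [1.0]])  # only two points per fit
ols_slopes, ridge_slopes = [], []
for _ in range(200):
    y = X_train.ravel() + rng.normal(0, 0.3, 2)  # true slope 1, noisy targets
    ols_slopes.append(LinearRegression().fit(X_train, y).coef_[0])
    ridge_slopes.append(Ridge(alpha=0.5).fit(X_train, y).coef_[0])
print("OLS slope std:", np.std(ols_slopes))
print("Ridge slope std:", np.std(ridge_slopes))
```

Ridge's shrinkage trades a little bias for a visibly smaller variance of the estimated slope.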

Plot multi-class SGD on the iris dataset

Plot decision surface of multi-class SGD on the iris dataset. The hyperplanes corresponding to the three one-versus-all (OVA) classifiers are represented by the dashed lines.
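A minimal sketch of the one-versus-all structure; the hyperparameters are illustrative, not those of the example:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import SGDClassifier

X, y = load_iris(return_X_y=True)
clf = SGDClassifier(alpha=0.001, max_iter=1000, random_state=0).fit(X, y)

# One-versus-all: one hyperplane (one row of coef_) per class.
print("coef_ shape:", clf.coef_.shape)  # (3, 4): 3 classes, 4 features
```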

Sparse recovery

Given a small number of observations, we want to recover which features of X are relevant to explain y. For this, sparse linear models can outperform standard statistical tests when the true model is sparse, i.e. when only a small fraction of the features are relevant.
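A minimal sketch of support recovery using cross-validated Lasso-LARS; the problem sizes, true support, and noise level are illustrative:

```python
import numpy as np
from sklearn.linear_model import LassoLarsCV

# 60 observations, 100 features, 3 truly relevant ones.
rng = np.random.RandomState(0)
n_samples, n_features = 60, 100
X = rng.randn(n_samples, n_features)
true_support = [3, 17, 42]
coef = np.zeros(n_features)
coef[true_support] = [2.0, -3.0, 1.5]
y = X @ coef + 0.05 * rng.randn(n_samples)

# Cross-validation picks the regularization strength.
model = LassoLarsCV(cv=5).fit(X, y)
recovered = np.flatnonzero(model.coef_)
print("recovered features:", recovered)
```

At this signal-to-noise ratio the truly relevant features end up in the recovered support, possibly alongside a few spurious ones.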

L1 Penalty and Sparsity in Logistic Regression

Comparison of the sparsity (percentage of zero coefficients) of solutions when L1 and L2 penalties are used for different values of C. We can see that large values of C give more freedom to the model, while smaller values of C constrain it more.
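A minimal sketch of measuring L1 sparsity on a binarized digits task; the C grid is an illustrative choice:

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Binarize digits into small (0-4) vs. large (5-9).
X, y = load_digits(return_X_y=True)
X = StandardScaler().fit_transform(X)
y = (y > 4).astype(int)

sparsity = {}
for C in (0.01, 1.0):
    clf = LogisticRegression(penalty="l1", C=C, solver="liblinear").fit(X, y)
    sparsity[C] = np.mean(clf.coef_ == 0) * 100
    print(f"C={C}: {sparsity[C]:.1f}% zero coefficients")
```

The smaller C (stronger penalty) yields a much higher percentage of exactly-zero coefficients.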
