Sparsity Example

Features 1 and 2 of the diabetes dataset are fitted and plotted below. The plot illustrates that although feature 2 has a strong coefficient on the
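
A minimal sketch of the assumed setup: fit an ordinary least-squares model on features 1 and 2 of the diabetes dataset and inspect the per-feature coefficients. The feature indices follow the description above; everything else is an illustrative choice.

```python
from sklearn import datasets, linear_model

# Load the diabetes dataset and keep only features 1 and 2.
X, y = datasets.load_diabetes(return_X_y=True)
X = X[:, 1:3]

# Fit ordinary least squares; coef_ holds one coefficient per feature.
model = linear_model.LinearRegression().fit(X, y)
print(model.coef_)
```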

Polynomial interpolation

This example demonstrates how to approximate a function with a polynomial of degree n_degree using ridge regression. Concretely, from n_samples 1D points, it suffices
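
As a rough sketch of the idea, one can expand the 1D points into polynomial features and fit a ridge model on them. The target function (cosine), degree, and sample count below are illustrative assumptions, not taken from the example text.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
x = np.sort(rng.uniform(0, 10, 30))  # n_samples 1D points
y = np.cos(x)                        # function to approximate

# Polynomial feature expansion followed by ridge regression.
model = make_pipeline(PolynomialFeatures(degree=3), Ridge(alpha=1e-3))
model.fit(x[:, np.newaxis], y)
y_pred = model.predict(x[:, np.newaxis])
```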

SGD: Weighted samples

Plot the decision function of a weighted dataset, where the size of each point is proportional to its weight.
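
A minimal sketch of the weighting mechanism, assuming an SGD classifier on synthetic data: per-sample weights are passed via the `sample_weight` argument of `fit`. The data and weight values are made up for illustration.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)
X = rng.randn(20, 2)
y = (X[:, 0] > 0).astype(int)
weights = rng.uniform(0.5, 5.0, size=20)  # per-sample weights

# Heavily weighted samples pull the decision function toward themselves.
clf = SGDClassifier(alpha=0.01, max_iter=100)
clf.fit(X, y, sample_weight=weights)
preds = clf.predict(X)
```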

Lasso model selection

Use the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and cross-validation to select an optimal value of the regularization
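
The three selection strategies can be sketched as follows, assuming the diabetes dataset as input; scikit-learn exposes AIC/BIC selection through `LassoLarsIC` and cross-validated selection through `LassoCV`. The `cv=5` choice is illustrative.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LassoCV, LassoLarsIC

X, y = load_diabetes(return_X_y=True)

# Information-criterion-based selection of alpha.
alpha_aic = LassoLarsIC(criterion="aic").fit(X, y).alpha_
alpha_bic = LassoLarsIC(criterion="bic").fit(X, y).alpha_

# Cross-validated selection of alpha.
alpha_cv = LassoCV(cv=5).fit(X, y).alpha_
```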

Logistic Regression 3-class Classifier

Shown below are a logistic-regression classifier's decision boundaries on the
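
A minimal sketch, assuming the iris dataset restricted to its first two features so the boundaries can be drawn in 2D (the feature choice is an assumption on my part):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X = X[:, :2]  # two features -> 2D decision-boundary plot

# One boundary separates each pair of the three classes.
clf = LogisticRegression(max_iter=200).fit(X, y)
```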

Linear Regression Example

This example uses only the first feature of the diabetes dataset in order to illustrate a two-dimensional plot of this regression technique. The
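
The single-feature setup can be sketched as below; the train/test split size is an illustrative assumption.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

X, y = load_diabetes(return_X_y=True)
X = X[:, [0]]  # keep a single feature for a 2D plot

# Hold out the last 20 samples for testing.
X_train, X_test = X[:-20], X[-20:]
y_train, y_test = y[:-20], y[-20:]

reg = LinearRegression().fit(X_train, y_train)
mse = mean_squared_error(y_test, reg.predict(X_test))
```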

Robust linear estimator fitting

Here a sine function is fit with a polynomial of order 3, for values close to zero. Robust fitting is demonstrated in different situations: No
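
A sketch of one such situation, assuming RANSAC as the robust estimator and deliberately corrupted targets as the outliers (the noise levels and outlier fraction are made up for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
x = rng.uniform(-1.5, 1.5, 100)
y = np.sin(x) + 0.1 * rng.randn(100)
y[:10] += 3  # corrupt some targets to simulate outliers

# Ordinary least squares vs. a robust estimator, both with a
# degree-3 polynomial expansion.
ols = make_pipeline(PolynomialFeatures(3), LinearRegression())
ols.fit(x[:, None], y)
ransac = make_pipeline(PolynomialFeatures(3), RANSACRegressor())
ransac.fit(x[:, None], y)
```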

Lasso on dense and sparse data

We show that linear_model.Lasso provides the same results for dense and sparse data and that in the case of sparse data the speed is improved.
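The equivalence can be sketched as below, fitting the same random data once as a dense array and once as a SciPy CSC matrix; the data shapes and `alpha` are illustrative.

```python
import numpy as np
from scipy import sparse
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(50, 20)
y = rng.randn(50)

# Same estimator, dense vs. sparse input.
dense = Lasso(alpha=0.1).fit(X, y)
sp = Lasso(alpha=0.1).fit(sparse.csc_matrix(X), y)
```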

Lasso and Elastic Net

Lasso and elastic net (L1 and L1/L2 penalisation) implemented using coordinate descent. The coefficients can be forced to be positive.
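
The positivity constraint can be sketched as below, assuming the diabetes dataset; the `alpha` and `l1_ratio` values are illustrative choices.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet, Lasso

X, y = load_diabetes(return_X_y=True)

# positive=True constrains all coefficients to be non-negative.
lasso = Lasso(alpha=0.1, positive=True).fit(X, y)
enet = ElasticNet(alpha=0.1, l1_ratio=0.7, positive=True).fit(X, y)
```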

Logistic function

Shown in the plot is how logistic regression would classify values in this synthetic dataset as either 0 or 1, i.e. class one or two, using the logistic curve.
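
A minimal sketch of a synthetic setup of this kind (the data-generation details are assumptions): a 1D variable with two separated groups, where `predict_proba` traces out the sigmoid-shaped logistic curve.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
x = rng.normal(size=100)
y = (x > 0).astype(int)
x[x > 0] += 2  # shift the positive class to separate the groups

clf = LogisticRegression().fit(x[:, None], y)

# Probability of class 1 far left vs. far right of the boundary.
probs = clf.predict_proba([[-3.0], [3.0]])[:, 1]
```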
