Features 1 and 2 of the diabetes dataset are fitted and plotted below. It illustrates that although feature 2 has a strong coefficient on the full model, it does not give us much regarding y when compared to just feature 1.
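A minimal sketch of the idea, not the gallery example's exact code (the 0-based column indices are an assumption): fit ordinary least squares on each feature alone and compare how much each explains y.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression

X, y = load_diabetes(return_X_y=True)

# Fit each candidate feature on its own and report its R^2 score.
# The column indices are assumed for illustration.
for col in (1, 2):
    model = LinearRegression().fit(X[:, [col]], y)
    print(f"feature {col}: R^2 = {model.score(X[:, [col]], y):.3f}")
```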
This example demonstrates how to approximate a function with a polynomial of degree n_degree by using ridge regression. Concretely, from n_samples 1d points, it suffices to build the Vandermonde matrix, which is n_samples x (n_degree + 1), and fit a linear model on it.
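A minimal sketch under assumed data (the target function, degree, and alpha are placeholders): PolynomialFeatures builds the Vandermonde-style design matrix, and Ridge fits the linear model on it.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
x = np.sort(rng.uniform(0, 10, size=30))   # n_samples 1d points
y = np.cos(x) + 0.1 * rng.randn(30)        # assumed noisy target function

# The degree-3 expansion plays the role of the Vandermonde matrix.
model = make_pipeline(PolynomialFeatures(degree=3), Ridge(alpha=1e-3))
model.fit(x[:, None], y)
print(model.predict([[5.0]]))
```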
Plot the decision function of a weighted dataset, where the size of each point is proportional to its weight.
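A minimal sketch of per-sample weighting, with an assumed classifier and synthetic weights: passing sample_weight to fit lets heavily weighted points dominate the decision function.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)
X = rng.randn(40, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)
weights = rng.uniform(0.1, 10.0, size=40)   # assumed per-sample weights

# sample_weight reweights each point's contribution to the loss.
clf = SGDClassifier(alpha=0.01, max_iter=1000)
clf.fit(X, y, sample_weight=weights)
print(clf.decision_function(X[:3]))
```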
Use the Akaike information criterion (AIC), the Bayesian information criterion (BIC) and cross-validation to select an optimal value of the regularization parameter alpha of the Lasso estimator.
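A minimal sketch of the three selection routes (the dataset is an assumption): LassoLarsIC handles the AIC and BIC criteria, while LassoCV selects alpha by cross-validation.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LassoCV, LassoLarsIC

X, y = load_diabetes(return_X_y=True)

# Information-criterion-based selection of alpha.
for criterion in ("aic", "bic"):
    ic_model = LassoLarsIC(criterion=criterion).fit(X, y)
    print(f"{criterion}: alpha = {ic_model.alpha_:.4f}")

# Cross-validated selection of alpha.
cv_model = LassoCV(cv=5).fit(X, y)
print(f"cv : alpha = {cv_model.alpha_:.4f}")
```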
Shown below are the decision boundaries of a logistic-regression classifier on the first two dimensions (sepal length and width) of the iris dataset.
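A minimal sketch of the fit itself (the plotting code is omitted): restrict iris to its first two features and train a logistic-regression classifier whose boundaries the example then draws.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X = X[:, :2]   # sepal length and sepal width only

clf = LogisticRegression().fit(X, y)
print(clf.predict(X[:5]), f"accuracy = {clf.score(X, y):.2f}")
```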
This example uses only the first feature of the diabetes dataset, in order to illustrate a two-dimensional plot of this regression technique. The straight line in the plot shows how linear regression attempts to minimize the residual sum of squares between the observed responses in the dataset and the responses predicted by the linear approximation.
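A minimal sketch (the feature index and the train/test split are assumptions): a one-feature ordinary-least-squares fit, whose prediction is a line that can be drawn in two dimensions.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X = X[:, [2]]   # a single feature, so the fit is a 2d line (assumed index)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = LinearRegression().fit(X_train, y_train)
print("slope:", reg.coef_[0], "R^2:", reg.score(X_test, y_test))
```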
Here a sine function is fit with a polynomial of order 3, for values close to zero. Robust fitting is demoed in different situations: no measurement errors (only modelling errors), measurement errors in X, and measurement errors in y.
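A minimal sketch of one such situation, corrupted y only (RANSAC stands in here for the example's full set of robust estimators): compare plain least squares with a robust fit on outlier-laden data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, RANSACRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(42)
x = rng.uniform(-1.5, 1.5, size=100)
y = np.sin(x)
y[::10] += 3 * rng.randn(10)   # inject large errors into every 10th y

# Order-3 polynomial fit with an ordinary vs. a robust estimator.
for name, est in [("OLS", LinearRegression()), ("RANSAC", RANSACRegressor())]:
    model = make_pipeline(PolynomialFeatures(degree=3), est)
    model.fit(x[:, None], y)
    print(name, model.predict([[0.5]]))
```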
We show that linear_model.Lasso provides the same results for dense and sparse data and that in the case of sparse data the speed is improved.
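A minimal sketch of the equivalence half of that claim (the synthetic data is assumed): fit the same Lasso on a dense array and on its sparse CSC form, then compare coefficients.

```python
import numpy as np
from scipy import sparse
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(200, 50)
X[X < 2.0] = 0.0                 # zero out most entries: sparse-friendly data
y = rng.randn(200)

dense_fit = Lasso(alpha=0.1).fit(X, y)
sparse_fit = Lasso(alpha=0.1).fit(sparse.csc_matrix(X), y)
print("coefficients match:",
      np.allclose(dense_fit.coef_, sparse_fit.coef_, atol=1e-6))
```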
Lasso and elastic net (L1 and L2 penalisation) implemented using coordinate descent. The coefficients can be forced to be positive.
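A minimal sketch (the dataset, alpha, and l1_ratio are arbitrary choices): coordinate-descent Lasso and ElasticNet, with positive=True enforcing nonnegative coefficients.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import ElasticNet, Lasso

X, y = load_diabetes(return_X_y=True)

lasso = Lasso(alpha=0.1).fit(X, y)
lasso_pos = Lasso(alpha=0.1, positive=True).fit(X, y)   # force coef >= 0
enet = ElasticNet(alpha=0.1, l1_ratio=0.7).fit(X, y)    # mixed L1/L2 penalty

print("smallest lasso coef:         ", lasso.coef_.min())
print("smallest positive-lasso coef:", lasso_pos.coef_.min())
print("elastic-net nonzero coefs:   ", int((enet.coef_ != 0).sum()))
```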
Shown in the plot is how logistic regression would, on this synthetic dataset, classify values as either 0 or 1, i.e. class one or two, using the logistic curve.
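A minimal sketch of such a synthetic dataset (the noise level and C value are assumptions): 1d points labelled by their sign, with predict_proba tracing out the logistic curve.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.RandomState(0)
x = rng.normal(size=100)
y = (x > 0).astype(int)          # class 1 right of zero, class 0 left
x += 0.4 * rng.normal(size=100)  # overlap the classes a little (assumed noise)

clf = LogisticRegression(C=1e5).fit(x[:, None], y)
print(clf.predict_proba([[-1.0], [0.0], [1.0]])[:, 1])  # P(class 1)
```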