We show that linear_model.Lasso provides the same results for dense and sparse data, and that with sparse data the fit is faster.
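A minimal sketch of the comparison on toy data (the matrix, alpha, and tolerance values are assumed, not taken from the original example):

```python
import numpy as np
from scipy import sparse
from sklearn.linear_model import Lasso

rng = np.random.RandomState(0)
X = rng.randn(60, 10)
X[X < 1.0] = 0.0              # zero out most entries so a sparse format pays off
y = rng.randn(60)

# Fit the same model on the dense array and on a sparse copy of it
dense_lasso = Lasso(alpha=0.1, max_iter=10000).fit(X, y)
sparse_lasso = Lasso(alpha=0.1, max_iter=10000).fit(sparse.csc_matrix(X), y)

# The coefficients agree to numerical precision
coef_gap = np.abs(dense_lasso.coef_ - sparse_lasso.coef_).max()
```

On genuinely sparse, larger problems the sparse fit is also noticeably faster, since the coordinate-descent solver skips the zero entries.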
sklearn.linear_model.lasso_stability_path(X, y, scaling=0.5, random_state=None, n_resampling=200, n_grid=100, sample_fraction=0.75, eps=8.8817841970012523e-16, n_jobs=1, verbose=False)
class sklearn.cluster.SpectralClustering(n_clusters=8, eigen_solver=None, random_state=None, n_init=10, gamma=1.0, affinity='rbf', n_neighbors=10, eigen_tol=0.0, assign_labels='kmeans', degree=3, coef0=1, kernel_params=None)
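A minimal usage sketch with the default RBF affinity (the blob data and parameter values are assumed for illustration):

```python
from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_blobs

# Two well-separated Gaussian blobs as toy data
X, _ = make_blobs(n_samples=60, centers=2, cluster_std=0.5, random_state=0)

model = SpectralClustering(n_clusters=2, affinity='rbf', gamma=1.0,
                           n_init=10, random_state=0)
labels = model.fit_predict(X)   # one integer cluster label per sample
```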
Here a sine function is fit with a polynomial of order 3, for values close to zero. Robust fitting is demoed in different situations, starting from data with no measurement errors, only modelling errors (fitting a sine with a polynomial).
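A sketch of the setup, using RANSAC as one possible robust estimator (the original example compares several; the data sizes and outlier scheme here are assumed):

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(0)
x = rng.uniform(-1.5, 1.5, 80)
y = np.sin(x)
y[::10] += 3.0 * rng.randn(8)   # inject a few gross outliers

# Order-3 polynomial features, fit robustly with RANSAC
model = make_pipeline(PolynomialFeatures(degree=3),
                      RANSACRegressor(random_state=0))
model.fit(x[:, np.newaxis], y)
pred = model.predict(x[:, np.newaxis])
```

An ordinary least-squares fit on the same data would be pulled toward the outliers; RANSAC fits on a consensus set of inliers instead.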
The cross decomposition module contains two main families of algorithms: the partial least squares (PLS) and the canonical correlation analysis (CCA). These families of algorithms are useful to find linear relations between two multivariate datasets.
This example shows the use of a multi-output estimator to complete images. The goal is to predict the lower half of a face given its upper half.
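The idea can be sketched with toy "images" and a nearest-neighbours regressor, one of the estimators the original example uses (the image size and data here are assumed):

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.RandomState(0)
faces = rng.rand(40, 16)     # toy stand-ins for flattened face images
X_upper = faces[:, :8]       # upper half of each image: the input
Y_lower = faces[:, 8:]       # lower half: a multi-output target

estimator = KNeighborsRegressor(n_neighbors=3).fit(X_upper, Y_lower)
completed = estimator.predict(X_upper[:5])   # all 8 lower pixels predicted at once
```

Because `Y_lower` has several columns, a single `fit` produces an estimator that predicts every lower-half pixel jointly.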
class sklearn.svm.NuSVC(nu=0.5, kernel='rbf', degree=3, gamma='auto', coef0=0.0, shrinking=True, probability=False, tol=0.001, cache_size=200, class_weight=None, verbose=False, max_iter=-1, decision_function_shape='ovr', random_state=None)
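A minimal usage sketch (the dataset is assumed). Note that `nu` upper-bounds the fraction of margin errors, so training accuracy stays at least `1 - nu`:

```python
from sklearn.datasets import make_classification
from sklearn.svm import NuSVC

X, y = make_classification(n_samples=100, n_features=5, random_state=0)

clf = NuSVC(nu=0.5, kernel='rbf').fit(X, y)
train_acc = clf.score(X, y)   # at least 1 - nu by the nu-SVC error bound
```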
This is an example showing how scikit-learn can be used to cluster documents by topics using a bag-of-words approach. This example uses a scipy.sparse matrix to store the features instead of standard numpy arrays.
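A compressed sketch of the pipeline (the documents and cluster count are assumed; the original example works on the 20 newsgroups corpus):

```python
from scipy import sparse
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "the cat sat on the mat",
    "a cat and a dog played",
    "stock market prices crashed",
    "the market rallied as prices rose",
]

# Bag-of-words features stored as a scipy.sparse document-term matrix
X = TfidfVectorizer().fit_transform(docs)

# KMeans accepts the sparse matrix directly
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```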
This example illustrates the predicted probability of Gaussian process classification (GPC) for an RBF kernel with different choices of the hyperparameters.
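A minimal sketch of fitting a GPC with one RBF hyperparameter choice (the 1D data and kernel scales are assumed):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.RandomState(0)
X = rng.rand(40, 1)
y = (X[:, 0] > 0.5).astype(int)   # simple threshold labels

# Constant * RBF kernel; length_scale is the hyperparameter being varied
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0).fit(X, y)
proba = gpc.predict_proba(X)      # class probabilities, one row per sample
```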
Compute the segmentation of a 2D image with Ward hierarchical clustering. The clustering is spatially constrained in order for each segmented region to be in one piece.
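The spatial constraint comes from a pixel-grid connectivity matrix: only neighbouring pixels may be merged. A sketch on a toy image (the image and cluster count are assumed):

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.feature_extraction.image import grid_to_graph

img = np.zeros((10, 10))
img[:, 5:] = 1.0                          # a 2D "image" with two flat regions

connectivity = grid_to_graph(*img.shape)  # adjacency of the pixel grid
ward = AgglomerativeClustering(n_clusters=2, linkage='ward',
                               connectivity=connectivity)
labels = ward.fit_predict(img.reshape(-1, 1))   # one label per pixel
```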