preprocessing.PolynomialFeatures()
  • References/Python/scikit-learn/API Reference/preprocessing

class sklearn.preprocessing.PolynomialFeatures(degree=2, interaction_only=False, include_bias=True)
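A minimal sketch of how this transformer expands features (the input values here are arbitrary illustrations): with `degree=2` and `include_bias=True`, each row `[a, b]` becomes `[1, a, b, a^2, a*b, b^2]`.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# Two samples with two features each.
X = np.array([[2, 3],
              [4, 5]])

# degree=2 expands each row [a, b] to [1, a, b, a^2, a*b, b^2].
poly = PolynomialFeatures(degree=2, include_bias=True)
X_poly = poly.fit_transform(X)

print(X_poly)
# First row: [1. 2. 3. 4. 6. 9.]
```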

2025-01-10 15:47:30
model_selection.KFold()
  • References/Python/scikit-learn/API Reference/model_selection

class sklearn.model_selection.KFold(n_splits=3, shuffle=False, random_state=None)
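A short sketch of the splitting behavior (sample data is made up for illustration): with `n_splits=3` on six samples, each fold of two samples is held out exactly once.

```python
import numpy as np
from sklearn.model_selection import KFold

X = np.arange(12).reshape(6, 2)

# n_splits=3: each fold holds out 2 of the 6 samples exactly once.
kf = KFold(n_splits=3)
splits = list(kf.split(X))

for train_idx, test_idx in splits:
    print("train:", train_idx, "test:", test_idx)
```

With `shuffle=False` (the default), folds are contiguous blocks in index order; pass `shuffle=True` with a `random_state` for randomized folds.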

Selecting the number of clusters with silhouette analysis on KMeans clustering
  • References/Python/scikit-learn/Examples/Clustering

Silhouette analysis can be used to study the separation distance between the resulting clusters. The silhouette plot displays a measure of how close each point in one cluster is to points in the neighboring clusters.
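A minimal sketch of the idea, using synthetic data (the blob parameters and candidate range of cluster counts are assumptions for illustration): fit KMeans for several values of `n_clusters` and pick the one with the highest average silhouette score.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

# Synthetic data with 3 well-separated blobs (assumed for illustration).
X, _ = make_blobs(n_samples=300, centers=3, cluster_std=0.6, random_state=0)

# Compare average silhouette scores across candidate cluster counts.
scores = {}
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print(best_k)
```

The full example in the scikit-learn gallery additionally draws a per-sample silhouette plot, which this sketch omits.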

sklearn.metrics.homogeneity_completeness_v_measure()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.homogeneity_completeness_v_measure(labels_true, labels_pred)
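A small sketch of the three scores on hand-picked labelings: a clustering where every predicted cluster contains a single true class is perfectly homogeneous, but splitting one class across clusters costs completeness.

```python
from sklearn.metrics import homogeneity_completeness_v_measure

labels_true = [0, 0, 1, 1]
# Each predicted cluster contains only one true class (homogeneous),
# but class 1 is split across two clusters (not complete).
labels_pred = [0, 0, 1, 2]

h, c, v = homogeneity_completeness_v_measure(labels_true, labels_pred)
print(h, c, v)  # homogeneity == 1.0, completeness < 1.0
```

The v-measure is the harmonic mean of homogeneity and completeness, so it sits between the two.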

sklearn.metrics.pairwise.pairwise_distances()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.pairwise.pairwise_distances(X, Y=None, metric='euclidean', n_jobs=1, **kwds)
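A minimal sketch (points chosen so the distance is easy to verify): with `Y=None`, distances are computed between all pairs of rows of `X`.

```python
import numpy as np
from sklearn.metrics.pairwise import pairwise_distances

X = np.array([[0.0, 0.0],
              [3.0, 4.0]])

# With Y=None, distances are computed between all rows of X.
D = pairwise_distances(X, metric='euclidean')
print(D)
# [[0. 5.]
#  [5. 0.]]
```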

Libsvm GUI
  • References/Python/scikit-learn/Examples/Examples based on real world datasets

A simple graphical frontend for Libsvm, intended mainly for didactic purposes. You can create data points by pointing and clicking and visualize the decision region induced by different kernels and parameter settings.

kernel_approximation.Nystroem()
  • References/Python/scikit-learn/API Reference/kernel_approximation

class sklearn.kernel_approximation.Nystroem(kernel='rbf', gamma=None, coef0=1, degree=3, kernel_params=None, n_components=100, random_state=None)
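A sketch of what the approximation buys (data shape, `gamma`, and `n_components` are assumptions for illustration): Nystroem builds a low-rank feature map from sampled landmark points so that inner products of the mapped features approximate the exact kernel matrix.

```python
import numpy as np
from sklearn.kernel_approximation import Nystroem
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.RandomState(0)
X = rng.randn(100, 4)

# Low-rank feature map built from n_components sampled landmark points.
feature_map = Nystroem(kernel='rbf', gamma=0.5, n_components=50, random_state=0)
X_features = feature_map.fit_transform(X)

# Inner products of mapped features approximate the exact RBF kernel.
K_approx = X_features @ X_features.T
K_exact = rbf_kernel(X, gamma=0.5)
print(np.abs(K_approx - K_exact).max())
```

The transformed features can then be fed to a linear model (e.g. `SGDClassifier`) as a cheaper stand-in for a full kernel method.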

Plot Ridge coefficients as a function of the L2 regularization
  • References/Python/scikit-learn/Examples/Generalized Linear Models

Ridge regression is the estimator used in this example. Each color in the left plot represents a different dimension of the coefficient vector, displayed as a function of the regularization parameter. The right plot shows how exact the solution is: it displays how the difference of the coefficients from the estimator changes as a function of regularization. This example illustrates how Ridge regression finds a well-defined solution and how regularization affects the coefficients and their values.
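The sweep underlying the left plot can be sketched as follows (the data, true coefficients, and alpha grid are made up for illustration): fit `Ridge` across a range of `alpha` values and record the coefficient vector at each one.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.randn(50, 5)
true_coef = np.array([1.0, 2.0, -1.0, 0.5, 3.0])
y = X @ true_coef

# Sweep the regularization strength and record the fitted coefficients.
alphas = np.logspace(-3, 3, 7)
coefs = []
for alpha in alphas:
    ridge = Ridge(alpha=alpha).fit(X, y)
    coefs.append(ridge.coef_)
coefs = np.array(coefs)

# Strong regularization shrinks coefficients toward zero.
print(np.abs(coefs[0]).sum() > np.abs(coefs[-1]).sum())  # True
```

Plotting each column of `coefs` against `alphas` on a log scale reproduces the characteristic shrinkage paths.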

model_selection.TimeSeriesSplit()
  • References/Python/scikit-learn/API Reference/model_selection

class sklearn.model_selection.TimeSeriesSplit(n_splits=3)
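A minimal sketch of the ordering guarantee (sample count is arbitrary): every training set contains only observations that occur before the corresponding test set, and samples are never shuffled across time.

```python
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

X = np.arange(8).reshape(8, 1)

# Each training set precedes its test set in time; no shuffling.
tscv = TimeSeriesSplit(n_splits=3)
splits = list(tscv.split(X))

for train_idx, test_idx in splits:
    print("train:", train_idx, "test:", test_idx)
```

Successive folds grow the training window, which makes this splitter a natural fit for walk-forward validation of forecasting models.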

sklearn.metrics.pairwise.rbf_kernel()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.pairwise.rbf_kernel(X, Y=None, gamma=None)
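A small sketch with values chosen so the result is easy to check by hand: the kernel is `K(x, y) = exp(-gamma * ||x - y||^2)`, and `gamma` defaults to `1 / n_features` when left as `None`.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

X = np.array([[0.0, 0.0],
              [1.0, 0.0]])

# K(x, y) = exp(-gamma * ||x - y||^2); gamma=None would default to 1/n_features.
K = rbf_kernel(X, gamma=0.5)
print(K)
# Diagonal is exp(0) == 1; off-diagonal is exp(-0.5 * 1) ~= 0.6065
```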
