3.3. Model evaluation: quantifying the quality of predictions
  • References/Python/scikit-learn/Guide

There are 3 different approaches to evaluating the quality of a model's predictions: the estimator score method, the scoring parameter, and metric functions.
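
A minimal sketch of the three approaches, using LogisticRegression on synthetic data (both chosen only for illustration):

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=100, random_state=0)
clf = LogisticRegression().fit(X, y)

print(clf.score(X, y))                                 # 1) estimator score method
print(cross_val_score(clf, X, y, scoring='accuracy'))  # 2) scoring parameter
print(accuracy_score(y, clf.predict(X)))               # 3) metric function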

2025-01-10 15:47:30
sklearn.preprocessing.scale()
  • References/Python/scikit-learn/API Reference/preprocessing

sklearn.preprocessing.scale(X, axis=0, with_mean=True, with_std=True, copy=True)
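
A small usage sketch (the input matrix is illustrative):

import numpy as np
from sklearn.preprocessing import scale

X = np.array([[1., -1., 2.], [2., 0., 0.], [0., 1., -1.]])
X_scaled = scale(X)            # center each column to zero mean and unit variance
print(X_scaled.mean(axis=0))   # approximately [0. 0. 0.]
print(X_scaled.std(axis=0))    # [1. 1. 1.]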

2025-01-10 15:47:30
Supervised learning
  • References/Python/scikit-learn/Tutorials

The problem solved in supervised learning
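
A minimal sketch of the supervised-learning workflow, assuming the iris dataset and a k-nearest-neighbors classifier as illustrative choices:

from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

iris = datasets.load_iris()
knn = KNeighborsClassifier()
knn.fit(iris.data[:-10], iris.target[:-10])  # learn the link between observations and labels
print(knn.predict(iris.data[-10:]))          # predict labels for held-out observations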

2025-01-10 15:47:30
sklearn.metrics.f1_score()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.f1_score(y_true, y_pred, labels=None, pos_label=1, average='binary', sample_weight=None)
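
A small usage sketch with illustrative binary labels:

from sklearn.metrics import f1_score

y_true = [0, 1, 1, 0, 1, 1]
y_pred = [0, 1, 0, 0, 1, 1]
print(f1_score(y_true, y_pred))                   # binary F1 for the positive class (pos_label=1)
print(f1_score(y_true, y_pred, average='macro'))  # unweighted mean of per-class F1 scores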

2025-01-10 15:47:30
sklearn.metrics.silhouette_score()
  • References/Python/scikit-learn/API Reference/metrics

sklearn.metrics.silhouette_score(X, labels, metric='euclidean', sample_size=None, random_state=None, **kwds)
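
A small usage sketch, assuming KMeans on synthetic blobs purely for illustration:

from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
labels = KMeans(n_clusters=3, random_state=0).fit_predict(X)
print(silhouette_score(X, labels, metric='euclidean'))  # mean silhouette coefficient over all samples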

2025-01-10 15:47:30
SVM-Kernels
  • References/Python/scikit-learn/Examples/Support Vector Machines

Three different types of SVM kernels are displayed below. The polynomial and RBF kernels are especially useful when the data points are not linearly separable.
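
A minimal sketch comparing the three kernels on a synthetic, non-linearly-separable dataset (make_circles is an illustrative choice):

from sklearn import svm
from sklearn.datasets import make_circles

X, y = make_circles(noise=0.1, factor=0.4, random_state=0)  # concentric circles: not linearly separable
for kernel in ('linear', 'poly', 'rbf'):
    clf = svm.SVC(kernel=kernel).fit(X, y)
    print(kernel, clf.score(X, y))  # the linear kernel struggles; poly/rbf fit the circular boundary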

2025-01-10 15:47:30
feature_selection.RFE()
  • References/Python/scikit-learn/API Reference/feature_selection

class sklearn.feature_selection.RFE(estimator, n_features_to_select=None, step=1, verbose=0)
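
A small usage sketch, assuming a linear SVR as the illustrative estimator:

from sklearn.datasets import make_friedman1
from sklearn.feature_selection import RFE
from sklearn.svm import SVR

X, y = make_friedman1(n_samples=50, n_features=10, random_state=0)
selector = RFE(estimator=SVR(kernel='linear'), n_features_to_select=5, step=1)
selector = selector.fit(X, y)
print(selector.support_)  # boolean mask of the selected features
print(selector.ranking_)  # rank 1 marks the selected features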

2025-01-10 15:47:30
exceptions.NonBLASDotWarning
  • References/Python/scikit-learn/API Reference/exceptions

class sklearn.exceptions.NonBLASDotWarning
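
A minimal sketch for silencing this warning with the standard warnings module (applicable only to scikit-learn versions that still define the class):

import warnings
from sklearn.exceptions import NonBLASDotWarning

# ignore the warning raised when a dot product falls back to a non-BLAS implementation
warnings.simplefilter('ignore', NonBLASDotWarning)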

2025-01-10 15:47:30
ensemble.IsolationForest()
  • References/Python/scikit-learn/API Reference/ensemble

class sklearn.ensemble.IsolationForest(n_estimators=100, max_samples='auto', contamination=0.1, max_features=1.0, bootstrap=False, n_jobs=1, random_state=None, verbose=0)
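
A minimal usage sketch on synthetic data (training points and outliers are illustrative):

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(42)
X_train = 0.3 * rng.randn(100, 2)                         # dense cluster of normal observations
X_new = rng.uniform(low=-4, high=4, size=(20, 2))         # scattered points, mostly anomalous

clf = IsolationForest(n_estimators=100, contamination=0.1, random_state=rng)
clf.fit(X_train)
print(clf.predict(X_new))  # -1 flags anomalies, +1 flags inliers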

2025-01-10 15:47:30
Linear and Quadratic Discriminant Analysis with confidence ellipsoid
  • References/Python/scikit-learn/Examples/Classification

Plot the confidence ellipsoids of each class and the decision boundary.
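
A minimal sketch fitting both estimators on synthetic data (plotting of the ellipsoids is omitted):

from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis

X, y = make_classification(n_samples=200, n_features=2, n_redundant=0, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X, y)     # linear decision boundary
qda = QuadraticDiscriminantAnalysis().fit(X, y)  # quadratic decision boundary
print(lda.score(X, y), qda.score(X, y))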

2025-01-10 15:47:30