Hashing feature transformation using Totally Random Trees

RandomTreesEmbedding provides a way to map data to a very high-dimensional, sparse representation, which might be beneficial for classification.
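
A minimal sketch of the idea, assuming an illustrative dataset and hyperparameters (make_circles, 10 trees of depth 3) rather than the example's exact settings:

    from sklearn.datasets import make_circles
    from sklearn.ensemble import RandomTreesEmbedding

    X, y = make_circles(factor=0.5, noise=0.05, random_state=0)

    # Each sample is encoded by the index of the leaf it lands in, per tree,
    # which yields a sparse one-hot representation with one column per leaf.
    hasher = RandomTreesEmbedding(n_estimators=10, max_depth=3, random_state=0)
    X_transformed = hasher.fit_transform(X)
    print(X_transformed.shape)  # (n_samples, total number of leaves)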

Gradient Boosting Out-of-Bag estimates

Out-of-bag (OOB) estimates can be a useful heuristic to estimate the "optimal" number of boosting iterations. OOB estimates are almost identical to cross-validation estimates, but they can be computed on the fly without the need for repeated model fitting.
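
A minimal sketch, assuming a synthetic dataset; subsample < 1.0 (stochastic gradient boosting) is required for OOB estimates to exist:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_classification(n_samples=1000, random_state=0)
    clf = GradientBoostingClassifier(n_estimators=200, subsample=0.5,
                                     random_state=0)
    clf.fit(X, y)

    # oob_improvement_[i] is the OOB loss improvement at iteration i; the
    # argmax of its cumulative sum is a heuristic for the best iteration count.
    best_n = int(np.argmax(np.cumsum(clf.oob_improvement_))) + 1
    print("OOB-estimated number of iterations:", best_n)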

OOB Errors for Random Forests

The RandomForestClassifier is trained using bootstrap aggregation, where each new tree is fit from a bootstrap sample of the training observations.
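
A minimal sketch with an illustrative synthetic dataset; setting oob_score=True makes the forest score each sample using only the trees whose bootstrap sample excluded it:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=500, random_state=0)
    forest = RandomForestClassifier(n_estimators=100, oob_score=True,
                                    random_state=0)
    forest.fit(X, y)

    # OOB error = 1 - OOB accuracy, computed without a held-out set.
    print("OOB error rate:", 1 - forest.oob_score_)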

Feature importances with forests of trees

This example shows the use of forests of trees to evaluate the importance of features on an artificial classification task. The red bars are the feature importances of the forest, along with their inter-trees variability.
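
A minimal sketch, assuming a make_classification task with a few informative features; the forest type and sizes here are illustrative:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import ExtraTreesClassifier

    X, y = make_classification(n_samples=1000, n_features=10,
                               n_informative=3, random_state=0)
    forest = ExtraTreesClassifier(n_estimators=250, random_state=0)
    forest.fit(X, y)

    # Impurity-based importances, averaged over all trees in the forest.
    for i, imp in enumerate(forest.feature_importances_):
        print(f"feature {i}: {imp:.3f}")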

Discrete versus Real AdaBoost

This example is based on Figure 10.2 from Hastie et al. 2009 [1] and illustrates the difference in performance between the discrete SAMME [2] boosting algorithm and the real SAMME.R boosting algorithm.
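
A minimal sketch of the comparison, assuming the make_hastie_10_2 dataset from that figure and a scikit-learn version that still supports both values of the algorithm parameter:

    from sklearn.datasets import make_hastie_10_2
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_hastie_10_2(n_samples=4000, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # The default base learner is a depth-1 decision stump, as in the figure.
    for algo in ("SAMME", "SAMME.R"):
        clf = AdaBoostClassifier(algorithm=algo, n_estimators=200,
                                 random_state=0)
        print(algo, clf.fit(X_tr, y_tr).score(X_te, y_te))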

Partial Dependence Plots

Partial dependence plots show the dependence between the target function and a set of "target" features, marginalizing over the values of all other features.
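
A minimal sketch, assuming a recent scikit-learn where the plotting helper is PartialDependenceDisplay in sklearn.inspection (older versions kept it under sklearn.ensemble.partial_dependence); the dataset is illustrative:

    import matplotlib.pyplot as plt
    from sklearn.datasets import make_friedman1
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.inspection import PartialDependenceDisplay

    X, y = make_friedman1(random_state=0)
    est = GradientBoostingRegressor(random_state=0).fit(X, y)

    # Marginal effect of features 0 and 1 on the predicted target.
    PartialDependenceDisplay.from_estimator(est, X, features=[0, 1])
    plt.show()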

Prediction Intervals for Gradient Boosting Regression

This example shows how quantile regression can be used to create prediction intervals.
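
A minimal sketch: fitting two models with the quantile loss at alpha=0.05 and alpha=0.95 yields a 90% prediction interval; the dataset and quantiles are illustrative:

    from sklearn.datasets import make_friedman1
    from sklearn.ensemble import GradientBoostingRegressor

    X, y = make_friedman1(noise=1.0, random_state=0)

    lower = GradientBoostingRegressor(loss="quantile", alpha=0.05,
                                      random_state=0).fit(X, y)
    upper = GradientBoostingRegressor(loss="quantile", alpha=0.95,
                                      random_state=0).fit(X, y)

    # Each point gets an interval [lower.predict(x), upper.predict(x)].
    print(lower.predict(X[:3]))
    print(upper.predict(X[:3]))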

IsolationForest example

An example using IsolationForest for anomaly detection. The IsolationForest "isolates" observations by randomly selecting a feature and then randomly selecting a split value between the maximum and minimum values of the selected feature.
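
A minimal sketch, assuming illustrative Gaussian training data and uniformly scattered outliers:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.RandomState(42)
    X_train = 0.3 * rng.randn(100, 2)          # regular observations
    X_outliers = rng.uniform(-4, 4, (20, 2))   # scattered anomalies

    clf = IsolationForest(random_state=rng).fit(X_train)

    # predict() returns +1 for inliers and -1 for anomalies.
    print(clf.predict(X_outliers))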

Pixel importances with a parallel forest of trees

This example shows the use of forests of trees to evaluate the importance of the pixels in an image classification task (faces). The hotter the pixel, the more important it is.
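
A minimal sketch, assuming the Olivetti faces dataset (downloaded on first use) and an illustrative forest size; n_jobs=-1 fits the trees in parallel:

    import matplotlib.pyplot as plt
    from sklearn.datasets import fetch_olivetti_faces
    from sklearn.ensemble import ExtraTreesClassifier

    data = fetch_olivetti_faces()
    forest = ExtraTreesClassifier(n_estimators=750, n_jobs=-1, random_state=0)
    forest.fit(data.data, data.target)

    # Reshape per-pixel importances back to image form and plot as a heat map.
    importances = forest.feature_importances_.reshape(data.images[0].shape)
    plt.matshow(importances, cmap=plt.cm.hot)
    plt.title("Pixel importances")
    plt.show()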

Comparing random forests and the multi-output meta estimator

An example to compare multi-output regression with random forest and the multioutput.MultiOutputRegressor meta-estimator.
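
A minimal sketch of the comparison, assuming illustrative sinusoidal data: a RandomForestRegressor fits all outputs jointly, while MultiOutputRegressor fits one forest per target:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.multioutput import MultiOutputRegressor

    rng = np.random.RandomState(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.column_stack([np.sin(X).ravel(), np.cos(X).ravel()])
    y += 0.1 * rng.randn(200, 2)

    joint = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    per_target = MultiOutputRegressor(
        RandomForestRegressor(n_estimators=100, random_state=0)).fit(X, y)

    print(joint.score(X, y), per_target.score(X, y))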
