Decision Tree Regression with AdaBoost

A decision tree is boosted using the AdaBoost.R2 [1] algorithm on a 1D sinusoidal dataset with a small amount of Gaussian noise. An ensemble of 299 boosts (300 decision trees) is compared with a single decision tree regressor; as the number of boosts increases, the regressor can fit more detail.
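A minimal sketch of this setup, assuming a noisy sine wave as the 1D dataset (the exact noise level and tree depth are illustrative):

    import numpy as np
    from sklearn.ensemble import AdaBoostRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.RandomState(1)
    X = np.sort(5 * rng.rand(80, 1), axis=0)            # 1D inputs
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=80)   # Gaussian noise

    # 300 depth-4 trees combined by AdaBoost.R2
    regr = AdaBoostRegressor(DecisionTreeRegressor(max_depth=4),
                             n_estimators=300, random_state=rng)
    regr.fit(X, y)
    y_pred = regr.predict(X)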

Gradient Boosting regularization

Illustration of the effect of different regularization strategies for Gradient Boosting. The example is taken from Hastie et al. (2009). The loss function used is binomial deviance; shrinkage (learning_rate < 1.0) and subsampling (subsample < 1.0) are compared.
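A sketch of how the compared strategies might be set up (the parameter grid here is an illustrative subset):

    from sklearn.datasets import make_hastie_10_2
    from sklearn.ensemble import GradientBoostingClassifier

    X, y = make_hastie_10_2(n_samples=4000, random_state=1)

    # Shrinkage (learning_rate < 1.0) and stochastic gradient boosting
    # (subsample < 1.0) are two of the regularization strategies compared.
    for params in ({'learning_rate': 1.0},
                   {'learning_rate': 0.1},
                   {'learning_rate': 0.1, 'subsample': 0.5}):
        clf = GradientBoostingClassifier(n_estimators=100, **params)
        clf.fit(X[:2000], y[:2000])
        print(params, clf.score(X[2000:], y[2000:]))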

Single estimator versus bagging

This example illustrates and compares the bias-variance decomposition of the expected mean squared error of a single estimator against a bagging ensemble.
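A minimal sketch of the two estimators being compared, on an assumed noisy 1D problem:

    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.RandomState(0)
    X = 10 * rng.rand(200, 1)
    y = np.sin(X).ravel() + 0.3 * rng.normal(size=200)

    # A single deep tree has low bias but high variance; averaging many
    # trees fit on bootstrap samples reduces the variance term.
    single = DecisionTreeRegressor().fit(X, y)
    bagged = BaggingRegressor(DecisionTreeRegressor(),
                              n_estimators=50, random_state=0).fit(X, y)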

Multi-class AdaBoosted Decision Trees

This example reproduces Figure 1 of Zhu et al. [1] and shows how boosting can improve prediction accuracy on a multi-class problem. The classification dataset is constructed from a ten-dimensional standard normal distribution, with three classes separated by nested concentric spheres so that roughly equal numbers of samples fall in each class.
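A rough sketch of the setup (sample counts and tree depth are assumptions; SAMME is the multi-class AdaBoost variant from Zhu et al., though its availability varies across scikit-learn versions):

    from sklearn.datasets import make_gaussian_quantiles
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_gaussian_quantiles(n_samples=3000, n_features=10,
                                   n_classes=3, random_state=1)

    clf = AdaBoostClassifier(DecisionTreeClassifier(max_depth=2),
                             n_estimators=300, algorithm='SAMME')
    clf.fit(X, y)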

Two-class AdaBoost

This example fits an AdaBoosted decision stump on a non-linearly separable classification dataset composed of two "Gaussian quantiles" clusters (see sklearn.datasets.make_gaussian_quantiles) and plots the decision boundary and decision scores.
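A sketch of the two-cluster construction and the boosted stump (cluster means and covariances are illustrative):

    import numpy as np
    from sklearn.datasets import make_gaussian_quantiles
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Two overlapping Gaussian-quantile clusters; flipping the labels of
    # the second cluster makes the classes non-linearly separable.
    X1, y1 = make_gaussian_quantiles(cov=2.0, n_samples=200,
                                     n_features=2, n_classes=2,
                                     random_state=1)
    X2, y2 = make_gaussian_quantiles(mean=(3, 3), cov=1.5, n_samples=300,
                                     n_features=2, n_classes=2,
                                     random_state=1)
    X = np.concatenate((X1, X2))
    y = np.concatenate((y1, 1 - y2))

    stump = DecisionTreeClassifier(max_depth=1)
    clf = AdaBoostClassifier(stump, n_estimators=200).fit(X, y)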

Plot class probabilities calculated by the VotingClassifier

Plot the class probabilities of the first sample in a toy dataset predicted by three different classifiers and averaged by the VotingClassifier.
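A minimal sketch with three assumed base classifiers and illustrative soft-voting weights:

    from sklearn.ensemble import RandomForestClassifier, VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB

    X = [[-1.0, -1.0], [-1.2, -1.4], [-3.4, -2.2], [1.1, 1.2]]
    y = [1, 1, 2, 2]

    # Soft voting averages the per-class probabilities, here with the
    # GaussianNB given five times the weight of the other two models.
    eclf = VotingClassifier(
        estimators=[('lr', LogisticRegression()),
                    ('rf', RandomForestClassifier()),
                    ('gnb', GaussianNB())],
        voting='soft', weights=[1, 1, 5]).fit(X, y)
    print(eclf.predict_proba(X[:1]))   # probabilities for the first sample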

Feature transformations with ensembles of trees

Transform your features into a higher-dimensional, sparse space, then train a linear model on these features. First fit an ensemble of trees (totally random trees, a random forest, or gradient boosted trees) on the training set; each leaf of each tree is then assigned a fixed index in the new feature space.
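A sketch of the pipeline using a random forest as the assumed ensemble (the example also covers other tree ensembles):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import OneHotEncoder

    X, y = make_classification(n_samples=2000, random_state=0)
    X_train, X_lr, y_train, y_lr = train_test_split(X, y, test_size=0.5,
                                                    random_state=0)

    # apply() maps each sample to the index of the leaf it lands in for
    # every tree; one-hot encoding those indices yields the sparse space.
    rf = RandomForestClassifier(n_estimators=10).fit(X_train, y_train)
    enc = OneHotEncoder(handle_unknown='ignore').fit(rf.apply(X_train))
    lr = LogisticRegression().fit(enc.transform(rf.apply(X_lr)), y_lr)

The linear model is fit on a held-out half of the data so that it does not learn from leaf assignments the forest has already overfit.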

Gradient Boosting regression

Demonstrate Gradient Boosting on the Boston housing dataset. This example fits a Gradient Boosting model with least squares loss and 500 regression trees of depth 4.
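A sketch of the model configuration; since load_boston has been removed from recent scikit-learn releases, load_diabetes stands in here as an assumed replacement dataset:

    from sklearn.datasets import load_diabetes   # stand-in for Boston
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                        random_state=9)

    # 500 depth-4 trees; the default loss is least squares
    est = GradientBoostingRegressor(n_estimators=500, max_depth=4,
                                    learning_rate=0.01)
    est.fit(X_train, y_train)
    print(mean_squared_error(y_test, est.predict(X_test)))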

Plot the decision surfaces of ensembles of trees on the iris dataset

Plot the decision surfaces of forests of randomized trees trained on pairs of features of the iris dataset. This plot compares the surfaces learned by a decision tree classifier, a random forest, an extra-trees classifier, and AdaBoost.
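A sketch of the ensembles being compared on one assumed feature pair (plotting code omitted; the surfaces come from predicting over a mesh grid):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import (AdaBoostClassifier, ExtraTreesClassifier,
                                  RandomForestClassifier)
    from sklearn.tree import DecisionTreeClassifier

    iris = load_iris()
    X, y = iris.data[:, [0, 1]], iris.target   # one pair of features

    models = [DecisionTreeClassifier(max_depth=None),
              RandomForestClassifier(n_estimators=30),
              ExtraTreesClassifier(n_estimators=30),
              AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                                 n_estimators=30)]
    for clf in models:
        clf.fit(X, y)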

Plot the decision boundaries of a VotingClassifier

Plot the decision boundaries of a VotingClassifier for two features of the Iris dataset. The class probabilities predicted by the individual classifiers are averaged (soft voting) to form the ensemble's decision regions.
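A minimal sketch, assuming the three base classifiers and soft-voting weights shown here are representative rather than prescribed:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    iris = load_iris()
    X, y = iris.data[:, [0, 2]], iris.target   # two features

    # probability=True lets SVC contribute to the soft-vote average
    eclf = VotingClassifier(
        estimators=[('dt', DecisionTreeClassifier(max_depth=4)),
                    ('knn', KNeighborsClassifier(n_neighbors=7)),
                    ('svc', SVC(probability=True))],
        voting='soft', weights=[2, 1, 2]).fit(X, y)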
