ADABOOST
A=ADABOOST(H) implements the basic boosting algorithm (AdaBoost) using H as the weak learner.
Hyperparameters (and their defaults):
kmax = 5 -- maximum number of weak learners
Methods:
train, test
Model:
child -- the trained weak learners
Example:
Use AdaBoost with 1-NN as the weak learner, validated with 2-fold cross-validation.
c1=[2,0];
c2=[-2,0];
X1=randn(50,2)+repmat(c1,50,1);
X2=randn(50,2)+repmat(c2,50,1);
d=data([X1;X2],[ones(50,1);-ones(50,1)]);  % combine into a labeled dataset
[r,a]=train(cv(adaboost(knn),'folds=2'),d);
Reference : Robert E. Schapire, "The boosting approach to machine learning: An overview."
There is more than one adaboost available. See also
help adaboost/adaboost.m
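For readers unfamiliar with the algorithm itself, the training loop behind this object can be sketched outside MATLAB. The following is a minimal, self-contained Python sketch of basic AdaBoost, using axis-aligned decision stumps in place of the pluggable weak learner H; all function names (`train_adaboost`, `predict_adaboost`) and the stump search are illustrative assumptions, not part of this toolbox.

```python
import numpy as np

def train_adaboost(X, y, kmax=5):
    """Minimal AdaBoost sketch. Weak learner: axis-aligned decision
    stumps found by exhaustive search. Labels y must be in {-1, +1}.
    kmax mirrors the toolbox hyperparameter (max number of rounds)."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform example weights
    stumps, alphas = [], []
    for _ in range(kmax):
        best, best_err = None, np.inf
        # search every (feature, threshold, sign) stump
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.sign(X[:, j] - t)
                    pred[pred == 0] = s
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        best_err = max(best_err, 1e-12)  # guard against log(0)
        alpha = 0.5 * np.log((1 - best_err) / best_err)
        j, t, s = best
        pred = s * np.sign(X[:, j] - t)
        pred[pred == 0] = s
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()
        stumps.append(best)
        alphas.append(alpha)
    return stumps, alphas

def predict_adaboost(stumps, alphas, X):
    """Weighted vote of the trained stumps."""
    F = np.zeros(len(X))
    for (j, t, s), a in zip(stumps, alphas):
        pred = s * np.sign(X[:, j] - t)
        pred[pred == 0] = s
        F += a * pred
    return np.sign(F)
```

Run on two Gaussian blobs (the same setup as the MATLAB example above) the combined classifier separates the classes almost perfectly after a handful of rounds.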