Boosting (machine learning)

In machine learning, boosting is an ensemble meta-algorithm primarily for reducing bias, and also variance, in supervised learning, and a family of machine learning algorithms that convert weak learners to strong ones.
Related articles

AdaBoost

AdaBoost, short for Adaptive Boosting, is a machine learning meta-algorithm formulated by Yoav Freund and Robert Schapire, who won the 2003 Gödel Prize for their work. Unlike the original boosting procedures, AdaBoost is adaptive, adjusting each round to the errors made by the weak learners chosen so far.
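The core loop is short enough to sketch. The following is an illustrative Python implementation of two-class discrete AdaBoost, not Freund and Schapire's reference code; the scikit-learn decision stumps and the synthetic dataset are assumptions made for the demo.

```python
# Illustrative sketch of two-class discrete AdaBoost; weak learners are
# depth-1 decision trees ("stumps"). The dataset and the choice of base
# learner are assumptions for this demo, not part of the algorithm itself.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                          # relabel classes to {-1, +1}

n, rounds = len(X), 50
w = np.full(n, 1.0 / n)                # start with uniform data weights
stumps, alphas = [], []

for _ in range(rounds):
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = np.sum(w[pred != y])         # weighted training error
    if err <= 0 or err >= 0.5:         # weak-learning assumption violated
        break
    alpha = 0.5 * np.log((1 - err) / err)   # vote weight of this learner
    w *= np.exp(-alpha * y * pred)          # re-weight: mistakes gain weight
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# The strong learner is the sign of the weighted vote of the weak ones.
F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", np.mean(np.sign(F) == y))
```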

Ensemble learning

Ensemble learning approaches based on artificial neural networks, kernel principal component analysis (KPCA), boosted decision trees, random forests, and the automatic design of multiple classifier systems have been proposed to efficiently identify land-cover objects.

BrownBoost

There are many more recent algorithms, such as LPBoost, TotalBoost, BrownBoost, XGBoost, MadaBoost, LogitBoost, and others.
BrownBoost is a boosting algorithm that may be robust to noisy datasets.

Robert Schapire

Robert Schapire's affirmative answer, in a 1990 paper, to the question of Kearns and Valiant has had significant ramifications in machine learning and statistics, most notably leading to the development of boosting. The original boosting algorithms, proposed by Schapire (a recursive majority-gate formulation) and Yoav Freund (boost by majority), were not adaptive and could not take full advantage of the weak learners.

LPBoost

Linear Programming Boosting (LPBoost) is a supervised classifier from the boosting family.

Supervised learning

Boosting is a supervised learning technique: both the weak learners and the final strong learner are trained on labeled examples.

LogitBoost

In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani.
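The relationship to AdaBoost is easiest to see through the loss each method minimizes over an additive model F(x) = Σ_t α_t h_t(x) with labels y_i ∈ {-1, +1}. The following is a standard summary, sketched here with the scaling conventions for F suppressed:

```latex
% AdaBoost greedily minimizes the exponential loss:
\sum_{i=1}^{n} \exp\!\bigl(-y_i F(x_i)\bigr)
% LogitBoost instead minimizes the logistic log-loss:
\sum_{i=1}^{n} \log\!\bigl(1 + \exp(-y_i F(x_i))\bigr)
```

The logistic loss grows only linearly for badly misclassified points, which is one reason LogitBoost can be less sensitive to outliers than AdaBoost.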

Michael Kearns (computer scientist)

Boosting is based on the question posed by Kearns and Valiant (1988, 1989): "Can a set of weak learners create a single strong learner?"
The question "Is weak learnability equivalent to strong learnability?", posed by Kearns and Valiant (unpublished manuscript, 1988; ACM Symposium on Theory of Computing, 1989), is the origin of boosting algorithms. Robert Schapire answered it positively in 1990 with a proof by construction that was not practical, as did Yoav Freund in 1993 with a voting scheme that was also not practical; the two then developed the practical AdaBoost (European Conference on Computational Learning Theory, 1995; Journal of Computer and System Sciences, 1997), an adaptive boosting algorithm that won the prestigious Gödel Prize (2003).

Statistical classification

A weak learner is defined to be a classifier that is only slightly correlated with the true classification (it can label examples better than random guessing).
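A minimal sketch of what "better than random guessing" means in practice; the data-generating process below is invented for the illustration:

```python
# Illustrative sketch of a weak learner: a one-feature threshold rule.
# On this synthetic task it is only slightly better than the 50% accuracy
# of random guessing, which is all the weak-learning definition requires.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 5))
# The true label depends on a noisy combination of all five features...
y = np.sign(X.sum(axis=1) + rng.normal(scale=2.0, size=n))

# ...but the weak learner only thresholds the first feature at zero.
pred = np.sign(X[:, 0])
print("weak-learner accuracy:", np.mean(pred == y))  # ~0.6, above 0.5
```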

Gradient boosting

Gradient boosting builds the model in a stage-wise fashion, like other boosting methods, and generalizes them by allowing optimization of an arbitrary differentiable loss function.
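A minimal sketch of the stage-wise idea for regression with squared loss, where the negative gradient of the loss is simply the residual; the shallow scikit-learn regression tree and the toy data are assumptions for the demo:

```python
# Illustrative sketch of gradient boosting for regression with squared
# loss: each stage fits a small tree to the current residuals (the
# negative gradient) and adds it to the ensemble with a learning rate.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=500)

nu, trees = 0.1, []                  # learning rate and the ensemble
F = np.full(len(y), y.mean())        # stage 0: constant prediction
for _ in range(200):
    residual = y - F                 # negative gradient of (1/2)(y - F)^2
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residual)            # fit the new stage to the residuals
    F += nu * tree.predict(X)        # stage-wise additive update
    trees.append(tree)

print("training MSE:", np.mean((y - F) ** 2))
```

Swapping the squared loss for another differentiable loss only changes how the residual (the negative gradient) is computed; the stage-wise loop stays the same.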

Margin classifier

Of particular prominence are the margin-based generalization error bounds for boosting algorithms and support vector machines.
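One influential bound of this type, due to Schapire, Freund, Bartlett, and Lee (1998), can be stated informally as follows; constants and logarithmic factors are suppressed, so treat this as a sketch of the theorem's shape rather than its exact statement:

```latex
% For a voting classifier with normalized margin function f and any
% margin threshold \theta > 0, with high probability over m samples:
\Pr\bigl[\,y f(x) \le 0\,\bigr]
  \;\le\;
  \widehat{\Pr}\bigl[\,y f(x) \le \theta\,\bigr]
  \;+\; \tilde{O}\!\left(\sqrt{\frac{d}{m\,\theta^{2}}}\,\right)
% d is a capacity measure (e.g. VC dimension) of the base-learner class;
% the left side is the generalization error, and the first term on the
% right is the fraction of training examples with margin at most \theta.
```

Notably, the bound does not degrade with the number of boosting rounds, which helped explain why boosting often resists overfitting in practice.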

Bootstrap aggregating

Bootstrap aggregating (bagging) is a related ensemble meta-algorithm; where boosting primarily reduces bias, bagging primarily reduces variance by averaging models trained on bootstrap resamples of the data.

Random forest

Random forests are an ensemble learning method that combines many decision trees trained with bagging, adding further randomization over the features considered at each split.

Machine learning

Within machine learning, boosting is both a general ensemble meta-algorithm and a family of concrete algorithms, ranging from AdaBoost to gradient boosting.

Metaheuristic

Boosting is a meta-algorithm: rather than being a learner itself, it is a higher-level procedure that repeatedly invokes and combines a base (weak) learning algorithm.

Leslie Valiant

Together with Michael Kearns, Leslie Valiant posed the 1988-1989 question of whether a set of weak learners can create a single strong learner, the question from which boosting originated.

Statistics

Boosting has had significant ramifications in statistics as well as in machine learning; LogitBoost, for example, was formulated by the statisticians Jerome Friedman, Trevor Hastie, and Robert Tibshirani.

Yoav Freund

Freund and Schapire's arcing (Adapt[at]ive Resampling and Combining), as a general technique, is more or less synonymous with boosting.

Weighting

After a weak learner is added, the data weights are readjusted, a step known as "re-weighting": misclassified examples gain weight and correctly classified examples lose weight, so that future weak learners focus on the examples that previous ones got wrong.
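In AdaBoost, for instance, re-weighting has a closed multiplicative form. With labels y_i ∈ {-1, +1}, round-t weak learner h_t, and weighted error ε_t, the standard update is:

```latex
\alpha_t = \tfrac{1}{2}\,\ln\frac{1-\varepsilon_t}{\varepsilon_t},
\qquad
w_i^{(t+1)} = \frac{w_i^{(t)}\,\exp\bigl(-\alpha_t\, y_i\, h_t(x_i)\bigr)}{Z_t},
```

where Z_t normalizes the weights to sum to one. Examples that h_t misclassifies have y_i h_t(x_i) = -1 and therefore gain weight in the next round.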

Recursion (computer science)

Schapire's original boosting procedure took the form of a recursive majority gate over weak learners.

Adaptive behavior

The original boosting algorithms of Schapire and Freund were not adaptive and could not take full advantage of the weak learners; AdaBoost's key advance was to adapt each round to the errors of the weak learners already chosen.