What Does AdaBoost Mean?

AdaBoost (short for Adaptive Boosting) is an ensemble learning algorithm that combines the outputs of multiple weak classifiers into a single weighted result. It was designed by Yoav Freund and Robert Schapire in the mid-1990s and has since become something of a go-to method for boosting in machine learning.


Techopedia Explains AdaBoost

Experts describe AdaBoost as one of the best weighted combinations of classifiers, though one that is sensitive to noisy data and outliers. Some confusion arises from the fact that AdaBoost can be run with multiple instances of the same base classifier trained with different parameters; professionals sometimes say AdaBoost "has only one classifier" and then wonder how the weighting occurs. In fact, the algorithm maintains two sets of weights: one over the training samples, updated each round to emphasize misclassified examples, and one over the trained classifiers, which determines how strongly each classifier votes in the final combination.
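The two weight updates can be sketched in a few lines of plain Python. This is a minimal illustrative sketch of the standard AdaBoost update rules (binary labels in {-1, +1}), not a full training loop; the function names are my own.

```python
import math

def classifier_weight(error):
    # AdaBoost gives each weak classifier a voting weight (alpha) based on
    # its weighted error rate: a low error yields a large positive alpha,
    # while a 50% error (no better than chance) yields alpha = 0.
    return 0.5 * math.log((1 - error) / error)

def reweight_samples(weights, labels, predictions, alpha):
    # Misclassified samples (label * prediction = -1) get heavier weights,
    # so the next round's classifier focuses on them; the weights are then
    # renormalized to sum to 1.
    new_w = [w * math.exp(-alpha * y * h)
             for w, y, h in zip(weights, labels, predictions)]
    total = sum(new_w)
    return [w / total for w in new_w]
```

For example, with four equally weighted samples and one misclassification at a 25% weighted error, the misclassified sample's weight rises from 0.25 to 0.5 after the update, while the correctly classified samples shrink.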

AdaBoost also embodies a particular philosophy in machine learning: as an ensemble learning tool, it proceeds from the fundamental idea that many weak learners can get better results than one strong learner. With AdaBoost, machine learning practitioners craft systems that take in a number of weak predictions and combine them into an optimized result. Some take this idea further, describing AdaBoost as commanding "armies of decision stumps": one-split decision trees that are essentially unsophisticated learners deployed in large numbers to crunch data, an approach often favored over relying on a single complex classifier.
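The "army of decision stumps" idea can be made concrete with a short sketch. A stump tests a single threshold, and the ensemble's prediction is the sign of the alpha-weighted sum of the stumps' votes. The stump parameters below are invented for illustration, not learned from data.

```python
def stump_predict(x, threshold, polarity=1):
    # A decision stump: a one-feature threshold test, about the weakest
    # learner that is still better than random guessing.
    return polarity if x >= threshold else -polarity

def ensemble_predict(x, stumps):
    # stumps: list of (alpha, threshold, polarity) triples. The final
    # label is the sign of the alpha-weighted sum of the stumps' votes,
    # so accurate stumps (large alpha) dominate the decision.
    score = sum(alpha * stump_predict(x, t, p) for alpha, t, p in stumps)
    return 1 if score >= 0 else -1
```

With three hypothetical stumps, say `[(0.8, 2.0, 1), (0.4, 5.0, 1), (0.3, 1.0, -1)]`, an input of 3.0 is classified +1 because the high-weight first stump outvotes the other two combined.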


Margaret Rouse
Technology Expert

Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.