Boosting is a meta-learning approach that combines an ensemble of weak classifiers into a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the examples that each classifier misclassifies. icsiboost implements AdaBoost over stumps (one-level decision trees) on discrete and continuous attributes (words and real values). This approach is one of the simplest and most efficient ways to combine continuous and nominal features. The implementation aims to support training on millions of examples with hundreds of features in a reasonable amount of time and memory.
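To make the idea concrete, here is a minimal sketch of AdaBoost over decision stumps in Python. It is illustrative only, not icsiboost's actual C implementation, and all function names here are invented for the example; it shows the core loop: fit the best stump under the current example weights, compute its vote weight, and overweight the misclassified examples.

```python
# Illustrative AdaBoost over one-level decision stumps (not icsiboost's code).
import math

def train_stump(X, y, w):
    """Exhaustively find the (feature, threshold, polarity) stump with the
    lowest weighted error. Labels y are +1/-1; w are example weights."""
    best = None  # (weighted_error, feature, threshold, polarity)
    for j in range(len(X[0])):
        for thresh in sorted({x[j] for x in X}):
            for polarity in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if (polarity if xi[j] >= thresh else -polarity) != yi)
                if best is None or err < best[0]:
                    best = (err, j, thresh, polarity)
    return best

def stump_predict(stump, x):
    _, j, thresh, polarity = stump
    return polarity if x[j] >= thresh else -polarity

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n            # start with uniform example weights
    ensemble = []                # list of (alpha, stump) pairs
    for _ in range(rounds):
        stump = train_stump(X, y, w)
        err = max(stump[0], 1e-10)
        if err >= 0.5:           # no better than chance: stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, stump))
        # Overweight misclassified examples, then renormalize.
        w = [wi * math.exp(-alpha * yi * stump_predict(stump, xi))
             for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Sign of the weighted vote of all stumps."""
    score = sum(alpha * stump_predict(stump, x) for alpha, stump in ensemble)
    return 1 if score >= 0 else -1
```

For example, on a one-dimensional threshold-separable dataset such as `X = [[0.0], [1.0], [2.0], [3.0]]`, `y = [-1, -1, 1, 1]`, a few rounds of `adaboost` recover a perfect classifier. icsiboost applies the same scheme, with stumps built over both word (nominal) and real-valued features.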
Tags: Scientific/Engineering :: Artificial Intelligence
Release Notes: This release brings a few bugfixes in the training and test procedures, as well as error-rate reporting on multi-class problems. Optimization of the most frequently called functions yielded significant training-speed improvements. The release also updates the documentation and improves the handling of rare cases. The F-measure framework has been tested extensively on diverse classification problems.
Release Notes: Options for easier feature and parameter selection were added. The classifier can display the F-measure and save a model at the iteration that maximizes it. At test time, posterior probabilities can be output. Training can be interrupted and later resumed from a partial model.
Release Notes: This release brings multi-label classification, n/s/f-gram experts, selection of the optimal number of iterations on a development set, regular-expression-based column selection, and a few bugfixes.