icsiboost


Boosting is a meta-learning approach that combines an ensemble of weak classifiers into a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the examples that each classifier misclassifies. icsiboost implements AdaBoost over stumps (one-level decision trees) on discrete and continuous attributes (words and real values). This approach is one of the simplest and most efficient ways to combine continuous and nominal values. The implementation aims to support training from millions of examples with hundreds of features in a reasonable amount of time and memory.
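The greedy search described above can be sketched in a few lines. This is a minimal illustration of AdaBoost over threshold stumps on continuous attributes, not icsiboost's actual C implementation; all function names and the toy data are assumptions for the sketch.

```python
import math

def train_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump minimizing weighted error."""
    best, best_err = None, float("inf")
    for f in range(len(X[0])):
        for thr in sorted(set(row[f] for row in X)):
            for pol in (1, -1):
                # Weighted error of the stump "predict pol if x[f] >= thr".
                err = sum(wt for row, lab, wt in zip(X, y, w)
                          if (pol if row[f] >= thr else -pol) != lab)
                if err < best_err:
                    best, best_err = (f, thr, pol), err
    return best, best_err

def adaboost(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n          # uniform example weights to start
    ensemble = []              # list of (alpha, stump)
    for _ in range(rounds):
        stump, err = train_stump(X, y, w)
        err = max(err, 1e-10)  # avoid division by zero on a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        f, thr, pol = stump
        # The "adaptive" step: misclassified examples gain weight.
        new_w = [wt * math.exp(-alpha * lab * (pol if row[f] >= thr else -pol))
                 for row, lab, wt in zip(X, y, w)]
        z = sum(new_w)
        w = [wi / z for wi in new_w]
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, row):
    """Sign of the weighted vote of all stumps."""
    score = sum(alpha * (pol if row[f] >= thr else -pol)
                for alpha, (f, thr, pol) in ensemble)
    return 1 if score >= 0 else -1
```

For example, on a one-feature dataset separable at a threshold, a few rounds suffice: `adaboost([[0.0], [1.0], [2.0], [3.0]], [-1, -1, 1, 1], rounds=5)` yields an ensemble whose `predict` recovers every label. icsiboost additionally handles nominal (word) attributes, which a stump tests by set membership rather than by threshold.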

Recent releases

  •  19 Mar 2008 20:23

    Release Notes: This release brings a few bugfixes in the training and test procedures, and error rate reports on multi-class problems. Optimization of the most frequently called functions yielded substantial training speed improvements. This release also updates the documentation and tries to improve the handling of rare cases. The F-measure framework has been widely tested on diverse classification problems.

  •  25 Jan 2008 07:01

    Release Notes: Options for easier feature and parameter selection were added. The classifier can display F-measure and save a model at the iteration that maximizes it. At test time, posterior probabilities can be output. Training can be interrupted and resumed later from a partial model.

  •  07 Nov 2007 21:54

    Release Notes: This release brings multi-label classification, n/s/f-gram experts, optimal iterations on a development set, regular-expression-based column selection, and a few bugfixes.

  •  24 Jul 2007 01:07

    No changes have been submitted for this release.

