Release Notes: This release switches the build system to CMake. It adds interactive demos and IPython notebooks that can also be run in the cloud (see the links for details). There are other new features along with many internal improvements, bugfixes, and documentation updates. In numbers, this release merges more than 2000 commits changing almost 400000 lines in more than 7000 files, and increases the number of unit tests from 50 to 600.
Release Notes: This major update adds many improvements, new features, and bugfixes. It includes all work carried out before and during the Google Summer of Code 2012. Students implemented various new features, including structured output learning, Gaussian processes, latent-variable SVMs (with structured output learning), statistical tests in reproducing kernel Hilbert spaces, various multitask learning algorithms, and a number of usability improvements.
Release Notes: This release features interfaces to new languages including Java, C#, Ruby, and Lua, a model selection framework, many dimension reduction techniques, Gaussian Mixture Model estimation, and a full-fledged online learning framework.
Release Notes: With over 600 examples and polished tutorials, this release contains major documentation and website improvements. In addition, several bugs have been fixed and cleanups performed (including the removal of the confusing init_kernel command). Several new methods have been implemented, including a domain adaptation support vector machine and multiclass multiple kernel learning.
Release Notes: This release contains a large number of bugfixes and documentation updates (tutorials and a method overview are now available for C++ developers, covering both the static and modular interfaces). Multiple Kernel Learning has been reworked: it now works either via interleaved optimization with SVMlight or via the wrapper algorithm with any SVM (such as LibSVM), for regression and for one- and two-class classification.
Release Notes: This release introduces several major changes. Most importantly, Shogun has been split into libshogun and libshogunui, multiple kernel learning (for classification) is now available via GLPK, support has been added for "dotfeatures", which enable learning of linear classifiers on mixed data types, and Shogun now runs on the iPhone.
Release Notes: This release contains several major enhancements and bugfixes. Experimental support for the modular R interface was added, and all python-modular examples have been ported to r-modular. The legacy "send_command" syntax is no longer necessary; numbers can now be passed as numbers rather than as strings. All examples for R, Python, Octave, and Matlab have been converted to the new syntax. The command-line interface has been resurrected and its basic functionality restored. The documentation was updated and a number of bugs were fixed.
Release Notes: Shogun now fully supports the octave-modular interface. All python-modular examples describing the use of kernels, classifiers, distributions, features, distances, regression methods, and preprocessors have been ported to octave-modular. The documentation received minor updates. The SWIG director feature has been unconditionally disabled, which reduces wrapper code size and compile time and also speeds up calls to virtual functions significantly; big speed improvements can be expected for users of the python-modular interface. Command-line help was improved.
Release Notes: The static R, Octave, Matlab, and Python interfaces have been rewritten from scratch, simplifying future extensions. They now use the same syntax and support the same set of commands. Toy examples describing the use of kernels, classifiers, distributions, features, distances, regression methods, and preprocessors for the static Python, R, Octave, and Matlab interfaces have been added. The user documentation has been improved. Support for ACML and Intel MKL was added, as was support for POIMs in the Python modular interface. Major memory leaks were fixed.
Release Notes: This release brings a more mature Python modular interface: it now contains a full-fledged test suite for all implemented methods, interactive documentation, and toy examples covering the use of kernels, classifiers, distributions, features, distances, regression methods, and preprocessors. The code is now documented with Doxygen, and many minor improvements (e.g. reading strings directly from file) were added. Several memory leaks and crashes have been fixed. The WDSVMOcas method was added. SVMOcas and liblinear were updated, fixing minor problems.