ffnet is a fast and easy-to-use feed-forward neural network training library for Python. You can use it to train, test, save, load, and run an artificial neural network with sigmoid activation functions. Any acyclic network connectivity is allowed (not only layered). Training can be performed with several optimization schemes, including genetic-algorithm-based optimization. Exact partial derivatives of the network outputs with respect to its inputs are available. Normalization of data is handled automatically by ffnet.
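As a rough illustration of what such a network computes (this is not ffnet's actual API; the layer sizes and random parameters below are made up), a sigmoid feed-forward pass in plain NumPy might look like:

```python
import numpy as np

def sigmoid(x):
    # Logistic activation, the kind of sigmoid used by ffnet's neurons
    return 1.0 / (1.0 + np.exp(-x))

def forward(inputs, weights, biases):
    # Propagate inputs through successive fully connected layers
    a = np.asarray(inputs, dtype=float)
    for W, b in zip(weights, biases):
        a = sigmoid(a @ W + b)
    return a

# Hypothetical 2-2-1 layered network with random parameters
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 2)), rng.normal(size=(2, 1))]
biases = [np.zeros(2), np.zeros(1)]
out = forward([0.5, -0.2], weights, biases)
print(out.shape)  # → (1,)
```

Because the activations are sigmoids, every output lies strictly between 0 and 1, which is why ffnet normalizes the training data for you.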
|Tags||Scientific/Engineering :: Artificial Intelligence, Software Development :: Libraries :: Python Modules|
|Operating Systems||Windows, POSIX|
Release Notes: Neural network training can now use multi-processor systems (see the example mptrain.py). Attributes for calculating network derivatives are generated on demand. Compatibility with the newest versions of numpy, scipy, and networkx has been improved. Support for *export to java* and *drawing networks with drawffnet* has been dropped. The basic API is left almost untouched; training scripts written for older versions should work without problems.
Release Notes: This release contains minor enhancements and compatibility improvements. ffnet now works with networkx 0.99 and later. The neural network can now be called with a 2D array of inputs, and it returns a numpy array instead of a Python list. The readdata function is now an alias for numpy.loadtxt. The docstrings were improved.
Release Notes: This is mainly a bugfix release. The readdata function was added, which simplifies reading training data from ASCII files. A bug that prevented ffnet from working with scipy 0.6.0 was fixed. Importing ffnet no longer requires matplotlib. Corrections were made in the Fortran code generators.
Release Notes: New features added in this release include an Rprop training algorithm, a function for exporting a network to Fortran subroutines, a new architecture generator, and a draft network-drawing tool.
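Rprop (resilient backpropagation) adapts a separate step size per weight based only on the sign of successive gradients. The following is a minimal sketch of that idea on a toy quadratic error surface, not ffnet's implementation; all parameter values are the commonly cited defaults, used here for illustration:

```python
import numpy as np

def rprop_minimize(grad, w, steps=100, d0=0.1, dmin=1e-6, dmax=50.0,
                   eta_plus=1.2, eta_minus=0.5):
    # Each weight keeps its own step size delta, grown when the gradient
    # sign repeats and shrunk when it flips (signalling an overshoot).
    w = np.asarray(w, dtype=float).copy()
    delta = np.full_like(w, d0)
    prev_g = np.zeros_like(w)
    for _ in range(steps):
        g = grad(w)
        same = prev_g * g
        delta = np.where(same > 0, np.minimum(delta * eta_plus, dmax), delta)
        delta = np.where(same < 0, np.maximum(delta * eta_minus, dmin), delta)
        # Step against the gradient sign; hold weights where the sign flipped
        w -= np.where(same < 0, 0.0, np.sign(g) * delta)
        prev_g = np.where(same < 0, 0.0, g)
    return w

# Toy quadratic error with minimum at (1, -2); gradient is 2 * (w - target)
target = np.array([1.0, -2.0])
w = rprop_minimize(lambda w: 2 * (w - target), np.zeros(2), steps=200)
print(np.round(w, 3))
```

Because only gradient signs are used, Rprop is insensitive to the scale of the error function, which makes it a robust default for network training.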