Narval is a framework for building intelligent personal assistants (IPAs). It includes a language, an interpreter, and a GUI/IDE, and is based on artificial intelligence and agent technologies. It executes recipes (sequences of actions) to perform tasks. New actions are easy to specify in XML and to implement in Python, and recipes can be constructed graphically, without programming, by linking blocks that represent the actions.
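The recipe idea, a task expressed as a sequence of actions where each action's output feeds the next, can be sketched in plain Python. The `Recipe` class and the toy actions below are illustrative assumptions, not Narval's actual API:

```python
# Hypothetical sketch of the recipe concept: a recipe is an ordered
# list of actions, each a callable that transforms a message and
# passes the result on to the next action.

class Recipe:
    """Run a list of actions in sequence over an initial message."""
    def __init__(self, actions):
        self.actions = actions

    def run(self, message):
        for action in self.actions:
            message = action(message)
        return message

# Two toy actions: normalize a topic, then format a report from it.
fetch = lambda msg: {"subject": msg.upper()}
report = lambda msg: "Report on %s" % msg["subject"]

recipe = Recipe([fetch, report])
print(recipe.run("weather"))  # -> Report on WEATHER
```

Linking blocks in Narval's GUI corresponds to building the `actions` list here: the graph of connected blocks determines the order in which results flow from one action to the next.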
aiParts is a set of C++ classes that can be used to implement artificial intelligence, including classes that implement the HighHope technique. Sample programs include "find the shortest path" and "assign people and/or equipment to projects". A problem assembled from subclasses of the HighHope classes knows how to solve itself by searching for a good solution.
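The "find the shortest path" sample illustrates the kind of search such a problem performs: repeatedly expand the cheapest candidate until the goal is reached. A minimal sketch of that idea in Python (Dijkstra's algorithm; this stands in for, and does not reproduce, aiParts' C++ classes or the HighHope technique itself):

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm: always expand the cheapest frontier node."""
    frontier = [(0, start, [start])]  # (cost so far, node, path taken)
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier,
                               (cost + weight, neighbor, path + [neighbor]))
    return None  # goal unreachable

# A small weighted graph as an adjacency list.
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 5)],
    "C": [("D", 1)],
}
print(shortest_path(graph, "A", "D"))  # -> (4, ['A', 'B', 'C', 'D'])
```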
Mindmeld is an enterprise-capable knowledge sharing system designed for any Web community that needs to capture and share information. It is unique in that the knowledge base grows smarter every time it's used. It incorporates terms used in each search into a contextual map of the answer itself, continually improving its ability to derive contextual information from a given search. The system learns how people typically search for an answer by identifying which terms are most valuable in any specific context.
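The learning mechanism described, folding the terms of each search into a weight map attached to the answer it led to, can be sketched as follows. This is an illustrative assumption about the general technique, not Mindmeld's actual data model; the class and method names are hypothetical:

```python
from collections import defaultdict

class KnowledgeBase:
    """Toy contextual map: each answer accumulates weights for the
    search terms that have led users to it, so those terms rank it
    higher in future searches."""

    def __init__(self):
        # answer -> {term -> reinforcement count}
        self.term_weights = defaultdict(lambda: defaultdict(int))

    def record_search(self, terms, chosen_answer):
        # Learning step: terms used to reach an answer become part
        # of that answer's context.
        for term in terms:
            self.term_weights[chosen_answer][term] += 1

    def search(self, terms):
        # Score each answer by the accumulated weight of the query terms.
        scores = {answer: sum(weights[t] for t in terms)
                  for answer, weights in self.term_weights.items()}
        return max(scores, key=scores.get) if scores else None

kb = KnowledgeBase()
kb.record_search(["vpn", "login"], "vpn-faq")
kb.record_search(["vpn", "timeout"], "vpn-faq")
kb.record_search(["printer", "login"], "printer-howto")
print(kb.search(["vpn"]))  # -> vpn-faq
```

Because "vpn" has reinforced "vpn-faq" twice, that answer outscores the others for a "vpn" query: the terms most valuable in a given context dominate the ranking, which is the behavior the description attributes to the system.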
GENESIS (short for GEneral NEural SImulation System) is a general purpose simulation platform that was developed to support the simulation of neural systems ranging from subcellular components and biochemical reactions to complex models of single neurons, simulations of large networks, and systems-level models. It was developed as a research tool to provide a standard and flexible means for constructing structurally realistic models of biological neural systems.
ffnet is a fast and easy-to-use feed-forward neural network training solution for Python. You can use it to train, test, save, load, and use an artificial neural network with sigmoid activation functions. Any network connectivity without cycles is allowed (not only layered). Training can be performed with several optimization schemes, including genetic algorithm based optimization. There is access to exact partial derivatives of the network outputs with respect to its inputs. Normalization of data is handled automatically by ffnet.
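The kind of network ffnet works with, a forward pass through sigmoid units, can be sketched in pure Python. This is a minimal illustration of the concept and deliberately does not use ffnet's own API; the layered 2-3-1 topology is just one example of the acyclic connectivity ffnet permits:

```python
import math
import random

def sigmoid(x):
    """Logistic activation, the unit type described above."""
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, layers):
    """Propagate inputs through (weights, biases) pairs, one per layer.
    weights[j][i] connects input i of the layer to its unit j."""
    activation = inputs
    for weights, biases in layers:
        activation = [sigmoid(sum(w * a for w, a in zip(row, activation)) + b)
                      for row, b in zip(weights, biases)]
    return activation

random.seed(0)
# Randomly initialized 2-3-1 network (untrained); ffnet would fit
# these weights with one of its optimization schemes.
layers = [
    ([[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)],
     [0.0] * 3),
    ([[random.uniform(-1, 1) for _ in range(3)]],
     [0.0]),
]
out = forward([0.5, -0.2], layers)
print(out)  # a single sigmoid output, somewhere in (0, 1)
```

Training then amounts to adjusting `layers` to minimize error on example input/target pairs, which is the part ffnet automates, along with data normalization and access to exact derivatives.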