ffnet is a fast and easy-to-use feed-forward neural network training solution for Python. You can use it to train, test, save, load, and use an artificial neural network with sigmoid activation functions. Any acyclic network connectivity is allowed (not only layered). Training can be performed with several optimization schemes, including genetic algorithm based optimization. There is access to exact partial derivatives of network outputs with respect to its inputs. Normalization of data is handled automatically by ffnet.
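The exact output-to-input derivatives ffnet advertises can be illustrated with a minimal NumPy sketch of a layered sigmoid network. This is not ffnet's actual API; the 2-2-1 shape and random weights are arbitrary choices for the demonstration, and the analytic Jacobian is checked against finite differences:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W1, b1, W2, b2):
    """Forward pass of a tiny 2-2-1 sigmoid network."""
    h = sigmoid(W1 @ x + b1)   # hidden activations
    y = sigmoid(W2 @ h + b2)   # network output
    return y, h

def output_input_jacobian(x, W1, b1, W2, b2):
    """Exact dy/dx via the chain rule, using sigmoid'(z) = s(z)(1 - s(z))."""
    y, h = forward(x, W1, b1, W2, b2)
    dh = (h * (1 - h))[:, None] * W1          # d h / d x, shape (2, 2)
    dy = (y * (1 - y))[:, None] * (W2 @ dh)   # d y / d x, shape (1, 2)
    return dy

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 2)), rng.normal(size=2)
W2, b2 = rng.normal(size=(1, 2)), rng.normal(size=1)
x = np.array([0.3, -0.7])

# Verify the analytic Jacobian against a finite-difference estimate.
J = output_input_jacobian(x, W1, b1, W2, b2)
eps = 1e-6
for i in range(2):
    xp = x.copy()
    xp[i] += eps
    num = (forward(xp, W1, b1, W2, b2)[0] - forward(x, W1, b1, W2, b2)[0]) / eps
    assert abs(num[0] - J[0, i]) < 1e-4
```

ffnet computes the same kind of analytic derivatives for arbitrary acyclic connectivity, not just the layered case sketched here.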
PySWIP is a Python/SWI-Prolog bridge that enables you to query SWI-Prolog from your Python programs. It includes both a SWI-Prolog foreign language interface and a utility class that makes it easy to issue queries to SWI-Prolog. Since it uses SWI-Prolog as a shared library and ctypes to access it, it doesn't require compilation to be installed.
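The shared-library-plus-ctypes technique PySWIP relies on can be sketched with any shared library. The snippet below loads the symbols of the running process (which include the C standard library on POSIX systems) as a stand-in for libswipl; this is a generic ctypes illustration, not PySWIP's own API:

```python
import ctypes

# PySWIP applies this same pattern to the SWI-Prolog shared library.
# Here we use the C library of the running process as a stand-in
# (CDLL(None) works on POSIX; on Windows you would name a DLL).
libc = ctypes.CDLL(None)

# Declare the foreign function's signature before calling it,
# so ctypes converts arguments and results correctly.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int

assert libc.abs(-42) == 42
```

Because the bridge is built entirely from such runtime calls, installing PySWIP needs no C compiler, only an SWI-Prolog installation whose shared library ctypes can locate.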
libphidgets is a user-space access library for the Phidget devices. It provides a generic and flexible way to access and interact with the Phidgets, and comes with all the advantages of a user-space library. It is based on libhid (which is based on libusb), thus it requires no HID support in the kernel. Furthermore, it aims to support all operating systems supported by libusb/libhid: Linux, BSD, OS X, and Windows.
Pyro, Python Robotics, is a top-down approach to programming real and simulated robots. It is a library, GUI, and set of objects in Python that allows beginning and experienced roboticists alike to easily control mobile robots. It comes with a simulator, and also works with Player/Stage/Gazebo. Hardware supported includes ActivMedia's Pioneer, K-Team's Khepera and Hemisson, Sony's AIBO, Evolution's ER1, and others. It also contains Python code for artificial neural networks, genetic algorithms/programming, vision (V4L), self-organizing maps, mapping, localization, and more AI-related code.
PyStem is a fast Python module implementing the Porter stemming algorithm (a process for removing the more common morphological and inflexional endings from words in English; its main use is as part of a term normalisation process that is usually done when setting up Information Retrieval systems).
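The idea of suffix stripping can be shown with a deliberately simplified sketch. This toy version applies a single pass of a few rewrite rules; the real Porter algorithm (and hence PyStem) runs five ordered steps whose rules are conditioned on a syllable-like "measure" of the stem, so its results differ (e.g. it reduces "running" all the way to "run"):

```python
# Toy suffix-stripping rules, roughly in the spirit of Porter's step 1.
# Rule order matters: longer suffixes must be tried before shorter ones.
RULES = [("sses", "ss"), ("ies", "i"), ("ing", ""), ("ed", ""), ("s", "")]

def toy_stem(word):
    """Apply the first matching (suffix -> replacement) rule, if any."""
    for suffix, repl in RULES:
        if word.endswith(suffix):
            return word[: len(word) - len(suffix)] + repl
    return word

assert toy_stem("caresses") == "caress"
assert toy_stem("ponies") == "poni"
assert toy_stem("cat") == "cat"
```

In an Information Retrieval pipeline, such a stemmer is applied to both indexed documents and queries, so that "pony" and "ponies" normalise to the same term.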